
Regularized interval regression

Toby Dylan Hocking edited this page Jan 28, 2016 · 49 revisions

Background

Interval regression is a class of machine learning models, useful when predicted values should be real numbers but outputs in the training data set may only be partially observed. A common example is survival analysis, in which the outputs are patient survival times.

For example, say that Alice and Bob came into the hospital and were treated for cancer on the same day in 2000. Now we are in 2016 and we would like to study the treatment efficacy. Say Alice died in 2010, and Bob is still alive. The survival time for Alice is 10 years, and although we do not know Bob’s survival time, we know it is in the interval (16, Infinity).

Say that we also measured some covariates (input variables) for Alice and Bob (age, sex, gene expression). We can fit an Accelerated Failure Time (AFT) model, which takes those input variables and outputs a predicted survival time. L1-regularized AFT models are of interest when there are many input variables and we would like the model to automatically ignore those which are un-informative (do not help predict survival time). Several papers describe L1-regularized AFT models.

Interval regression (or interval censoring) is a generalization in which any kind of interval is an acceptable output in the training data. Any real-valued or positive-valued probability distribution may be used to model the outputs (e.g. normal or logistic if output is real-valued, log-normal or log-logistic if output is positive-valued like a survival time). For more details read this 1-page explanation of un-regularized AFT models.

| output | interval | likelihood | censoring |
| --- | --- | --- | --- |
| exactly 10 | (10, 10) | density function | none |
| at least 16 | (16, Infinity) | cumulative distribution function | right |
| at most 3 | (-Infinity, 3) | cumulative distribution function | left |
| between -4 and 5 | (-4, 5) | cumulative distribution function | interval |
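To make the table concrete, here is a small sketch of the likelihood contribution of a single output under a normal model (Python with scipy, for illustration only; the function name is our own, not from any of the packages below):

```python
import math

from scipy.stats import norm

def interval_likelihood(lower, upper, mean, sd=1.0):
    """Likelihood contribution of one output interval (lower, upper)
    under a normal model with the given mean and sd."""
    if lower == upper:
        # no censoring: use the density function
        return norm.pdf(lower, loc=mean, scale=sd)
    # left-, right-, or interval-censored output: probability of the
    # interval, a difference of cumulative distribution function values
    return norm.cdf(upper, loc=mean, scale=sd) - norm.cdf(lower, loc=mean, scale=sd)

print(interval_likelihood(10, 10, mean=9))        # exactly 10: density
print(interval_likelihood(16, math.inf, mean=9))  # at least 16: right-censored
print(interval_likelihood(-math.inf, 3, mean=9))  # at most 3: left-censored
print(interval_likelihood(-4, 5, mean=9))         # between -4 and 5: interval
```

The total likelihood of a data set is the product of one such term per observation, so all four censoring types fit into one objective function.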

Related R packages

  • AdapEnetClass::WEnetCC.aft (arXiv paper) fits a model with AFT loss and elastic net regularization.
  • glmnet fits models for elastic net regularization with several loss functions, but neither AFT nor interval regression losses are supported.
  • interval::icfit and survival::survreg provide solvers for non-regularized interval regression models.
  • The PeakSegDP package contains a solver which uses the FISTA algorithm to fit an L1-regularized model for general interval output data. However, there are two issues: (1) it is not as fast as the coordinate descent algorithm implemented in glmnet, and (2) it does not support L2 regularization.
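For readers unfamiliar with FISTA: it is an accelerated proximal gradient method. A minimal sketch for the lasso case (Python/NumPy for illustration; this is not the PeakSegDP code, and the function names are our own):

```python
import numpy as np

def fista_lasso(X, y, lam, n_iter=200):
    """FISTA for the lasso: minimize (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    # Lipschitz constant of the gradient of the smooth (squared error) part
    L = np.linalg.norm(X, 2) ** 2 / n
    b = np.zeros(p)
    v = b.copy()  # extrapolated point
    t = 1.0       # momentum parameter
    for _ in range(n_iter):
        grad = X.T @ (X @ v - y) / n
        step = v - grad / L
        # proximal operator of the L1 penalty: soft-thresholding
        b_new = np.sign(step) * np.maximum(np.abs(step) - lam / L, 0.0)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        v = b_new + (t - 1) / t_new * (b_new - b)  # momentum step
        b, t = b_new, t_new
    return b
```

The same proximal-gradient structure applies to any smooth loss (such as an interval regression negative log-likelihood) plus an L1 penalty, but each iteration touches all coordinates, which is part of why it tends to be slower than glmnet-style coordinate descent.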

Coding project: iregnet package

Implement the first R package to support

  • general interval output data (including left and interval censoring; not just observed and right-censored data typical of survival analysis),
  • elastic net (L1 + L2) regularization, and
  • a fast glmnet-like coordinate descent solver.

| function/pkg | censoring | regularization | loss | algorithm |
| --- | --- | --- | --- | --- |
| glmnet | none, right | L1 + L2 | Cox | coordinate descent |
| glmnet | none | L1 + L2 | normal, logistic | coordinate descent |
| AdapEnetClass | none, right | L1 + L2 | normal | LARS |
| coxph | none, right, left, interval | none | Cox | ? |
| survreg | none, right, left, interval | none | normal, logistic, Weibull | ? |
| PeakSegDP | left, right, interval | L1 | squared hinge, log | FISTA |
| THIS PROJECT | none, right, left, interval | L1 + L2 | normal, logistic, Weibull | coordinate descent |

There are two possible coding strategies:

  • Fork the solver from the glmnet source code and adapt it to work with the interval regression loss. Should be possible if you understand their FORTRAN code.
  • Read Simon et al (JSS) and implement a coordinate descent solver from scratch in C. (This coding strategy is preferred.)
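As a starting point for the from-scratch strategy, here is a hedged sketch of cyclic coordinate descent with the soft-thresholding update for the Gaussian elastic net (Python/NumPy for illustration; the eventual package would be R with a C solver, and the interval regression loss would replace the squared error here):

```python
import numpy as np

def soft_threshold(z, g):
    # soft-thresholding operator S(z, g) = sign(z) * max(|z| - g, 0)
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def elastic_net_cd(X, y, lam, alpha=1.0, n_iter=100):
    """Cyclic coordinate descent for
    (1/2n)||y - Xb||^2 + lam * (alpha*||b||_1 + (1-alpha)/2*||b||_2^2)."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).mean(axis=0)  # per-column mean of squares
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove coordinate j's current contribution
            r = y - X @ b + X[:, j] * b[j]
            z = X[:, j] @ r / n
            # closed-form coordinate update (glmnet-style)
            b[j] = soft_threshold(z, lam * alpha) / (col_sq[j] + lam * (1 - alpha))
    return b
```

Each coordinate update is a cheap closed-form expression, which is the key to glmnet's speed; for interval losses the inner update would instead come from a quadratic approximation of the negative log-likelihood, as in Simon et al.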

Project goals for the end of summer:

  • Implement the log-normal loss for general interval outputs, and at least one other of the following AFT loss functions: log-logistic, exponential, Weibull.
  • Package tests to make sure the global optimum is found in real data sets such as AdapEnetClass::MCLcleaned.
  • Interface similar to glmnet. The output y should be a two-column matrix (first column: real-valued lower limit, possibly negative or -Inf; second column: real-valued upper limit, possibly Inf).

```r
iregnet(X, y,
        family=c("weibull", "exponential", "lognormal", "loglogistic"),
        alpha=1) # as in glmnet: alpha=1 is the lasso penalty, alpha=0 is ridge
```
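For example, the four censoring types from the background section would be encoded in y as follows (sketched in Python/NumPy for illustration; the R interface would take an equivalent two-column matrix):

```python
import math

import numpy as np

# One row per observation: (lower limit, upper limit).
y = np.array([
    [10.0, 10.0],       # exactly 10: no censoring
    [16.0, math.inf],   # at least 16: right-censored
    [-math.inf, 3.0],   # at most 3: left-censored
    [-4.0, 5.0],        # between -4 and 5: interval-censored
])
print(y.shape)  # (4, 2)
```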

Would be nice, but not necessary before the end of summer:

  • Vignette which compares computation time and solution accuracy with glmnet, survreg, icfit, and/or AdapEnetClass::WEnetCC.aft.
  • Interface which accepts survival::Surv objects as outputs y, for compatibility with survreg.
  • Optimizations for sparse input matrices (Matrix package).

Mentors

  • Toby Dylan Hocking <[email protected]> proposed this project, would be a user of this package, and could mentor.
  • Noah Simon <[email protected]> implemented the elastic net for the Cox model, and said he could help out informally, but he can NOT commit to formal co-mentoring.
  • Need a co-mentor with experience implementing convex optimization algorithms! Students, if you are interested in this project, then you need to help find an additional mentor! Maybe email the authors of the articles referenced above? Terry M Therneau <[email protected]> maintains survival; Trevor Hastie <[email protected]> maintains glmnet.

Tests

Students, please complete as many tests as possible before emailing the mentors. If we do not find a student who can complete the Hard test, then we should not approve this GSOC project.

  • Easy: write a knitr document in which you perform cross-validation to compare predictions from WEnetCC.aft/survreg models. Divide the AdapEnetClass::MCLcleaned data into train/test, then fit models to the train set, and compute error of each model with respect to the test set. Which model is most accurate?
  • Medium: show that you know how to include FORTRAN/C code in an R package.
  • Hard: write down the mathematical optimization problem for elastic net regularized interval regression using the loss function which corresponds to a log-logistic AFT model. Output data in the train set can be any of the four censoring types described above (none, left, right, interval). Write the subdifferential optimality condition for this optimization problem. Using the arguments similar to the glmnet/coxnet papers, derive the coordinate descent update rule and a stopping criterion.
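For orientation, one way to set up the objective for the Hard test is sketched below. This is our own notation, not the required solution: with linear predictor $\eta_i = \beta_0 + x_i^\top \beta$, scale parameter $\sigma$, and output intervals $(l_i, u_i)$ on the survival-time scale:

```latex
\min_{\beta_0, \beta} \;
  -\frac{1}{n} \sum_{i=1}^n \log L_i(\beta_0 + x_i^\top \beta)
  + \lambda \left( \alpha \|\beta\|_1
  + \frac{1-\alpha}{2} \|\beta\|_2^2 \right),
\qquad
L_i(\eta) =
\begin{cases}
  f\!\left(\dfrac{\log y_i - \eta}{\sigma}\right)
    & \text{if } l_i = u_i = y_i \text{ (no censoring)}, \\[2ex]
  F\!\left(\dfrac{\log u_i - \eta}{\sigma}\right)
  - F\!\left(\dfrac{\log l_i - \eta}{\sigma}\right)
    & \text{otherwise (left, right, or interval censoring)},
\end{cases}
```

where $F(z) = 1/(1 + e^{-z})$ is the standard logistic CDF and $f = F'$ its density, which on the original time scale gives the log-logistic AFT model; the subdifferential conditions and coordinate descent updates then follow the pattern of the glmnet/coxnet papers.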

Solutions of tests

Students, please post a link to your test results here.
