LassoCMA {CMA} | R Documentation |
Description:

The lasso (Tibshirani, 1996) is one of the most popular tools for
simultaneous shrinkage and variable selection. Recently, Park and
Hastie (2007) developed an algorithm to compute the entire solution
path of the lasso for an arbitrary generalized linear model,
implemented in the package glmpath.
The method can be used for variable selection alone; see
GeneSelection.
For S4 method information, see LassoCMA-methods.
Usage:

LassoCMA(X, y, f, learnind, norm.fraction = 0.1, ...)
Arguments:

X |
Gene expression data. Can be one of the following:
a matrix (rows correspond to observations, columns to
variables), a data.frame (if f is not missing), or an
object of class ExpressionSet. |
y |
Class labels. Can be one of the following: a numeric
vector, a factor, or a character string naming the
phenotype variable if X is an ExpressionSet. The class
labels are re-coded to range from 0 to K-1, where K is the
total number of different classes in the learning set. |
f |
A two-sided formula, if X is a data.frame. The left part
corresponds to the class labels, the right part to the variables. |
learnind |
An index vector specifying the observations that
belong to the learning set. May be missing; in that
case, the learning set consists of all observations
and predictions are made on the learning set. |
norm.fraction |
L1 shrinkage intensity, expressed as the fraction of the
coefficient L1 norm relative to the maximum possible L1 norm
(which corresponds to fraction = 1). Lower values correspond
to stronger shrinkage. Note that the default (0.1) need not
produce good results; tuning of this parameter is recommended. |
... |
Further arguments passed to the function glmpath
from the package of the same name. |
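To build intuition for what norm.fraction controls, here is a
self-contained illustration in base R (it does not use CMA or
glmpath, and all object names are hypothetical). Soft-thresholding,
which is the lasso solution under an orthonormal design, shows how
constraining the coefficient L1 norm to a fraction of its maximum
translates into shrinkage and exact zeros:

```r
## Soft-thresholding: the lasso solution for an orthonormal design.
soft_threshold <- function(beta, lambda) {
  sign(beta) * pmax(abs(beta) - lambda, 0)
}

beta_ls <- c(3, -1.5, 0.5, 0.1)   # unpenalized coefficients
max_l1  <- sum(abs(beta_ls))      # L1 norm at fraction = 1

## L1 norm of the shrunken coefficients as a fraction of the maximum
l1_fraction <- function(lambda) {
  sum(abs(soft_threshold(beta_ls, lambda))) / max_l1
}

## find the threshold whose solution uses 20% of the maximal L1 norm
## (analogous to norm.fraction = 0.2)
lambda <- uniroot(function(l) l1_fraction(l) - 0.2,
                  interval = c(0, max(abs(beta_ls))),
                  tol = 1e-9)$root
beta_shrunk <- soft_threshold(beta_ls, lambda)

## at this fraction, only the largest coefficient survives;
## the rest are set exactly to zero
beta_shrunk
```

Lower fractions force the threshold up, so more coefficients are set
exactly to zero; this is why LassoCMA performs variable selection as
a by-product of shrinkage.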
Value:

An object of class clvarseloutput.
Note:

For a closely related method, see ElasticNetCMA.
At present, this method can only be applied to binary classification.
Author(s):

Martin Slawski martin.slawski@campus.lmu.de
Anne-Laure Boulesteix http://www.slcmsr.net/boulesteix
References:

Tibshirani, R. (1996).
Regression shrinkage and selection via the lasso.
Journal of the Royal Statistical Society B, 58(1), 267-288.

Park, M.Y., Hastie, T. (2007).
L1-regularization path algorithm for generalized linear models.
Journal of the Royal Statistical Society B, 69(4), 659-677.
See Also:

compBoostCMA, dldaCMA, ElasticNetCMA, fdaCMA, flexdaCMA, gbmCMA,
knnCMA, ldaCMA, nnetCMA, pknnCMA, plrCMA, pls_ldaCMA, pls_lrCMA,
pls_rfCMA, pnnCMA, qdaCMA, rfCMA, scdaCMA, shrinkldaCMA, svmCMA
Examples:

### load Golub AML/ALL data
data(golub)
### extract class labels
golubY <- golub[,1]
### extract gene expression
golubX <- as.matrix(golub[,-1])
### select learningset
ratio <- 2/3
set.seed(111)
learnind <- sample(length(golubY), size = floor(ratio * length(golubY)))
### run L1 penalized logistic regression (no tuning)
lassoresult <- LassoCMA(X = golubX, y = golubY, learnind = learnind,
                        norm.fraction = 0.2)
show(lassoresult)
ftable(lassoresult)
plot(lassoresult)