Abstract
Friedman (1989) proposed a regularization technique (RDA) of discriminant analysis in the Gaussian framework. RDA uses two regularization parameters to design an intermediate classifier between the linear, the quadratic, and the nearest-means classifiers. In this article we propose an alternative approach, called EDDA, that is based on the reparameterization of the covariance matrix $\Sigma_k$ of a group $G_k$ in terms of its eigenvalue decomposition $\Sigma_k = \lambda_k D_k A_k D_k'$, where $\lambda_k$ specifies the volume of density contours of $G_k$, the diagonal matrix of eigenvalues $A_k$ specifies its shape, and the eigenvectors $D_k$ specify its orientation. Variations on constraints concerning the volumes $\lambda_k$, shapes $A_k$, and orientations $D_k$ lead to 14 discrimination models of interest. For each model, we derive the normal theory maximum likelihood parameter estimates. Our approach consists of selecting a model by minimizing the sample-based estimate of future misclassification risk by cross-validation. Numerical experiments on simulated and real data show favorable behavior of this approach compared to RDA.
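The parameterization at the heart of the abstract is easy to make concrete. Below is a minimal sketch (not the authors' EDDA implementation) using numpy, showing how a covariance matrix factors into the volume $\lambda_k$, shape $A_k$, and orientation $D_k$ terms of $\Sigma_k = \lambda_k D_k A_k D_k'$; it assumes the standard convention $\lambda_k = |\Sigma_k|^{1/p}$ with $\det A_k = 1$, and the function name `decompose_covariance` is hypothetical.

```python
import numpy as np

def decompose_covariance(sigma):
    """Factor sigma as lam * D @ A @ D.T, a sketch of the paper's
    Sigma_k = lambda_k D_k A_k D_k' parameterization (assuming the
    usual convention lam = det(sigma)**(1/p) and det(A) = 1)."""
    eigvals, eigvecs = np.linalg.eigh(sigma)       # sigma = D diag(eigvals) D.T
    eigvals, D = eigvals[::-1], eigvecs[:, ::-1]   # sort eigenpairs descending
    p = len(eigvals)
    lam = np.prod(eigvals) ** (1.0 / p)            # volume: det(sigma)**(1/p)
    A = np.diag(eigvals / lam)                     # shape: scaled so det(A) = 1
    return lam, D, A                               # orientation D = eigenvectors

# Toy check: the three factors reproduce the original covariance matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
sigma = np.cov(X, rowvar=False)
lam, D, A = decompose_covariance(sigma)
assert np.allclose(sigma, lam * D @ A @ D.T)
```

The 14 models then arise from holding some of $\lambda_k$, $A_k$, $D_k$ fixed across groups or restricting them (for instance, $A_k = I$ yields spherical density contours), with the abstract's selection rule choosing among the fitted models by cross-validated misclassification risk.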
| Original language | English |
|---|---|
| Pages (from-to) | 1743-1748 |
| Number of pages | 6 |
| Journal | Journal of the American Statistical Association |
| Volume | 91 |
| Issue number | 436 |
| DOIs | |
| Publication status | Published - 1 Dec 1996 |
| Externally published | Yes |
Keywords
- Covariance matrix
- Maximum likelihood
- Normal-based classification
- Spectral decomposition