Title: "Fitting High-Dimensional Generalized Linear Models via Supervised Dimension Reduction"
Speaker: Yanzhu Lin, Department of Statistics, Purdue University
Place: HORT 117; Tuesday, April 5, 2011, 4:30pm

Abstract

To fit generalized linear models (GLMs) with high-dimensional data, we propose a strategy that carries the supervised dimension reduction idea of partial least squares (PLS) over to GLMs. Specifically, we build generalized orthogonal-components regression (GOCRE) for GLMs. Unlike existing methods that extend PLS to categorical data, we sequentially construct orthogonal predictors (SCOP), each of which is obtained as the convergent result of an iterative construction. The bias-correction procedure of Firth (1993) is also applied. To simultaneously reduce dimension and select a small set of relevant variables among the large number of predictors, we develop Sparse-GOCRE by incorporating a penalized approach into the GOCRE framework: within the sequential construction of components, a penalized approach identifies the sparse set of predictors contributing to each component. Two different penalization strategies are considered. Both simulated and real data examples are presented to illustrate the competitive performance of the new approaches and to compare them with several existing methods.
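The abstract does not spell out the construction, so the following Python sketch only illustrates the general PLS-style idea of supervised dimension reduction for a GLM with a binary response: components are built one at a time from the covariance between the predictors and current residuals, orthogonalized against earlier components, and a logistic GLM is refit on the accumulated components. This is an assumed toy scheme (the function name, weight rule, deflation step, and number of components are all illustrative), not the GOCRE/SCOP algorithm or the Firth correction described in the talk.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def sequential_glm_components(X, y, n_components=3):
        """Rough PLS-style sketch: sequentially build orthogonal components
        supervised by a binary response and refit a logistic GLM on them.
        (Illustrative only; not the GOCRE/SCOP construction from the talk.)"""
        n, p = X.shape
        Xd = X - X.mean(axis=0)              # centered working copy of the predictors
        T = np.empty((n, 0))                 # accumulated orthogonal components
        resid = y - y.mean()                 # start from the centered response
        glm = None
        for _ in range(n_components):
            w = Xd.T @ resid                 # supervised weights: cov(predictors, residuals)
            w /= np.linalg.norm(w)
            t = Xd @ w                       # candidate component
            if T.shape[1] > 0:               # orthogonalize against earlier components
                t = t - T @ np.linalg.lstsq(T, t, rcond=None)[0]
            T = np.hstack([T, t[:, None]])
            glm = LogisticRegression(C=1e6).fit(T, y)   # essentially unpenalized logistic GLM
            resid = y - glm.predict_proba(T)[:, 1]      # residuals on the probability scale
            Xd = Xd - np.outer(t, Xd.T @ t) / (t @ t)   # deflate predictors w.r.t. the new component
        return T, glm

    # Toy usage: n = 100 observations, p = 500 predictors, binary response.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 500))
    y = (X[:, :5].sum(axis=1) + rng.standard_normal(100) > 0).astype(int)
    T, glm = sequential_glm_components(X, y, n_components=3)
    print(T.shape, glm.coef_)

Because the number of fitted components is small relative to n, the final GLM fit is well posed even though p far exceeds n, which is the motivation for this kind of supervised dimension reduction.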

***This is joint work with Professor Dabao Zhang and Professor Min Zhang (Department of Statistics, Purdue University).

Associated Reading:
Dabao Zhang, Yanzhu Lin, and Min Zhang (2009). Penalized orthogonal-components regression for large p small n data. Electronic Journal of Statistics, 3, 781-796.


