Department of
presents
Dr. R. Dennis Cook
Dimension Reduction Paradigms for Regression
ABSTRACT
Dimension reduction for regression, represented primarily by principal components, is ubiquitous in the applied sciences. This is an old idea that has moved to a position of prominence in recent years because technological advances now allow scientists to routinely formulate regressions in which the number p of predictors is considerably larger than in the past. Although "large" p regressions are perhaps mainly responsible for renewed interest, dimension reduction methodology can be useful regardless of the size of p.
Starting with a little history and a definition of "sufficient reductions", we will consider a variety of models for dimension reduction in regression. The models start from one in which maximum likelihood estimation produces principal components, step along a few incremental expansions, and end with forms that have the potential to improve on some standard methodology. This development provides remedies for two concerns that have dogged principal components in regression: principal components are typically computed from the predictors alone, making no apparent use of the response, and they are not equivariant under full-rank linear transformations of the predictors.
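As a small illustration of the two concerns mentioned above (not part of the talk itself), the following Python sketch uses NumPy and hypothetical synthetic data: the helper `leading_pc`, the data-generating choices, and the transformation `A` are all illustrative assumptions. It shows that the leading principal component is a function of the predictors alone, never touching the response, and that the estimated direction is not equivariant under a full-rank linear transformation of the predictors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 5 predictors with unequal variances; the response
# depends only on the first predictor.
n, p = 500, 5
X = rng.normal(size=(n, p)) * np.array([3.0, 2.0, 1.0, 1.0, 1.0])
y = X[:, 0] + 0.1 * rng.normal(size=n)   # generated, but never used by PCA below

def leading_pc(X):
    """First principal component direction of the centered predictor matrix."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[0]          # unit-length leading right singular vector

# Concern 1: the component is computed from X alone; y plays no role.
v = leading_pc(X)

# Concern 2: apply a full-rank linear transformation A to the predictors.
A = np.diag([0.1, 5.0, 1.0, 1.0, 1.0])
v_A = leading_pc(X @ A)

# Equivariance would require the new direction to be proportional to inv(A) @ v,
# so that the implied linear reduction of the predictors is unchanged.  It is not:
expected = np.linalg.inv(A) @ v
expected /= np.linalg.norm(expected)
print("|cos angle| between actual and equivariant direction:",
      round(abs(v_A @ expected), 3))   # near 0 for this setup, not 1
```

In this sketch the rescaling in `A` moves the dominant variance to a different predictor, so the leading component swings toward that predictor rather than transforming along with the original direction, which is the non-equivariance the abstract refers to.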
Friday, November 16, 2007
3:35 - 4:35 pm
301 Riddick Hall
Refreshments will be served in the common area of 301 Riddick at 3:00 pm