Speaker: [http://www.umiacs.umd.edu/~ejaz/ Ejaz Ahmed] -- Date: December 6, 2012
 
I'll talk about linear methods for dimensionality reduction, with an emphasis on PLS and CDF (Composite Discriminant Factors). The focus on linear methods is motivated by the task of object detection, which can benefit from linear projections in several ways. I'll go through some of these issues as well.
 
We propose a linear dimensionality reduction method, Composite Discriminant Factor (CDF) analysis, which searches for a discriminative but compact feature subspace that can be used as input to classifiers that suffer from problems such as multi-collinearity or the curse of dimensionality. The subspace selected by CDF maximizes the performance of the entire classification pipeline, and is chosen from a set of candidate subspaces that are each discriminative by various local measures, such as the covariance between input features and output labels or the margin between positive and negative samples. Our method is based on Partial Least Squares (PLS) analysis, and can be viewed as a generalization of the PLS1 algorithm, designed to increase discrimination in classification tasks. While our experiments focus on improvements to object detection (in particular, pedestrians and vehicles), a task that often involves high-dimensional features and benefits from fast linear approaches, we also demonstrate our approach on datasets from the UCI Machine Learning Repository. Experimental results show that the proposed approach improves significantly over SVM in terms of accuracy, and over PLS in terms of compactness and efficiency while maintaining or slightly improving accuracy.
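
For reference, the sketch below is not the authors' CDF implementation; it shows only the standard PLS1-style pipeline that CDF generalizes: project high-dimensional features onto a few latent directions chosen to maximize covariance with the class label, then train a linear classifier on the compact scores. The dataset (scikit-learn's breast-cancer data, standing in for a UCI-style benchmark), the number of components, and the LinearSVC classifier are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.datasets import load_breast_cancer   # stand-in for a UCI-style dataset
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# A binary classification problem with many correlated input features.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# PLS1: a single response column (labels coded as +/-1) and a few latent factors.
# Each factor is a linear projection chosen to maximize covariance with the label.
pls = PLSRegression(n_components=3, scale=True)
pls.fit(X_tr, np.where(y_tr == 1, 1.0, -1.0))

# The latent scores replace the original features as the classifier input.
Z_tr, Z_te = pls.transform(X_tr), pls.transform(X_te)

clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(Z_tr, y_tr)
print("accuracy on 3 PLS factors:", clf.score(Z_te, y_te))
</syntaxhighlight>

A few PLS factors are usually enough to retain most of the discriminative signal while removing the multi-collinearity that hurts downstream linear classifiers; CDF's contribution, as described in the abstract, is to choose the projection directions so that the accuracy of the full classification pipeline, rather than covariance alone, is maximized.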
 