Abstract: In this work, we propose a novel node splitting method for regression trees and incorporate it into the regression forest framework. Unlike traditional binary splitting, where the splitting rule is selected from a predefined set of binary splitting rules via trial-and-error, the proposed node splitting method first finds clusters of the training data which at least locally minimize the empirical loss without considering the input space. Then splitting rules which preserve the found clusters as much as possible are determined by casting the problem into a classification problem. Consequently, our new node splitting method enjoys more freedom in choosing the splitting rules, resulting in more efficient tree structures. In addition to the Euclidean target space, we present a variant which can naturally deal with a circular target space by the proper use of circular statistics. We apply the regression forest employing our node splitting to head pose estimation (Euclidean target space) and car direction estimation (circular target space) and demonstrate that the proposed method significantly outperforms state-of-the-art methods (38.5% and 22.5% error reduction, respectively).
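The two-stage split described above (cluster the targets first, then cast split selection as a classification problem over the inputs) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a simple 2-means clustering of 1-D targets and restricts the "classifier" to axis-aligned thresholds; all function names and the toy data are hypothetical.

```python
import numpy as np

def two_means_1d(y, iters=20):
    # Stage 1: cluster the targets (ignoring the input space) so that the
    # two clusters locally minimize the squared empirical loss.
    c = np.array([y.min(), y.max()], dtype=float)
    labels = np.zeros(len(y), dtype=int)
    for _ in range(iters):
        labels = (np.abs(y - c[0]) > np.abs(y - c[1])).astype(int)
        for k in (0, 1):
            if np.any(labels == k):
                c[k] = y[labels == k].mean()
    return labels

def best_split(X, labels):
    # Stage 2: treat the cluster labels as class labels and pick the
    # axis-aligned threshold on X that preserves the clusters best.
    best = (0, 0.0, -1.0)  # (feature index, threshold, accuracy)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            pred = (X[:, j] > t).astype(int)
            # either orientation of the split may match the labels
            acc = max((pred == labels).mean(), (pred != labels).mean())
            if acc > best[2]:
                best = (j, t, acc)
    return best

# Toy data: the target depends only on feature 0, so the recovered
# split should use feature 0 with a threshold near 0.5.
rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 2))
y = np.where(X[:, 0] > 0.5, 5.0, 0.0) + rng.normal(scale=0.1, size=100)
labels = two_means_1d(y)
j, t, acc = best_split(X, labels)
print(j)  # → 0
```

Because the clusters are found purely in the target space, the split is free to follow the structure of the targets rather than being limited to whatever predefined splitting rules happen to score well by trial-and-error.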
===Locally Convolutional Neural Network===
Speaker: [http://www.umiacs.umd.edu/~kanazawa/] -- Date: December 4, 2014
Abstract: Convolutional Neural Networks (ConvNets) have shown excellent results on many visual classification tasks. With the exception of ImageNet, these datasets are carefully crafted such that objects are well-aligned at similar scales. Naturally, the feature learning problem gets more challenging as the amount of variation in the data increases, as the models have to learn to be invariant to certain changes in appearance. Recent results on the ImageNet dataset show that, given enough data, ConvNets can learn such invariances, producing very discriminative features [1]. But could we do more: use fewer parameters and less data, and learn more discriminative features, if certain invariances were built into the learning process? In this paper we present a simple model that allows ConvNets to learn features in a locally scale-invariant manner without increasing the number of model parameters. We show on a modified MNIST dataset that, when faced with scale variation, building in scale-invariance allows ConvNets to learn more discriminative features with reduced chances of over-fitting.
    
==Past Semesters==
 