Changes

From cvss — edited by Anonymous, 16:59, 6 March 2015; 11 bytes added; no edit summary
Line 39:
 |-
 | March 5
+| "Snow Break"
+|
+|-
+| March 12
 | Yezhou Yang
 | Grasp Type Revisited: A Modern Perspective on A Classical Feature for Vision and Robotics
-|-
-| March 12
-| Bahadir Ozdemir
-| TBD
 |-
 | March 19
Line 55:
 |-
 | April 2
-| Joe Ng
+| Bahadir Ozdemir
 | TBD
 |-
Line 71:
 |-
 | April 30
-| Aleksandr (?)
+| Joe Ng
 | TBD
 |-
 | May 7
-| Francisco (?)
+| Aleksandr(?), Francisco (?)
 | TBD
 |-
Line 96:
 ===Grasp Type Revisited: A Modern Perspective on A Classical Feature for Vision and Robotics===
-Speaker: [http://www.umiacs.umd.edu/~yzyang/ Yezhou Yang] -- Date: March 5, 2015
+Speaker: [http://www.umiacs.umd.edu/~yzyang/ Yezhou Yang] -- Date: March 13, 2015

 Abstract: Our ability to interpret other people's actions hinges crucially on predictions about their intentionality. The grasp type provides crucial information about human action. However, recognizing the grasp type from unconstrained scenes is challenging because of the large variations in appearance, occlusions and geometric distortions. In this paper, first we present a convolutional neural network to classify functional hand grasp types. Experiments on a public static scene hand data set validate good performance of the presented method. Then we present two applications utilizing grasp type classification: (a) inference of human action intention and (b) fine level manipulation action segmentation.
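The abstract's core component — a convolutional network that maps a hand image to scores over functional grasp types — can be sketched as below. This is a minimal illustrative sketch only, not the presented network: the two-class grasp taxonomy, the 64x64 RGB input size, and all layer sizes are assumptions made for the example.

```python
# Illustrative sketch of a grasp-type classifier (NOT the talk's model).
# Assumptions: a coarse two-class taxonomy, 64x64 RGB hand patches,
# and an arbitrary small architecture chosen only for readability.
import torch
import torch.nn as nn

GRASP_TYPES = ["power", "precision"]  # hypothetical coarse grasp taxonomy

class GraspTypeCNN(nn.Module):
    def __init__(self, num_classes=len(GRASP_TYPES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # RGB patch -> 16 feature maps
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                     # global average pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        h = self.features(x).flatten(1)   # (batch, 32)
        return self.classifier(h)         # unnormalized per-class scores

model = GraspTypeCNN()
scores = model(torch.zeros(1, 3, 64, 64))  # one dummy 64x64 RGB patch
print(scores.shape)  # one score per grasp type
```

In the abstract's two applications, the predicted grasp type would then act as a feature: its per-frame class scores feed downstream inference of action intention or segmentation of a manipulation into finer stages.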