==== Innsbruck Pointing Dataset ====

[[datasets:ipo|Innsbruck Pointing at Objects Dataset]]

Deictic gestures, i.e. pointing at things in human-human collaborative tasks, constitute a pervasive, non-verbal way of communicating, used e.g. to direct attention towards objects of interest. In a human-robot interaction scenario, a key requirement for delegating tasks from a human to a robot is to recognize the pointing gesture and estimate its pose.

**Dataset Features**

  * Two types of pointing gestures: (1) natural pointing with the index finger, and (2) tool pointing with a whiteboard marker.
  * 9 participants pointing at 10 objects, performing both types of pointing gestures.
  * Pointing gestures recorded with a Kinect RGB-D sensor.
  * 180 RGB-D test images with ground truth for evaluating the 3D pointing direction.
  * Publicly available for [[https://iis.uibk.ac.at/public/datasets/ibkpointingdataset/ibkpointingdataset.zip|download]].
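Since the ground truth is given as 3D pointing directions, a natural evaluation metric is the angle between an estimated and a ground-truth direction vector. The sketch below illustrates this metric only; the function name and the plain-tuple vector format are assumptions, not part of the dataset's tooling.

```python
import math

def angular_error_deg(estimated, ground_truth):
    """Angle in degrees between an estimated and a ground-truth 3D
    pointing direction, each given as an (x, y, z) vector."""
    dot = sum(a * b for a, b in zip(estimated, ground_truth))
    norm_e = math.sqrt(sum(a * a for a in estimated))
    norm_g = math.sqrt(sum(b * b for b in ground_truth))
    # Clamp the cosine to [-1, 1] to guard against floating-point rounding.
    cos_angle = max(-1.0, min(1.0, dot / (norm_e * norm_g)))
    return math.degrees(math.acos(cos_angle))

# Example: orthogonal directions differ by 90 degrees.
print(angular_error_deg((1, 0, 0), (0, 1, 0)))  # 90.0
```

Because the metric depends only on direction, the vectors need not be unit length; scaling either argument leaves the error unchanged.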
**Sample Images**

  * Natural pointing
  * Tool pointing