Intelligent and Interactive Systems

==== Innsbruck Multi-View Hand Gesture (IMHG) Dataset ====

Hand gestures are a natural form of communication in human-robot interaction scenarios and can be used to delegate tasks from a human to a robot. To facilitate human-like interaction with robots, a key requirement is the availability of a hand gesture dataset against which recognition algorithms can be benchmarked.

**Dataset Features**

  * 22 participants performed 8 hand gestures in human-robot interaction scenarios taking place at close proximity.
  * The 8 hand gestures are categorized as:
    - 2 referencing (pointing) gestures, with the ground-truth location of the pointed-at target,
    - 2 symbolic gestures,
    - 2 manipulative gestures,
    - 2 interactional gestures.
  * A corpus of 836 test scenarios (704 referencing gestures with ground truth and 132 other gestures).
  * Hand gestures recorded from two views (frontal and lateral) using an RGB-D Kinect sensor.
  * The data acquisition setup can easily be recreated using a polar coordinate pattern, as shown in the figure below, to add new hand gestures in the future.
{{ :research:projects:3rdhand:data_acq_scene.png?nolink&300 |}}
  * Soon to be released publicly.
  * Currently available for [[https://iis.uibk.ac.at/public/datasets/imhgdataset/imhg_dataset.zip|Download]] (~888 MB) with authentication.
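Since the referencing gestures come with the ground-truth 3-D location of the pointed-at target, evaluating a pointing estimator typically involves back-projecting a detected pixel and its Kinect depth value into camera coordinates. A minimal sketch of this pinhole back-projection is shown below; the intrinsic parameters are typical Kinect v1 (640×480) values assumed for illustration, not calibration data from the IMHG dataset itself.

```python
import numpy as np

# Typical Kinect v1 intrinsics (640x480) -- assumed for illustration;
# the actual calibration of the IMHG recordings may differ.
FX, FY = 525.0, 525.0
CX, CY = 319.5, 239.5

def back_project(u, v, depth_m):
    """Back-project pixel (u, v) with depth in metres to a 3-D point
    in the camera frame using the pinhole camera model."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

# Hypothetical example: a fingertip detected at pixel (400, 240),
# 1.2 m from the sensor.
tip = back_project(400, 240, 1.2)
```

The resulting point can then be compared against the dataset's ground-truth target location, e.g. by intersecting the estimated pointing ray with the table plane.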

**Sample Scenarios**

{{ :research:projects:3rdhand:image_samples.png?nolink |}}

Gestures recorded from the frontal and lateral views. //T-B//: Finger pointing, Tool pointing, Thumb up (approve), Thumb down (disapprove), Grasp open, Grasp close, Receive, Fist (stop).
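The eight gesture classes and their four categories can be captured in a small lookup table. Note that the pairing of gestures to categories below follows the order in which both are listed on this page and is an assumption for illustration, not metadata taken from the dataset files.

```python
from collections import Counter

# Assumed gesture-to-category mapping, inferred from the listing order
# on this page (2 gestures per category).
GESTURE_CATEGORIES = {
    "finger_pointing": "referencing",
    "tool_pointing":   "referencing",
    "thumb_up":        "symbolic",       # approve
    "thumb_down":      "symbolic",       # disapprove
    "grasp_open":      "manipulative",
    "grasp_close":     "manipulative",
    "receive":         "interactional",
    "fist":            "interactional",  # stop
}

# Sanity check: 8 gestures, 2 per category.
counts = Counter(GESTURE_CATEGORIES.values())
```

A table like this is convenient for mapping per-frame class predictions back to the category-level results reported for the dataset.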

**Reference**

Dadhichi Shukla, Ozgur Erkent, Justus Piater. The IMHG dataset: A Multi-View Hand Gesture RGB-D Dataset for Human-Robot Interaction. Towards Standardized Experiments in Human Robot Interactions (workshop at IROS), 2015. Extended abstract. [[https://iis.uibk.ac.at/public/papers/Shukla-2015-StandardHRI.pdf|PDF]]

**BibTeX**

  @InProceedings{Shukla-2015-StandardHRI,
    title = {{The IMHG dataset: A Multi-View Hand Gesture RGB-D Dataset for Human-Robot Interaction}},
    author = {Shukla, Dadhichi and Erkent, Ozgur and Piater, Justus},
    booktitle = {{Towards Standardized Experiments in Human Robot Interactions}},
    year = 2015,
    month = 10,
    note = {Workshop at IROS},
    url = {https://iis.uibk.ac.at/public/papers/Shukla-2015-StandardHRI.pdf}
  }

**Acknowledgement**

This research has received funding from the European Community's Seventh Framework Programme FP7/2007-2013 (Specific Programme Cooperation, Theme 3, Information and Communication Technologies) under grant agreement no. 610878, [[http://3rdhandrobot.eu/|3rd HAND]].

**Contact**

dadhichi**[dot]**shukla**[at]**uibk**[dot]**ac**[dot]**at