<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="FeedCreator 1.8" -->
<?xml-stylesheet href="https://iis.uibk.ac.at/lib/exe/css.php?s=feed" type="text/css"?>
<rdf:RDF
    xmlns="http://purl.org/rss/1.0/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
    <channel rdf:about="https://iis.uibk.ac.at/feed.php">
        <title>IIS datasets</title>
        <description></description>
        <link>https://iis.uibk.ac.at/</link>
        <image rdf:resource="https://iis.uibk.ac.at/lib/tpl/iis/images/favicon.ico" />
        <dc:date>2026-04-12T21:13:38+0200</dc:date>
        <items>
            <rdf:Seq>
                <rdf:li rdf:resource="https://iis.uibk.ac.at/datasets/icare?rev=1606768892&amp;do=diff"/>
                <rdf:li rdf:resource="https://iis.uibk.ac.at/datasets/imhg?rev=1535996159&amp;do=diff"/>
                <rdf:li rdf:resource="https://iis.uibk.ac.at/datasets/ior?rev=1535996159&amp;do=diff"/>
                <rdf:li rdf:resource="https://iis.uibk.ac.at/datasets/ipo?rev=1535996159&amp;do=diff"/>
                <rdf:li rdf:resource="https://iis.uibk.ac.at/datasets/phoenix-annotations?rev=1535996159&amp;do=diff"/>
                <rdf:li rdf:resource="https://iis.uibk.ac.at/datasets/thumos14?rev=1726730243&amp;do=diff"/>
            </rdf:Seq>
        </items>
    </channel>
    <image rdf:about="https://iis.uibk.ac.at/lib/tpl/iis/images/favicon.ico">
        <title>IIS</title>
        <link>https://iis.uibk.ac.at/</link>
        <url>https://iis.uibk.ac.at/lib/tpl/iis/images/favicon.ico</url>
    </image>
    <item rdf:about="https://iis.uibk.ac.at/datasets/icare?rev=1606768892&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-11-30T21:41:32+0200</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>Innsbruck CNN Abstract Rule Eyetracking (ICARE) Dataset</title>
        <link>https://iis.uibk.ac.at/datasets/icare?rev=1606768892&amp;do=diff</link>
        <description>Innsbruck CNN Abstract Rule Eyetracking (ICARE) Dataset

Convolutional neural networks are widely used in image classification, but they perform poorly when classification depends on an abstract rule such as identity or symmetry. For this dataset we conducted a study with human participants on three different datasets based on abstract rules, using an eye tracker to record the participants' eye movements.</description>
    </item>
    <item rdf:about="https://iis.uibk.ac.at/datasets/imhg?rev=1535996159&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2018-09-03T19:35:59+0200</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>Innsbruck Multi-View Hand Gesture (IMHG) Dataset</title>
        <link>https://iis.uibk.ac.at/datasets/imhg?rev=1535996159&amp;do=diff</link>
        <description>Innsbruck Multi-View Hand Gesture (IMHG) Dataset

Hand gestures constitute a natural form of communication in human-robot interaction scenarios and can be used to delegate tasks from a human to a robot. A major requirement for advancing towards human-like interaction with robots is the availability of a hand gesture dataset for judging the performance of recognition algorithms.</description>
    </item>
    <item rdf:about="https://iis.uibk.ac.at/datasets/ior?rev=1535996159&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2018-09-03T19:35:59+0200</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>Innsbruck Object Relation Dataset</title>
        <link>https://iis.uibk.ac.at/datasets/ior?rev=1535996159&amp;do=diff</link>
        <description>Innsbruck Object Relation Dataset

This dataset contains the set of possible object-object spatial relations. Learning object-object relations is a difficult problem with sparse, noisy, corrupted, and incomplete information, which makes it an interesting and challenging machine learning problem. We formulate it as the problem of learning missing edges in a multigraph.</description>
    </item>
    <item rdf:about="https://iis.uibk.ac.at/datasets/ipo?rev=1535996159&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2018-09-03T19:35:59+0200</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>Innsbruck Pointing at Objects (IPO) Dataset</title>
        <link>https://iis.uibk.ac.at/datasets/ipo?rev=1535996159&amp;do=diff</link>
        <description>Innsbruck Pointing at Objects (IPO) Dataset

Deictic gestures – pointing at things in human-human collaborative tasks – constitute a pervasive, non-verbal way of communication, used e.g. to direct attention towards objects of interest. In a human-robot interaction scenario, one of the key requirements for delegating tasks from a human to a robot is to recognize the pointing gesture and estimate its pose.</description>
    </item>
    <item rdf:about="https://iis.uibk.ac.at/datasets/phoenix-annotations?rev=1535996159&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2018-09-03T19:35:59+0200</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>Acknowledgment</title>
        <link>https://iis.uibk.ac.at/datasets/phoenix-annotations?rev=1535996159&amp;do=diff</link>
        <description>Acknowledgment

This work was partially funded by the European Community's Seventh Framework Programme FP7/2007-2013 (Specific Programme Cooperation, Theme 3, Information and Communication Technologies) under grant agreement no. 231424, SignSpeak.</description>
    </item>
    <item rdf:about="https://iis.uibk.ac.at/datasets/thumos14?rev=1726730243&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2024-09-19T09:17:23+0200</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>Thumos 14</title>
        <link>https://iis.uibk.ac.at/datasets/thumos14?rev=1726730243&amp;do=diff</link>
        <description>Thumos 14

Download

Here are the links to the compressed (tar) files:

	*  Background (tar.gz, 428 GB)
	*  Test (tar, 711 GB)
	*  Train (tar.gz, 109 GB)
	*  Validation (tar, 567 GB)

xz files will be available for download later this week.</description>
    </item>
</rdf:RDF>
