We are providing two public data sets for the purposes of activity and object recognition.
The first data set (cooking) is dedicated to exploring methods for the recognition of everyday human activities in realistic scenarios. Here we captured human motions during two cooking scenarios. The uniqueness of our proposed data set is that the human motions were captured using three external cameras and two egocentric cameras.
The second data set (objects) is intended for analyzing techniques for object recognition under different constraints and lighting conditions. The images in this data set were obtained from the cameras of the iCub humanoid robot and include both low- and high-resolution images, which introduces considerable variability in lighting and background across the captured images.
This data is free to use in research projects. If you publish results obtained using this data, we would appreciate it if you acknowledged our work and sent the citation of your published paper to firstname.lastname@example.org.
The cooking data sets explore realistic scenarios, such as making pancakes or making sandwiches, because these scenarios contain several goal-directed movements of the kind typically observed in natural environments.
The object data set contains 16,500 training images, 8,250 validation images and 8,250 testing images, each with a resolution of 240×240. The 16,500 training images result from 500 images per setting × 3 settings × 11 objects.
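The split arithmetic above can be checked with a short sketch. The constant names below are illustrative and not part of the data set's actual file layout; the validation and testing counts simply correspond to half the training count, as stated above.

```python
# Counts as stated in the object data set description.
IMAGES_PER_SETTING = 500   # images per lighting/background setting
NUM_SETTINGS = 3           # number of settings
NUM_OBJECTS = 11           # number of object classes

training = IMAGES_PER_SETTING * NUM_SETTINGS * NUM_OBJECTS
validation = training // 2  # 8,250, per the stated split
testing = training // 2     # 8,250, per the stated split

print(training, validation, testing)  # → 16500 8250 8250
```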