Video Surveillance dataset for People Matching

This dataset was acquired and annotated to test the ability of algorithms to work in a constraint-free setting: the 5 cameras differ in resolution, position, and framerate; the scenario is non-planar; one camera is hand-held and moves freely; and no calibration information is provided.

Each video is 110 seconds long and contains manual annotations of each track, using IDs that are shared between cameras.
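Because track IDs are shared between cameras, finding the people visible in two views reduces to intersecting the ID sets of the two annotation files. The sketch below assumes a hypothetical per-camera annotation layout of one detection per line ("frame track_id x y w h"); the actual file format of this dataset may differ.

```python
# Hypothetical annotation format: one detection per line,
# "frame track_id x y w h" (the dataset's real layout may differ).

def load_track_ids(annotation_text):
    """Return the set of track IDs appearing in one camera's annotations."""
    ids = set()
    for line in annotation_text.strip().splitlines():
        frame, track_id, *box = line.split()
        ids.add(int(track_id))
    return ids

# Toy annotations for two cameras; IDs are shared across views.
cam1 = "1 7 10 10 40 80\n2 7 12 11 40 80\n1 3 200 50 35 90"
cam2 = "1 7 300 60 30 70\n1 9 50 40 35 85"

shared = load_track_ids(cam1) & load_track_ids(cam2)
print(sorted(shared))  # people annotated in both cameras -> [7]
```

Ground-truth sets built this way can then be compared against the ID assignments produced by a people-matching algorithm.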



The CONTACT Visuo-Motor Grasping dataBase

A grasping action from the VMGdB
Grasping is one of the most interesting challenges in robotics today, posing problems for the mechanical and electronic engineer, the computer vision researcher, the control theorist and, more recently, the neuroscientist. The study of human grasping has proved beneficial for gaining a better understanding of the problem.

We present VMGdB, the CONTACT Visuo-Motor Grasping Database, a recording of grasping actions performed by 20 human subjects on 7 objects using 5 ways of grasping, under variable illumination conditions.

The VMGdB consists of 5200 grasping acts organized in 260 data entries, each made of 2 video sequences recorded from two colour cameras and motor data recorded from a sensorised glove. Labeled data are available as standard AVI videos and a file of ASCII output from the glove.
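The glove's ASCII output lends itself to simple parsing. The sketch below is a minimal reader assuming a hypothetical layout of one whitespace-separated row of sensor readings per line; the database's actual column layout and sampling scheme are not specified here.

```python
# Minimal sketch for reading glove data from a VMGdB-style entry.
# Assumed (hypothetical) layout: one whitespace-separated row of
# numeric sensor readings per line in the ASCII file.

def parse_glove_ascii(text):
    """Parse whitespace-separated glove readings into a list of float rows."""
    rows = []
    for line in text.strip().splitlines():
        rows.append([float(value) for value in line.split()])
    return rows

# Toy sample: 2 time steps, 4 sensor values each.
sample = "0.00 12.5 3.1 7.8\n0.04 13.0 3.0 7.9"
readings = parse_glove_ascii(sample)
print(len(readings), len(readings[0]))  # -> 2 4
```

The paired AVI sequences can be read frame by frame with any standard video library (e.g. OpenCV's `VideoCapture`) and aligned with the glove rows.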