4DHumanPercept
4DHumanPercept is the first dataset of virtual human animations acquired with a 4D acquisition system and distorted along controlled factors, with corresponding perceptual similarity labels. The dataset is split into a training/validation set and a test set:
- The former contains 240 stimuli created from 8 acquired reference animations spanning different actors (1 female, 1 male), motions (walk, hop) and clothing (tight, loose), each distorted by 6 error types at 5 distortion levels (8 × 6 × 5 = 240; see the sketch after this list).
- The latter contains 10 stimuli obtained by randomly applying one distortion level to 8 newly acquired reference animations from 5 subjects (2 female, 3 male), wearing either tight or loose outfits and performing the walk or hop motions.
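As a reference, the following minimal Python sketch shows how the 240 training/validation stimuli factor into the stated conditions; the factor and error-type names are hypothetical placeholders, not the dataset's actual labels or file naming scheme.

```python
from itertools import product

# Hypothetical factor names -- only meant to illustrate how the
# training/validation stimuli decompose into references x errors x levels.
actors = ["female", "male"]        # 2 actors
motions = ["walk", "hop"]          # 2 motions
clothing = ["tight", "loose"]      # 2 clothing styles
error_types = [f"error_{i}" for i in range(1, 7)]  # 6 error types (names assumed)
levels = [1, 2, 3, 4, 5]           # 5 distortion levels

references = list(product(actors, motions, clothing))     # 8 reference animations
stimuli = list(product(references, error_types, levels))  # 8 * 6 * 5 = 240 stimuli

print(len(references), len(stimuli))  # -> 8 240
```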