Detecting & interpreting self-manipulating hand movements for student’s affect prediction
1 Department of Computer Science, Asian Institute of Technology, Bangkok, Thailand
2 Design Engineering Laboratory, KINPOE, Karachi, Pakistan
Human-centric Computing and Information Sciences 2012, 2:14. doi:10.1186/2192-1962-2-14. Published: 3 August 2012
In this paper, we report on the development of a non-intrusive system that predicts a student's mental state from his or her unintentional hand-touch-head (or face) movements.
Hand-touch-head (or face) movement is a typical case in which otherwise easily detectable image features are occluded, because the hand and face share similar skin color and texture. Our proposed scheme addresses this with the Sobel-operated local binary pattern (SLBP) method using force field features. We code six different gestures of more than 100 human subjects and use these codes as manual input to a three-layered Bayesian network (BN). The first layer holds mental-state-to-gesture relationships obtained in an earlier study, while the second layer embeds gestures and the SLBP-generated binary codes.
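The abstract does not specify how the Sobel operator and local binary patterns are combined, so the following is only a minimal sketch of one plausible reading: compute the Sobel gradient magnitude of the grayscale image, then take 8-neighbour LBP codes over that magnitude image. All function names here are illustrative, not the paper's implementation.

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude via the standard 3x3 Sobel kernels (borders left at 0)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w), dtype=float)
    gy = np.zeros((h, w), dtype=float)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    return np.hypot(gx, gy)

def lbp_codes(img):
    """Classic 8-neighbour local binary pattern over interior pixels."""
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Neighbour offsets, clockwise from the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            center = img[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if img[i + di, j + dj] >= center:
                    code |= 1 << bit
            codes[i - 1, j - 1] = code
    return codes

def slbp(gray):
    """Sketch of a Sobel-operated LBP: LBP computed on the Sobel magnitude."""
    return lbp_codes(sobel_magnitude(gray.astype(float)))
```

On an H-by-W grayscale array, `slbp` returns an (H-2)-by-(W-2) array of 8-bit codes, whose histogram could then serve as a texture descriptor for hand-versus-face discrimination.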
We find the method very successful in separating the hand(s) from the face region under varying illumination conditions. When evaluated on a novel data set, the proposed scheme yields promising results, with an accuracy of about 85%.
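The layered mental-state-to-gesture-to-code inference described above can be sketched as a toy discrete Bayesian network. Every state name, gesture label, code label, and probability below is a hypothetical placeholder; the paper's actual variables and conditional probability tables are not given in the abstract.

```python
# Toy three-node chain: mental state -> gesture -> observed SLBP code.
# All labels and numbers are illustrative placeholders, not the paper's values.

P_STATE = {"concentrating": 0.4, "thinking": 0.35, "tired": 0.25}

# P(gesture | mental state); each row sums to 1.
P_GESTURE = {
    "concentrating": {"chin_rest": 0.6, "cheek_touch": 0.3, "forehead_rub": 0.1},
    "thinking":      {"chin_rest": 0.3, "cheek_touch": 0.5, "forehead_rub": 0.2},
    "tired":         {"chin_rest": 0.2, "cheek_touch": 0.2, "forehead_rub": 0.6},
}

# P(observed SLBP code | gesture); likewise hypothetical.
P_CODE = {
    "chin_rest":    {"code_a": 0.7, "code_b": 0.2, "code_c": 0.1},
    "cheek_touch":  {"code_a": 0.2, "code_b": 0.6, "code_c": 0.2},
    "forehead_rub": {"code_a": 0.1, "code_b": 0.2, "code_c": 0.7},
}

def posterior_state(code):
    """P(state | observed code), marginalizing over the gesture layer."""
    joint = {}
    for state, p_s in P_STATE.items():
        joint[state] = p_s * sum(
            P_GESTURE[state][g] * P_CODE[g][code] for g in P_GESTURE[state]
        )
    z = sum(joint.values())  # normalize to a proper distribution
    return {s: p / z for s, p in joint.items()}
```

For example, observing `"code_c"` (most likely under `"forehead_rub"` in this toy table) shifts the posterior toward `"tired"`, which mirrors how an evidence code propagates up through the gesture layer to the mental-state layer.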
The framework will be used in developing an intelligent tutoring system.