Abstract

Human behavior and affect are inherently dynamic phenomena, involving the temporal evolution of patterns manifested through a multiplicity of non-verbal behavioral cues, including facial expressions, body postures and gestures, and vocal outbursts. A natural assumption for human behavior modeling is that a continuous-time characterization of behavior (e.g., continuous rather than discrete annotations of dimensional affect) is the output of a linear time-invariant system when behavioral cues act as the input. Here we study the learning of such a dynamical system under real-world conditions, namely in the presence of noisy behavioral cue descriptors and possibly unreliable annotations, by employing structured rank minimization. To this end, a novel structured rank minimization method and its scalable variant are proposed. The generalizability of the proposed framework is demonstrated through experiments on three distinct dynamic behavior analysis tasks, namely (i) conflict intensity prediction, (ii) prediction of valence and arousal, and (iii) tracklet matching. The attained results outperform those achieved by other state-of-the-art methods on these tasks and hence evidence the robustness and effectiveness of the proposed approach.
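The connection between linear time-invariant (LTI) dynamics and rank can be illustrated with a toy example: the noise-free response of a low-order LTI system yields a Hankel matrix whose rank equals the system order, which is why structured (Hankel) rank minimization is a natural tool for learning such systems from data. The sketch below, with arbitrarily chosen coefficients, is purely illustrative and is not the method proposed in the paper.

```python
import numpy as np

def hankel(y, rows):
    """Build a Hankel matrix (constant anti-diagonals) from a 1-D sequence y."""
    cols = len(y) - rows + 1
    return np.array([y[i:i + cols] for i in range(rows)])

# Free response of a 2nd-order LTI (autoregressive) system with
# illustrative, arbitrarily chosen coefficients.
a = np.array([1.5, -0.7])
y = np.zeros(50)
y[0], y[1] = 1.0, 0.5
for t in range(2, 50):
    y[t] = a[0] * y[t - 1] + a[1] * y[t - 2]

# The Hankel matrix of a noise-free order-2 response has rank 2,
# since every row satisfies the same 2-term recurrence.
H = hankel(y, 10)
print(np.linalg.matrix_rank(H))  # 2, the system order
```

With noisy descriptors and unreliable annotations, this exact low-rank structure no longer holds, which motivates formulating learning as a structured rank minimization problem rather than exact realization.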
Georgakis, Christos, Panagakis, Yannis and Pantic, Maja (2018) Dynamic behavior analysis via structured rank minimization. International Journal of Computer Vision, 126 (2-4). pp. 333-357. ISSN 1573-1405