Program

The program encompasses keynotes by Dr. Pushmeet Kohli and Dr. Gualtiero Volpe, four oral sessions, two poster sessions, and one panel session. The tentative program is as follows:

[9:00–9:10] Welcome

Creative applications of human behavior understanding: interactions in arts, creativity, entertainment, and edutainment
[9:10–10:00] Keynote: Multimodal Systems for Embodied Experience of Music and Audiovisual Content
[10:00–10:20] A Behavioral Study on the Effects of Rock Music on Auditory Attention
[10:20–10:40] Human Nonverbal Behaviour Understanding in the Wild for New Media Art
[10:40–11:00] Creative Dance: an Approach for Social Interaction Between Robots and Children

[11:00–11:30] Coffee break and poster session I
- Stylistic features for affect-based movie recommendations
- ATTENTO: ATTENTion Observed for Automated Spectator Crowd Monitoring *
- Human Behavior Understanding with wide area sensing floors
- Real-Time Comprehensive Sociometrics for Two-Person Dialogs

Social and affective signals I
[11:30–11:50] NovA: Automated Analysis Of Nonverbal Signals In Social Interactions
[11:50–12:10] Towards Real-time Continuous Emotion Recognition from Body Movements
[12:10–12:30] Head, shoulders and hips behaviors during turning **
[12:30–12:50] Social behavior modeling based on Incremental Discrete Hidden Markov Models

[12:50–14:40] Lunch break

Action and activity recognition
[14:40–15:30] Keynote: Learning to Interact (Naturally) with (All) Users
[15:30–15:50] Transfer Learning of Human Poses for Action Recognition
[15:50–16:10] Dynamic Feature Selection for Online Action Recognition
[16:10–16:30] A Fully Unsupervised Approach to Activity Discovery

[16:30–17:00] Coffee break and poster session II
- Efficient Graph Construction for Label Propagation based Multi-observation Face Recognition
- Multiple Local Curvature Gabor Binary Patterns for Facial Action Recognition
- A Dense Deformation Field for Facial Expression Analysis in Dynamic Sequences of 3D Scans
- MMLI: Multimodal Multiperson Corpus of Laughter in Interaction

Social and affective signals II
[17:00–17:20] Human behaviour in HCI: Complex Emotion Detection through Sparse Speech Features
[17:20–17:40] VIP: A complete framework for computational eye-gaze research

[17:40–18:30] Panel discussion: Challenges in creative applications of human behavior understanding
Panelists: Gualtiero Volpe, Pushmeet Kohli, Rita Cucchiara, Albert Ali Salah, Hayley Hung, Oya Aran, Hatice Gunes

* Erratum to "ATTENTO: ATTENTion Observed for Automated Spectator Crowd Monitoring": The third formula should have only the absolute value, and no fraction. Download the correct formula here.

** Erratum to "Head, shoulders and hips behaviors during turning": In the conclusions, it is stated that the occurrence of the maximum head yaw was also affected by the turn angle, leading to a linear increase from 35 degrees to 180 degrees. This range should be from 45 degrees to 180 degrees.