Workshop description
1st Workshop on
Faces and Gestures in E-health and Welfare (FaGEW)
Workshop at IEEE International Conference on Automatic Face and Gesture Recognition 2020
Recent advances in technology have boosted the development and release of active assisted living devices based on wearable and/or non-obtrusive visual and multi-modal signals for e-health and welfare support. These solutions are seamlessly integrated into the environment; examples include sensor-based systems installed in elderly people’s homes for ambient monitoring and intelligent visual warning.
In addition, research on ubiquitous computing has favored the implementation of more user-centered applications such as virtual tutoring, coaching agents, physical rehabilitation, and psychological therapy systems. To provide features that meet the user's requirements, expectations, and acceptance, the trend is now shifting towards the design of empathic solutions tailored to individual user needs. The new assistive systems must be able to understand the user’s behaviors, mood, and intentions and react to them accordingly in real time, as well as detect changes in behavior and health state in a timely manner. Furthermore, such systems are expected to infer the user’s traits, attitudes, and psychological profile in order to deliver a more personalized user-machine interaction.

These solutions require advanced computer vision and machine learning techniques, such as facial expression analysis, gaze and pose estimation, and gesture recognition, in addition to behavioral and psychological theories for modeling individual profiles. While these tasks currently achieve outstanding performance in controlled and prototypical environments (e.g., detection of facial expressions of emotion on static faces), the challenge lies in their integration and application in naturalistic scenarios, where extensive sources of variability (pose, age, behaviors, moods, illumination conditions, and dynamic speaking emotional faces, among others) affect the processing of the detected signals.
The Faces and Gestures in E-health and Welfare workshop aims to provide a common venue for multidisciplinary researchers and practitioners in this area to share their latest approaches and findings, and to discuss the current challenges of machine learning- and computer vision-based e-health and welfare applications. The focus is on the use of single- or multi-modal face, gesture, and pose analysis. We expect this workshop to increase the visibility and importance of this area and, in the short term, to contribute to pushing the state of the art in the automatic analysis of human behavior for health and wellbeing applications.
Topics of interest include, but are not limited to:
- Multi-modal integration
- Psychological profiling from (audio)-visual and/or multi-modal data
- Approaches based on behavioral models from psychology
- Mobile-based and human-computer interaction applications
- User understanding in human-computer interaction
- Human behavior analysis for health and wellbeing support
- Assistive technologies for supporting vulnerable people
- Virtual avatars and coaching
- Physical and psychological therapy systems
- Assistive care
- User acceptance of empathic assistive systems
- Real-time applications
- Datasets