Today, indications and warnings, for example in driver assistance systems, are usually designed in a purely function-based way. The interaction concepts on which they are based hardly allow for user-specific adjustment, and key aspects of human interaction, such as social and emotional information, are not taken into account.
The project investigates how information about the social and emotional state of the driver can be usefully integrated into assistance systems. Through user-centered individualization, driver assistance systems should respond to the emotional, social, and cognitive states of the driver, just as a human passenger would.
To this end, pattern recognition algorithms are developed to classify social-emotional driver states, and their effects on the driver's cognition and performance are examined. The findings feed into the development of an adaptive user model that predicts cognitive parameters based on the driver's state. For the necessary observation of the vehicle interior, the special requirements and regulations on data protection and data security are taken into account.
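The pipeline described above, classifying a social-emotional driver state and feeding it into an adaptive user model that predicts cognitive parameters, could be sketched roughly as follows. This is a minimal illustration only: all feature names, state labels, weights, and the nearest-centroid classifier are assumptions for the sketch, not methods or data from the project.

```python
from math import dist

# Hypothetical feature vectors derived from interior sensing, e.g.
# (arousal, negative valence, speech rate) — purely illustrative values.
STATE_CENTROIDS = {
    "calm":       (0.2, 0.1, 0.3),
    "stressed":   (0.8, 0.7, 0.6),
    "distracted": (0.5, 0.3, 0.9),
}

def classify_state(features):
    """Nearest-centroid classification of a social-emotional driver state."""
    return min(STATE_CENTROIDS, key=lambda s: dist(features, STATE_CENTROIDS[s]))

class AdaptiveUserModel:
    """Maps the classified state to a predicted cognitive-load score and
    adapts a per-driver baseline from observed performance feedback."""

    def __init__(self):
        self.baseline = 0.5  # individual prior, updated online per driver
        self.state_weight = {"calm": 0.0, "stressed": 0.3, "distracted": 0.4}

    def predict_load(self, state):
        # Predicted cognitive load, clipped to [0, 1]
        return min(1.0, self.baseline + self.state_weight[state])

    def update(self, observed_load, rate=0.1):
        # Simple exponential update of the individual baseline
        self.baseline += rate * (observed_load - self.baseline)

state = classify_state((0.75, 0.65, 0.55))
model = AdaptiveUserModel()
print(state, model.predict_load(state))  # → stressed 0.8
```

In a real system the classifier would be trained on labeled interior-sensing data and the user model validated against measured cognitive parameters; the sketch only shows how the two stages connect.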
To improve system acceptance and usability, the project establishes the basic conditions for the future integration of social and emotional information into user-centered interaction concepts.