People with incipient dementia are still able to manage their daily lives independently. They try to conceal their phases of confusion from others and to continue participating in public life for as long as possible. While we consider a lost moment at home, within a sheltered environment, relatively uncritical, disorientation in transit may have severe consequences. Smart technical support systems could help affected persons recall their temporarily forgotten intentions on their own. However, such systems must react automatically, i.e., without any interaction by their users, and offer context-sensitive aid precisely at the moment of confusion. Electronic reminders and to-do lists are ineffective when disoriented persons do not think of their smartphones.
This is the starting point of our considerations: we want to know whether it is technically feasible to automatically detect and quantify moments of confusion, and if so, which measuring method is best suited to public mobility. We know that dementia cannot be measured via brain waves, and other vital signs are likewise unsuitable for real-time recognition. Hence, we focus on visually and kinetically identifiable symptoms, such as uncontrolled or abnormal movements of the body or its parts (e.g., repetitive head turns or rapid eye movements).
We want to explore whether typical symptoms of confusion can be mapped to behavioural patterns, classify these patterns, and investigate technical measuring methods for their reliable detection. This includes body-worn devices (e.g., accelerometers or eye-tracking cameras) as well as infrastructural solutions (e.g., depth cameras for analysing movement patterns; keywords: attention estimation, gesture recognition).
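To illustrate how one such symptom might be mapped to a measurable pattern, the following sketch flags repetitive head turns in gyroscope yaw-rate data by counting left/right direction changes within a fixed window. All names, thresholds, and the windowing scheme are illustrative assumptions for this sketch, not validated clinical parameters or part of the project's actual method.

```python
# Sketch: flagging repetitive head turns in gyroscope yaw-rate samples.
# Thresholds (min_rate, window, threshold) are hypothetical placeholders.
import math

def count_direction_changes(yaw_rate, min_rate=0.5):
    """Count sign changes in the yaw rate, ignoring near-zero noise."""
    changes = 0
    last_sign = 0
    for r in yaw_rate:
        if abs(r) < min_rate:          # below assumed noise floor: ignore
            continue
        sign = 1 if r > 0 else -1
        if last_sign and sign != last_sign:
            changes += 1
        last_sign = sign
    return changes

def flag_confusion(yaw_rate, window=50, threshold=4):
    """Flag windows containing unusually many left/right head turns."""
    flags = []
    for start in range(0, len(yaw_rate) - window + 1, window):
        segment = yaw_rate[start:start + window]
        flags.append(count_direction_changes(segment) >= threshold)
    return flags

# Synthetic demo: calm walking vs. rapid left/right scanning.
calm = [0.1 * math.sin(i / 10) for i in range(50)]     # low-amplitude sway
scanning = [2.0 * math.sin(i / 3) for i in range(50)]  # fast alternating turns
print(flag_confusion(calm + scanning))  # → [False, True]
```

A real system would of course need validated features and classifiers trained on data from affected persons; this sketch only shows that such a symptom is, in principle, reducible to a simple signal statistic.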
The objective of this project is to design and implement various technical procedures for confusion recognition, to conduct a study with affected patients, and to compare the measuring methods in terms of reliability, practical applicability, and economic viability. The results of the study are intended to provide a basis for further application-oriented research, in particular for developing suitable user interfaces for context-based support systems that react automatically at the moment of confusion.