UBC Theses and Dissertations



Improving prediction of user cognitive abilities and performance for user-adaptive narrative visualizations by leveraging eye-tracking data from multiple user studies

Iranpour, Alireza


Previous work leveraged eye-tracking to predict a user's levels of cognitive abilities and performance while reading magazine-style narrative visualizations (MSNVs), a common type of multimodal document that combines text and visualization to narrate a story. The eye-tracking data used for training the classifiers came from a user study, called the control study, in which subjects simply read through MSNVs without receiving any adaptive guidance (the control condition). The goal was to capture the relationship between users' normal MSNV processing and their levels of cognitive abilities and performance, and to use that relationship to drive personalization. In addition to the control study, two other user studies, called the adaptive studies, were previously conducted to investigate the benefits of adaptive support. In these studies, subjects were provided with gaze-based interventions to facilitate their processing of the MSNVs. In the control study, there was no intervention and the MSNVs did not adapt to the users in any way, because the idea was to make predictions based on users' normal, unguided MSNV processing and then use those predictions to deliver the appropriate adaptations. In the adaptive studies, however, the interventions influenced how subjects processed the MSNVs. As a result, their gaze behavior did not represent how they would have behaved in the intended control condition, and a classifier trained solely on eye-tracking data from these studies would not learn the proper relationship. In this thesis, we propose different strategies for combining the additional eye-tracking data from the adaptive studies with our original data from the control study to mitigate these differences and form more consistent combinations conducive to improved performance. Our results show that the additional eye-tracking data can significantly improve the accuracy of our classifiers.
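To make the data-combination idea concrete, the following is a minimal illustrative sketch, not the thesis's actual pipeline: it pools hypothetical gaze-feature samples from a control study with extra samples from adaptive studies before training a classifier that predicts a user's cognitive-ability level. The feature values, labels, and the simple nearest-centroid classifier are all placeholder assumptions for illustration.

```python
# Hedged sketch: pooling control-study and adaptive-study eye-tracking
# samples to train one classifier. All numbers below are made up; a
# nearest-centroid classifier stands in for whatever model is used.

def centroid(rows):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train_nearest_centroid(samples):
    """samples: list of (features, label). Returns one centroid per label."""
    by_label = {}
    for x, y in samples:
        by_label.setdefault(y, []).append(x)
    return {y: centroid(rows) for y, rows in by_label.items()}

def predict(model, x):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
    return min(model, key=lambda y: dist(model[y]))

# Hypothetical gaze features: (mean fixation duration, fixation count).
control = [([0.20, 30.0], "low"), ([0.25, 28.0], "low"),
           ([0.40, 55.0], "high"), ([0.45, 60.0], "high")]
adaptive = [([0.22, 32.0], "low"), ([0.42, 58.0], "high")]

# Strategy 1: train on control data only.
model_ctrl = train_nearest_centroid(control)
# Strategy 2: pool the adaptive-study data with the control data.
model_pool = train_nearest_centroid(control + adaptive)

print(predict(model_pool, [0.41, 57.0]))  # classifies an unseen user
```

In practice the combination strategies would need to account for the distribution shift that the interventions introduce, which is exactly the mismatch the abstract describes; naive pooling as above is only the simplest baseline.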



Attribution-NonCommercial-NoDerivatives 4.0 International