Abstract
Observational studies are widely used in Human-Computer Interaction (HCI) research to evaluate usability and user experience with technologies. However, the act of observation may influence participant behaviour and performance, threatening the validity of study findings. This paper investigates the impact of three observation types on participant outcomes in a simulated HCI study context. Participants completed Sudoku puzzles under four conditions: baseline (no observation), human observation, sensor-based observation, and combined human/sensor observation. Performance was assessed by puzzle completion rates, and mental workload was measured via NASA-TLX surveys, heart rate, galvanic skin response, and infrared thermal imaging. Results showed that observation negatively impacted performance relative to baseline, with human observers inducing the greatest distraction. Experienced participants were more affected than novices, and task medium also influenced engagement and reactivity to observation. These findings demonstrate that observation introduces bias in HCI research, underscoring the need for careful selection of observation methods to improve the validity of results.
| Original language | English |
|---|---|
| Title of host publication | 2024 IEEE 48th Annual Computers, Software, and Applications Conference (COMPSAC) |
| Publisher | IEEE |
| Pages | 611-620 |
| Number of pages | 10 |
| ISBN (Electronic) | 9798350376968 |
| ISBN (Print) | 9798350376975 |
| DOIs | |
| Publication status | Published - 2024 |
Keywords
- Human-Computer Interaction
- Observation Effects
- Task Performance
- Mental Workload
- Sudoku
- Task Medium
- Task Engagement