The Audience Effect: Do Observations Change Outcomes in HCI Studies?

Huimin Tang, Boon Giin Lee, Dave Towey, Kaiyi Chen, Yichu Fang, Runzhou Zhang, Matthew Pike

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

Observational studies are widely used in Human-Computer Interaction (HCI) research to evaluate usability and user experience with technologies. However, the act of observation may itself influence participant behaviour and performance, threatening the validity of study findings. This paper investigates the impact of three observation types on participant outcomes in a simulated HCI study context. Participants completed Sudoku puzzles under four conditions: baseline (no observation), human observation, sensor-based observation, and combined human and sensor observation. Performance was assessed by puzzle completion rates, and mental workload was measured via NASA-TLX surveys, heart rate, galvanic skin response, and infrared thermal imaging. Results showed that observation negatively impacted performance relative to baseline, with human observers inducing the greatest distraction. Experienced participants were more affected than novices, and the task medium also influenced engagement and reactivity to observation. These findings demonstrate that observation introduces bias in HCI research, emphasising the need for careful consideration of observation methods to improve result validity.
Original language: English
Title of host publication: 2024 IEEE 48th Annual Computers, Software, and Applications Conference (COMPSAC)
Publisher: IEEE
Pages: 611-620
Number of pages: 10
ISBN (Electronic): 9798350376968
ISBN (Print): 9798350376975
DOIs
Publication status: Published - 2024

Keywords

  • Human-Computer Interaction
  • Observation Effects
  • Task Performance
  • Mental Workload
  • Sudoku
  • Task Medium
  • Task Engagement
