Open Collections / UBC Research Data

Assessing construct reliability through open-ended survey response analysis
Koralesky, Katherine E.; von Keyserlingk, Marina A.G.; Weary, Daniel M.
Description
Online surveys often include quantitative attention checks, but inattentive participants might also be identified using their qualitative responses. We used the software Turnitin™ to assess the originality of open-ended responses in four mixed-method online surveys that included validated multi-item rating scales. Across surveys, 18-35% of participants were identified as having copied responses from online sources. We assessed indicator reliability and internal consistency reliability and found that both were lower for participants identified as using copied text versus those who wrote more original responses. Those who provided more original responses also provided more consistent responses to the validated scales, suggesting that these participants were more attentive. We conclude that this process can be used to screen qualitative responses from online surveys. We encourage future research to replicate this screening process using similar tools, investigate strategies to reduce copying behaviour, and explore the motivation of participants to search for information online.
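The screening approach described above pairs an originality check on the open-ended text with a reliability comparison on the validated rating scales. The sketch below illustrates the general idea in Python on synthetic data: participants are split by a hypothetical similarity cutoff, and internal consistency is computed for each group using Cronbach's alpha, one common measure of internal consistency reliability. The column names, the 50% cutoff, and the data itself are illustrative assumptions and are not taken from this dataset or the authors' code.

```python
# Minimal sketch (not the authors' pipeline): compare internal consistency
# (Cronbach's alpha) of a multi-item rating scale between participants flagged
# as having copied open-ended responses and those who wrote original text.
# Column names, the similarity cutoff, and the synthetic data are assumptions.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of scale items (rows = participants)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic example data: five 7-point scale items plus a Turnitin-style
# similarity score (percent of text matched to online sources).
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame(rng.integers(1, 8, size=(n, 5)),
                  columns=[f"item{i}" for i in range(1, 6)])
df["similarity"] = rng.uniform(0, 100, size=n)

# Assumed screening rule: flag responses whose similarity exceeds a cutoff.
copied = df["similarity"] > 50
scale_cols = [f"item{i}" for i in range(1, 6)]

print("alpha (copied):  ", round(cronbach_alpha(df.loc[copied, scale_cols]), 3))
print("alpha (original):", round(cronbach_alpha(df.loc[~copied, scale_cols]), 3))
```

With real survey data, the comparison of the two alpha values (and of indicator reliability, which the abstract also mentions) is what indicates whether flagged participants were less attentive; on the synthetic data above the output is only illustrative.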
Item Metadata
Title | Assessing construct reliability through open-ended survey response analysis
Creator | Koralesky, Katherine E.; von Keyserlingk, Marina A.G.; Weary, Daniel M.
Contributor |
Date Issued | 2023-04-14
Description | (see Description above)
Subject |
Type |
Language | English
Date Available | 2023-04-13
Provider | University of British Columbia Library
License | CC-BY 4.0
DOI | 10.14288/1.0431052
URI |
Publisher DOI |
Rights URI |
Country | United States
Aggregated Source Repository | Dataverse