A growing share of social surveys is now conducted online. We therefore investigated a screening method that plants instructed response items (IRIs) in a survey to eliminate inattentive respondents. Two web surveys were conducted, with four IRIs in Study 1 (n = 2,490) and three in Study 2 (n = 2,000). The objectives were twofold: to find an appropriate number of IRIs for a web survey, and to examine the differences between the two groups categorized by the IRIs, the original and the screened data. In Study 1, 1,935 of the 2,490 respondents passed the first three IRIs and were considered attentive; the remaining 555 were eliminated from the data analysis based on a response tree analysis. The two groups were compared on a 24-item quality-of-life scale, once with all respondents and once with only the attentive. The difference in mean scores between the two groups was statistically significant, but it was minor because the two groups shared 1,935 respondents. Item characteristic curves from the 2-parameter logistic model were compared between the attentive (n = 1,935) and the inattentive (n = 555) respondents; the differences were distinctly visible, supporting the decision to eliminate the 555 respondents. In Study 2, respondents were asked their birthday, from which their age was calculated; the calculated age was then checked against the age provided by the web survey company. The rate of correct responses increased monotonically with the level of attentiveness. We conclude that the evidence indicates IRIs work well for detecting inattentive respondents, and we tentatively recommend that three IRIs per survey suffice to detect them. Finally, we discuss the treatment of respondents whose attentiveness falls in a gray zone.
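The screening described above can be illustrated with a minimal sketch. The item identifiers, instructed answers, and response data below are hypothetical, not taken from the study; the 2-parameter logistic (2PL) item characteristic curve is the standard IRT formula P(θ) = 1 / (1 + exp(−a(θ − b))), shown only to indicate the kind of curve compared between groups.

```python
import math

def count_iris_passed(responses, iri_keys):
    """Count how many instructed response items (IRIs) a respondent
    answered exactly as instructed.
    responses: dict item_id -> given answer
    iri_keys:  dict IRI item_id -> instructed (correct) answer
    """
    return sum(1 for item, correct in iri_keys.items()
               if responses.get(item) == correct)

def icc_2pl(theta, a, b):
    """Item characteristic curve under the 2-parameter logistic model:
    probability of a correct/keyed response at ability theta, with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical example with three IRIs (Study 2 used three):
iri_keys = {"iri1": "strongly agree", "iri2": "option 3", "iri3": "never"}
resp = {"iri1": "strongly agree", "iri2": "option 3", "iri3": "sometimes"}

passed = count_iris_passed(resp, iri_keys)   # 2 of 3 IRIs passed
attentive = passed == len(iri_keys)          # strict rule: pass all IRIs
```

Under the strict rule sketched here this respondent would be flagged; in practice, as the abstract notes, respondents with gray-zone attentiveness (some but not all IRIs passed) require a separate decision.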