Host: The Japanese Society for Cognitive Psychology
Recent psychological studies have often administered online surveys whose participants were recruited through crowdsourcing services. In particular, Amazon Mechanical Turk (AMT) has become a popular tool for recruiting participants. Beyond survey research that was previously conducted with paper-and-pencil questionnaires, some empirical studies have run behavioral experiments online and successfully replicated laboratory results. Although these attempts validated the use of crowdsourcing as a participant pool, AMT still has limitations. The major one is limited sample diversity: the majority of AMT participants are Caucasian residents of the United States. The present study aimed to validate a non-AMT crowdsourcing service that allows researchers to collect participants whose demographic characteristics differ from those of AMT. The results of five online experiments requiring precise millisecond control of stimulus presentation and response recording were generally in line with previous studies. This finding suggests the viability of online experimentation in cognitive research.
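As a side note on how millisecond-level timing is typically achieved in browser-based experiments, the sketch below uses the standard high-resolution clock (performance.now(), which returns sub-millisecond timestamps) to measure reaction times. This is a generic illustration, not the authors' implementation; the function names are hypothetical.

```javascript
// Minimal reaction-time timer sketch for a browser-based experiment.
// Assumption: a browser-like environment exposing performance.now().
// makeTrialTimer is an illustrative name, not from the study.
function makeTrialTimer() {
  let onset = null;
  return {
    // Call when the stimulus is actually drawn, e.g. inside a
    // requestAnimationFrame callback, so the timestamp aligns with
    // the screen refresh rather than with script execution.
    markOnset() {
      onset = performance.now();
    },
    // Call from the response handler (e.g. a keydown listener);
    // returns the elapsed time in milliseconds.
    reactionTime() {
      if (onset === null) throw new Error("markOnset() was never called");
      return performance.now() - onset;
    },
  };
}
```

In practice, stimulus onset is marked inside a requestAnimationFrame callback and the response is timestamped in the event listener, so the measured interval reflects display-to-response latency as closely as the browser allows.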