Conference Paper: Visual psychophysics on the web: open-access tools, experiments, and results using online platforms
Title | Visual psychophysics on the web: open-access tools, experiments, and results using online platforms |
---|---|
Authors | Rajananda, S; Peters, MAK; Lau, HW; Odegaard, B |
Issue Date | 2018 |
Publisher | Association for Research in Vision and Ophthalmology. The Journal's web site is located at http://www.journalofvision.org/ |
Citation | 18th Annual Meeting of Vision Sciences Society (VSS 2018), St. Pete Beach, FL, 18-23 May 2018, In Journal of Vision, 2018, v. 18 n. 10, p. 299-299 |
Abstract | In the last several years, web-based experiments with visual stimuli have become increasingly common as researchers have utilized online paradigms to facilitate fast data collection with large samples. However, few open-access tools exist for conducting rigorous visual psychophysical studies on the internet. Here, we present new tools to enable vision science in web browsers, as well as sample experiments and results which demonstrate their viability. These tools include several methods to estimate psychophysical threshold parameters that run entirely in JavaScript/CSS/HTML, including the PEST adaptive staircase procedure and the Confidence Signal Detection model (Yi & Merfeld, 2016), which leverages confidence judgments to estimate thresholds with a small number of trials. We also present the first open-access random-dot kinematogram which runs entirely in web browsers and includes parameters to customize coherence levels, aperture shape, dot size, and other features. Our initial experiments on human motion perception demonstrate three important findings: (1) with our tools, motion threshold parameters estimated from online subjects are comparable to those estimated in controlled laboratory environments; (2) our web-based implementation of new methods facilitates faster threshold estimation than traditional methods; (3) data from online subjects indicate that these participants are much more demographically diverse than those tested in university laboratories. We have also developed new paradigms for testing peripheral color perception online, and results show that observers often overestimate how saturated parafoveal visual stimuli truly are. Finally, we will discuss results from recent investigations into differences between foveal and parafoveal motion perception.
Together, these experiments demonstrate that despite sacrificing a degree of experimental control, rigorous web-based psychophysics is quite possible; our initial results provide promising evidence to motivate future development of online tools for vision science. |
Description | abstract |
Persistent Identifier | http://hdl.handle.net/10722/276274 |
ISSN | 1534-7362 (2023 Impact Factor: 2.0; 2023 SCImago Journal Rankings: 0.849) |
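The abstract describes threshold-estimation procedures that run entirely in JavaScript, including the PEST adaptive staircase. As a rough illustration of the staircase idea only — this is not the authors' released code, and it uses a simpler fixed-step 1-down/1-up rule rather than PEST's run-based step-size adjustments — a browser-ready sketch might look like:

```javascript
// Minimal adaptive staircase sketch (1-down/1-up, fixed step size).
// A hypothetical simplification for illustration; real PEST changes
// step size based on response-run history (Taylor & Creelman, 1967).
function makeStaircase({ start = 0.5, step = 0.05, min = 0, max = 1 } = {}) {
  let level = start;
  return {
    level: () => level,
    // After each trial, lower the stimulus level on a correct response
    // and raise it on an incorrect one, clamped to [min, max].
    update(correct) {
      level += correct ? -step : step;
      level = Math.min(max, Math.max(min, level));
      return level;
    },
  };
}

// Example run against a deterministic stand-in observer whose
// threshold is 0.3: the staircase walks down and then oscillates
// around that value.
const stair = makeStaircase({ start: 0.8, step: 0.1 });
for (let trial = 0; trial < 20; trial++) {
  stair.update(stair.level() > 0.3);
}
```

Such a 1-down/1-up rule converges near the 50%-correct point; PEST and the Confidence Signal Detection model cited in the abstract aim to reach a stable estimate in fewer trials.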
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Rajananda, S | - |
dc.contributor.author | Peters, MAK | - |
dc.contributor.author | Lau, HW | - |
dc.contributor.author | Odegaard, B | - |
dc.date.accessioned | 2019-09-10T02:59:35Z | - |
dc.date.available | 2019-09-10T02:59:35Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | 18th Annual Meeting of Vision Sciences Society (VSS 2018), St. Pete Beach, FL, 18-23 May 2018, In Journal of Vision, 2018, v. 18 n. 10, p. 299-299 | - |
dc.identifier.issn | 1534-7362 | - |
dc.identifier.uri | http://hdl.handle.net/10722/276274 | - |
dc.description | abstract | - |
dc.description.abstract | In the last several years, web-based experiments with visual stimuli have become increasingly common as researchers have utilized online paradigms to facilitate fast data collection with large samples. However, few open-access tools exist for conducting rigorous visual psychophysical studies on the internet. Here, we present new tools to enable vision science in web browsers, as well as sample experiments and results which demonstrate their viability. These tools include several methods to estimate psychophysical threshold parameters that run entirely in JavaScript/CSS/HTML, including the PEST adaptive staircase procedure and the Confidence Signal Detection model (Yi & Merfeld, 2016), which leverages confidence judgments to estimate thresholds with a small number of trials. We also present the first open-access random-dot kinematogram which runs entirely in web browsers and includes parameters to customize coherence levels, aperture shape, dot size, and other features. Our initial experiments on human motion perception demonstrate three important findings: (1) with our tools, motion threshold parameters estimated from online subjects are comparable to those estimated in controlled laboratory environments; (2) our web-based implementation of new methods facilitates faster threshold estimation than traditional methods; (3) data from online subjects indicate that these participants are much more demographically diverse than those tested in university laboratories. We have also developed new paradigms for testing peripheral color perception online, and results show that observers often overestimate how saturated parafoveal visual stimuli truly are. Finally, we will discuss results from recent investigations into differences between foveal and parafoveal motion perception.
Together, these experiments demonstrate that despite sacrificing a degree of experimental control, rigorous web-based psychophysics is quite possible; our initial results provide promising evidence to motivate future development of online tools for vision science. | - |
dc.language | eng | - |
dc.publisher | Association for Research in Vision and Ophthalmology. The Journal's web site is located at http://www.journalofvision.org/ | - |
dc.relation.ispartof | Journal of Vision | - |
dc.title | Visual psychophysics on the web: open-access tools, experiments, and results using online platforms | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Lau, HW: oldchild@hku.hk | - |
dc.identifier.authority | Lau, HW=rp02270 | - |
dc.identifier.doi | 10.1167/18.10.299 | - |
dc.identifier.hkuros | 304560 | - |
dc.identifier.volume | 18 | - |
dc.identifier.issue | 10 | - |
dc.identifier.spage | 299 | - |
dc.identifier.epage | 299 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 1534-7362 | - |