Blurring the boundary between models and reality: Visual perception of scale assessed by performance

Abstract

One of the primary jobs of visual perception is to build a three-dimensional representation of the world around us from our flat retinal images. These are a rich source of depth cues but no single one of them can tell us about scale (i.e., absolute depth and size). For example, the pictorial depth cues in a (perfect) scale model are identical to those in the real scene that is being modelled. Here we investigate image blur gradients, which derive naturally from the limited depth of field available for any optical device and can be used to help estimate visual scale. By manipulating image blur artificially to produce what is sometimes called fake tilt shift miniaturization, we provide the first performance-based evidence that human vision uses this cue when making forced-choice judgements about scale (identifying which of an image pair was a photograph of a full-scale railway scene, and which was a 1:76 scale model). The orientation of the blur gradient (relative to the ground plane) proves to be crucial, though its rate of change is less important for our task, suggesting a fairly coarse visual analysis of this image parameter.
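The "fake tilt shift" manipulation described above can be illustrated with a minimal sketch: blur increases with vertical distance from a chosen focal row, mimicking the narrow depth of field of a close-up photograph. This is not the authors' stimulus-generation code (available on the OSF repository); it is a simplified pure-NumPy illustration using a horizontal box blur per row, and the function name and parameters are hypothetical.

```python
import numpy as np

def fake_tilt_shift(img, focus_row, max_kernel=15):
    """Apply a vertical blur gradient to a 2-D grayscale image.

    The row at `focus_row` stays sharp; rows further away are blurred
    with progressively wider box-blur kernels, a crude stand-in for the
    defocus gradient that miniaturization effects exploit.
    (Hypothetical illustration, not the published stimulus code.)
    """
    h, w = img.shape
    out = np.empty_like(img, dtype=float)
    for r in range(h):
        # Blur magnitude grows linearly with distance from the focal row.
        frac = abs(r - focus_row) / max(h - 1, 1)
        # Odd kernel width, from 1 (no blur) up to ~max_kernel.
        k = 1 + 2 * int(frac * max_kernel // 2)
        kernel = np.ones(k) / k
        out[r] = np.convolve(img[r], kernel, mode="same")
    return out
```

Rotating the image before and after this operation would change the orientation of the blur gradient relative to the ground plane, the manipulation the abstract identifies as crucial to the scale judgement.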

Publication DOI: https://doi.org/10.1371/journal.pone.0285423
Divisions: College of Health & Life Sciences > School of Optometry > Optometry
College of Health & Life Sciences > School of Optometry > Optometry & Vision Science Research Group (OVSRG)
College of Health & Life Sciences > Clinical and Systems Neuroscience
College of Health & Life Sciences > School of Optometry > Vision, Hearing and Language
College of Health & Life Sciences > School of Optometry > Centre for Vision and Hearing Research
College of Health & Life Sciences
Additional Information: Copyright: © 2023 Meese et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: This work was supported by research grant EP/H000038/1 from the Engineering and Physical Sciences Research Council awarded to Tim Meese and Mark Georgeson. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Data Availability: The raw data, code and stimuli for this study can be found on the OSF repository at: https://doi.org/10.17605/OSF.IO/4K2XP
Uncontrolled Keywords: Cues, Gravitation, Humans, Depth Perception, Visual Perception, Judgment
Publication ISSN: 1932-6203
Last Modified: 02 May 2024 07:21
Date Deposited: 09 May 2023 09:54
Related URLs: https://osf.io/4k2xp/ (Related URL)
https://journal ... al.pone.0285423 (Publisher URL)
http://www.scop ... tnerID=8YFLogxK (Scopus URL)
PURE Output Type: Article
Published Date: 2023-05-08
Published Online Date: 2023-05-08
Accepted Date: 2023-04-21
Submitted Date: 2022-11-14
Authors: Meese, Tim S. (ORCID Profile 0000-0003-3744-4679)
Baker, Daniel H.
Summers, Robert J.

Version: Published Version

License: Creative Commons Attribution
