Read by QxMD

Attention, Perception & Psychophysics

Irina M Harris, Justin A Harris, Michael C Corballis
We tested whether an object's orientation is inherently bound to its identity in a holistic view-based representation at the early stages of visual identification, or whether identity and orientation are represented separately. Observers saw brief and masked stimulus sequences containing two rotated objects. They had to detect if a previously cued object was present in the sequence and report its orientation. In Experiments 1 and 2, the objects were presented sequentially in the same spatial location for 70 ms each, whereas in Experiments 3 and 4 they were presented simultaneously in different spatial locations for 70 ms and 140 ms, respectively...
February 15, 2019: Attention, Perception & Psychophysics
Dennis Reike, Wolf Schwarz
Following the classical work of Moyer and Landauer (1967), experimental studies investigating the way in which humans process and compare symbolic numerical information regularly used one of two experimental designs. In selection tasks, two numbers are presented, and the task of the participant is to select (for example) the larger one. In classification tasks, a single number is presented, and the participant decides if it is smaller or larger than a predefined standard. Many findings obtained with these paradigms fit in well with the notion of a mental analog representation, or an Approximate Number System (ANS; e...
February 13, 2019: Attention, Perception & Psychophysics
Sin Hang Lau, Yaqian Huang, Victor S Ferreira, Edward Vul
The relationships between word frequency and various perceptual features have been used to study the cognitive processes involved in word production and recognition, as well as patterns in language use over time. However, little work has been done comparing spoken and written frequencies against each other, which leaves open the question of whether there are modality-specific relationships between perceptual features and frequency. Words have different frequencies in speech and written texts, with some words occurring disproportionately more often in one modality than the other...
February 13, 2019: Attention, Perception & Psychophysics
Hyunji Kim, Moritz Stolte, Glyn W Humphreys
We report a new "now-bias" effect on simple perceptual matching between shapes and labels and examined the relation between this now-bias effect and the self-bias previously established with this task (Sui, He, & Humphreys, Journal of Experimental Psychology: Human Perception and Performance, 38, 1105-1117, 2012). The perceptual biases favoring present-relevant and self-relevant information were correlated with each other, suggesting a common underlying mechanism. Nevertheless, temporal biases in decision making, specifically in temporal discounting, correlated with the perceptual self-bias but not with the perceptual now-bias...
February 13, 2019: Attention, Perception & Psychophysics
Casey L Roark, Lori L Holt
Human category learning appears to be supported by dual learning systems. Previous research indicates the engagement of distinct neural systems in learning categories that require selective attention to dimensions versus those that require integration across dimensions. This evidence has largely come from studies of learning across perceptually separable visual dimensions, but recent research has applied dual system models to understanding auditory and speech categorization. Since differential engagement of the dual learning systems is closely related to selective attention to input dimensions, it may be important that acoustic dimensions are quite often perceptually integral and difficult to attend to selectively...
February 13, 2019: Attention, Perception & Psychophysics
Emily M Crowe, Christina J Howard, Angela S Attwood, Christopher Kent
In standard multiple object tracking (MOT) tasks the relative importance of the targets being tracked is equal. This is atypical of everyday situations in which an individual may need to prioritize one target relative to another and so allocate attention unequally. We report three experiments that examined whether participants could unequally split attention using a modified MOT task in which target priority was manipulated. Specifically, we examined the effect of priority on participants' magnitude of error and used a distribution mixture analysis to investigate how priority affected both participants' probability of losing an item and tracking precision...
February 13, 2019: Attention, Perception & Psychophysics
Aiping Xiong, Robert W Proctor, Howard N Zelaznik
Three experiments used compatible and incompatible mappings of images of eating utensils to test the hypothesis that these images activate affordances for grasping with the corresponding hand when the required response is a key-press. In Experiment 1, stimuli were photographs of a plastic spoon oriented on the horizontal axis, with the handle location varying randomly between left and right. Participants were instructed to respond to the handle or the tip, with a compatible mapping in one trial block and an incompatible mapping in another...
February 13, 2019: Attention, Perception & Psychophysics
William J Harrison, Reuben Rideaux
The extent to which visual inference is shaped by attentional goals is unclear. Voluntary attention may simply modulate the priority with which information is accessed by the higher cognitive functions involved in perceptual decision making. Alternatively, voluntary attention may influence fundamental visual processes, such as those involved in segmenting an incoming retinal signal into a structured scene of coherent objects, thereby determining perceptual organization. Here we tested whether the segmentation and integration of visual form can be determined by an observer's goals, by exploiting a novel variant of the classical Kanizsa figure...
February 13, 2019: Attention, Perception & Psychophysics
Christopher C Heffner, William J Idsardi, Rochelle S Newman
Phonetic categories must be learned, but the processes that allow that learning to unfold are still under debate. The current study investigates constraints on the structure of categories that can be learned and whether these constraints are speech-specific. Category structure constraints are a key difference between theories of category learning, which can roughly be divided into instance-based learning (i.e., exemplar only) and abstractionist learning (i.e., at least partly rule-based or prototype-based) theories...
February 13, 2019: Attention, Perception & Psychophysics
Eleni Vlahou, Aaron R Seitz, Norbert Kopčo
Speech intelligibility is adversely affected by reverberation, particularly when listening to a foreign language. However, little is known about how phonetic learning is affected by room acoustics. This study investigated how room reverberation impacts the acquisition of novel phonetic categories during implicit training in virtual environments. Listeners were trained to distinguish a difficult nonnative dental-retroflex contrast in phonemes presented either in a fixed room (anechoic or reverberant) or in multiple anechoic and reverberant spaces typical of everyday listening...
February 8, 2019: Attention, Perception & Psychophysics
Sung-Joo Lim, Barbara G Shinn-Cunningham, Tyler K Perrachione
Speech processing is slower and less accurate when listeners encounter speech from multiple talkers compared to one continuous talker. However, interference from multiple talkers has been investigated only using immediate speech recognition or long-term memory recognition tasks. These tasks reveal opposite effects of speech processing time on speech recognition: while fast processing of multi-talker speech impedes immediate recognition, it also results in more abstract and less talker-specific long-term memories for speech...
February 8, 2019: Attention, Perception & Psychophysics
Claudia Bonmassar, Francesco Pavani, Wieske van Zoest
Gaze and arrow cues cause covert attention shifts even when they are uninformative. Nonetheless, it is unclear to what extent oculomotor behavior influences manual responses to social and nonsocial stimuli. In two experiments, we tracked the gaze of participants during the cueing task with nonpredictive gaze and arrow cues. In Experiment 1, the discrimination task was easy and eye movements were not necessary, whereas in Experiment 2 they were instrumental in identifying the target. Validity effects on manual response time (RT) were similar for the two cues in Experiment 1 and in Experiment 2, though in the presence of eye movements observers were overall slower to respond to the arrow cue compared with the gaze cue...
February 8, 2019: Attention, Perception & Psychophysics
Hanshin Kim, Bo Youn Park, Yang Seok Cho
Previous studies have demonstrated that attentional capture occurs based on attentional control settings. These settings specify what features are selected for processing as well as what features are filtered out. To examine how attentional control settings are flexibly constructed when target and/or distractor features are uncertain, the current paper presents four experiments in which the numbers of target and distractor features were manipulated. The results showed that attentional control settings were configured in terms of a fixed feature when either the target or the distractor feature was uncertain and the other was fixed over trials...
February 7, 2019: Attention, Perception & Psychophysics
Jonathan M Frazier, Ashley A Assgari, Christian E Stilp
Auditory perception is shaped by spectral properties of surrounding sounds. For example, when spectral properties differ between earlier (context) and later (target) sounds, this can produce spectral contrast effects (SCEs; i.e., categorization boundary shifts) that bias perception of later sounds. SCEs affect perception of speech and nonspeech sounds alike (Stilp, Alexander, Kiefte, & Kluender in Attention, Perception, & Psychophysics, 72(2), 470-480, 2010). When categorizing speech sounds, SCE magnitudes increased linearly with greater spectral differences between contexts and target sounds (Stilp, Anderson, & Winn in Journal of the Acoustical Society of America, 137(6), 3466-3476, 2015; Stilp & Alexander in Proceedings of Meetings on Acoustics, 26, 2016; Stilp & Assgari in Journal of the Acoustical Society of America, 141(2), EL153-EL158, 2017)...
February 6, 2019: Attention, Perception & Psychophysics
Ru Qi Yu, Yu Luo, Daniel Osherson, Jiaying Zhao
A challenge for the visual system is to detect regularities from multiple dimensions of the environment. Here we examine how regularities in multiple feature dimensions are distinguished from randomness. Participants viewed a matrix containing a structured half and a random half, and judged whether the boundary between the two halves was horizontal or vertical. In Experiments 1 and 2, the cells in the matrix varied independently in the color dimension (red or blue), the shape dimension (circle or square), or both...
February 6, 2019: Attention, Perception & Psychophysics
Steven Greenberg, Thomas U Christiansen
Over a long and distinguished career, Randy Diehl has elucidated the brain mechanisms underlying spoken language processing. The present study touches on two of Randy's central interests, phonetic features and Bayesian statistics. How does the brain go from sound to meaning? Traditional approaches to the study of speech intelligibility and word recognition are unlikely to provide a definitive answer. A finer-grained, Bayesian-inspired approach may help. In this study, listeners identified 11 Danish consonants spoken in a Consonant + Vowel + [l] environment...
January 31, 2019: Attention, Perception & Psychophysics
Mick Zeljko, Ada Kritikos, Philip M Grove
We tested the sensory versus decisional origins of two established audiovisual crossmodal correspondences (CMCs; lightness/pitch and elevation/pitch), applying a signal discrimination paradigm to low-level stimulus features and controlling for attentional cueing. An audiovisual stimulus randomly varied along two visual dimensions (lightness: black/white; elevation: high/low) and one auditory dimension (pitch: high/low), and participants discriminated either only lightness, only elevation, or both lightness and elevation...
January 29, 2019: Attention, Perception & Psychophysics
Brian A Anderson, Haena Kim
Reward history, physical salience, and task relevance all influence the degree to which a stimulus competes for attention, reflecting value-driven, stimulus-driven, and goal-contingent attentional capture, respectively. Theories of value-driven attention have likened reward cues to physically salient stimuli, positing that reward cues are preferentially processed in early visual areas as a result of value-modulated plasticity in the visual system. Such theories predict a strong coupling between value-driven and stimulus-driven attentional capture across individuals...
January 29, 2019: Attention, Perception & Psychophysics
Oscar Kovacs, Irina M Harris
Location appears to play a vital role in binding discretely processed visual features into coherent objects. Consequently, it has been proposed that objects are represented for cognition by their spatiotemporal location, with other visual features attached to this location index. On this theory, the visual features of an object are only connected via mutual location; direct binding cannot occur. Despite supporting evidence, some argue that direct binding does occur, depending on task demands and when representing familiar objects...
January 28, 2019: Attention, Perception & Psychophysics
Yoko Higuchi, Satoshi Inoue, Terumasa Endo, Takatsune Kumada
Motion is an important factor in visual information processing. Studies have shown that global optic flow guides attention, but it remains unclear whether this attentional guidance occurs regardless of top-down attentional control settings for another endogenous cue. To address this issue, we developed a visual search paradigm in which a task-irrelevant optic flow starts and stops prior to a visual search task itself. Participants first observed an initial optic flow motion pattern for a brief period; next, they searched a static display for a target amongst multiple distractors...
January 25, 2019: Attention, Perception & Psychophysics

Search Tips

Use Boolean operators: AND/OR

diabetic AND foot
diabetes OR diabetic

Exclude a word using the 'minus' sign

Virchow -triad

Use Parentheses

water AND (cup OR glass)

Add an asterisk (*) at the end of a word to include word stems

Neuro* will search for Neurology, Neuroscientist, Neurological, and so on

Use quotes to search for an exact phrase

"primary prevention of cancer"

Combine operators to build complex searches

(heart OR cardiac OR cardio*) AND arrest -"American Heart Association"
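The tips above can be sketched as a small query evaluator. This is an illustrative sketch only, not QxMD's actual search engine: it handles flat queries (no parentheses) with AND/OR operators, a leading minus for exclusion, a trailing asterisk for word stems, and double quotes for exact phrases, with AND binding tighter than OR.

```python
import re

def matches(query: str, text: str) -> bool:
    """Evaluate a flat Boolean search query against a piece of text.

    Supports AND/OR, '-term' exclusion, 'stem*' prefixes, and
    "quoted phrases". Illustrative only; real search engines also
    handle parentheses, ranking, and field tags.
    """
    low = text.lower()
    words = set(re.findall(r"\w+", low))

    def hit(term: str) -> bool:
        if term.startswith('"') and term.endswith('"'):
            return term.strip('"').lower() in low            # exact phrase
        if term.endswith('*'):                               # word stem
            return any(w.startswith(term[:-1].lower()) for w in words)
        return term.lower() in words                         # whole word

    # Tokenize, keeping quoted phrases as single tokens
    tokens = re.findall(r'"[^"]*"|\S+', query)

    # Any excluded term present => no match
    if any(hit(t[1:]) for t in tokens if t.startswith('-')):
        return False

    # AND binds tighter than OR: split on OR into groups of AND-ed terms
    groups, current = [], []
    for t in tokens:
        if t.startswith('-'):
            continue
        if t.upper() == 'OR':
            groups.append(current)
            current = []
        elif t.upper() != 'AND':
            current.append(t)
    groups.append(current)

    # Match if every term in at least one OR-group is present
    return any(g and all(hit(t) for t in g) for g in groups)
```

For example, `matches("diabetic AND foot", "diabetic foot ulcers")` is true, while `matches("Virchow -triad", "the Virchow triad")` is false because the excluded word appears.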