https://read.qxmd.com/read/36731523/assessing-the-effects-of-exercise-cognitive-demand-and-rest-on-audiovisual-multisensory-processing-in-older%C3%A2-adults-a-pilot-study
#21
JOURNAL ARTICLE
Aysha Basharat, Michael Barnett-Cowan
A single bout of aerobic exercise is related to positive changes in higher-order cognitive function among older adults; however, the impact of aerobic exercise on multisensory processing remains unclear. Here we assessed the effects of a single bout of aerobic exercise on commonly utilized tasks that measure audiovisual multisensory processing: response time (RT), simultaneity judgements (SJ), and temporal-order judgements (TOJ), in a pilot study. To our knowledge this is the first effort to investigate the effects of three well-controlled intervention conditions on multisensory processing: resting, completing a cognitively demanding task, and performing aerobic exercise for 20 minutes...
January 24, 2023: Multisensory Research
https://read.qxmd.com/read/36731524/neural-correlates-of-audiovisual-speech-processing-in-autistic-and-non-autistic-youth
#22
JOURNAL ARTICLE
Kacie Dunham, Alisa Zoltowski, Jacob I Feldman, Samona Davis, Baxter Rogers, Michelle D Failla, Mark T Wallace, Carissa J Cascio, Tiffany G Woynaroski
Autistic youth demonstrate differences in processing multisensory information, particularly in temporal processing of multisensory speech. Extensive research has identified several key brain regions for multisensory speech processing in non-autistic adults, including the superior temporal sulcus (STS) and insula, but it is unclear to what extent these regions are involved in temporal processing of multisensory speech in autistic youth. As a first step in exploring the neural substrates of multisensory temporal processing in this clinical population, we employed functional magnetic resonance imaging (fMRI) with a simultaneity-judgment audiovisual speech task...
January 19, 2023: Multisensory Research
https://read.qxmd.com/read/36731525/audio-visual-interference-during-motion-discrimination%C3%A2-in-starlings
#23
JOURNAL ARTICLE
Gesa Feenders, Georg M Klump
Motion discrimination is essential for animals to avoid collisions, to escape from predators, to catch prey or to communicate. Although most terrestrial vertebrates can benefit by combining concurrent stimuli from sound and vision to obtain a most salient percept of the moving object, there is little research on the mechanisms involved in such cross-modal motion discrimination. We used European starlings as a model with a well-studied visual and auditory system. In a behavioural motion discrimination task with visual and acoustic stimuli, we investigated the effects of cross-modal interference and attentional processes...
January 17, 2023: Multisensory Research
https://read.qxmd.com/read/36731526/can-we-train-multisensory-integration-in-adults-a%C3%A2-systematic-review
#24
REVIEW
Jessica O'Brien, Amy Mason, Jason Chan, Annalisa Setti
The ability to efficiently combine information from different senses is an important perceptual process that underpins much of our daily activities. This process, known as multisensory integration, varies from individual to individual, and is affected by the ageing process, with impaired processing associated with age-related conditions, including balance difficulties, mild cognitive impairment and cognitive decline. Impaired multisensory perception has also been associated with a range of neurodevelopmental conditions, where novel intervention approaches are actively sought, for example dyslexia and autism...
January 13, 2023: Multisensory Research
https://read.qxmd.com/read/36731528/the-impact-of-singing-on-visual-and-multisensory-speech-perception-in-children-on-the-autism-spectrum
#25
JOURNAL ARTICLE
Jacob I Feldman, Alexander Tu, Julie G Conrad, Wayne Kuang, Pooja Santapuram, Tiffany G Woynaroski
Autistic children show reduced multisensory integration of audiovisual speech stimuli in response to the McGurk illusion. Previously, it has been shown that adults can integrate sung McGurk tokens. These sung speech tokens offer more salient visual and auditory cues than spoken tokens, which may increase the identification and integration of visual speech cues in autistic children. Forty participants (20 autistic, 20 non-autistic peers) aged 7-14 completed the study. Participants were presented with speech tokens in four modalities: auditory-only, visual-only, congruent audiovisual, and incongruent audiovisual (i...
December 30, 2022: Multisensory Research
https://read.qxmd.com/read/36731527/-tasting-imagination-what-role-chemosensory-mental-imagery-in-multisensory-flavour-perception
#26
REVIEW
Charles Spence
A number of perplexing phenomena in the area of olfactory/flavour perception may fruitfully be explained by the suggestion that chemosensory mental imagery can be triggered automatically by perceptual inputs. In particular, the disconnect between the seemingly limited ability of participants in chemosensory psychophysics studies to distinguish more than two or three odorants in mixtures and the rich and detailed flavour descriptions that are sometimes reported by wine experts; the absence of awareness of chemosensory loss in many elderly individuals; and the insensitivity of the odour-induced taste enhancement (OITE) effect to the mode of presentation of olfactory stimuli (i...
December 30, 2022: Multisensory Research
https://read.qxmd.com/read/36731529/crossmodal-texture-perception-is-illumination-dependent
#27
JOURNAL ARTICLE
Karina Kangur, Martin Giesel, Julie M Harris, Constanze Hesse
Visually perceived roughness of 3D textures varies with illumination direction. Surfaces appear rougher when the illumination angle is lowered, resulting in a lack of roughness constancy. Here we aimed to investigate whether the visual system also relies on illumination-dependent features when judging roughness in a crossmodal matching task, or whether it can access illumination-invariant surface features that can also be evaluated by the tactile system. Participants (N = 32) explored an abrasive paper of medium physical roughness either tactually, or visually under two different illumination conditions (top vs oblique angle)...
December 28, 2022: Multisensory Research
https://read.qxmd.com/read/36731530/body-pitch-together-with-translational-body-motion-biases-the-subjective-haptic-vertical
#28
JOURNAL ARTICLE
Chia-Huei Tseng, Hiu Mei Chow, Lothar Spillmann, Matt Oxner, Kenzo Sakurai
Accurate perception of verticality is critical for postural maintenance and successful physical interaction with the world. Although previous research has examined the independent influences of body orientation and self-motion under well-controlled laboratory conditions, these factors are constantly changing and interacting in the real world. In this study, we examine the subjective haptic vertical in a real-world scenario. Here, we report a bias of verticality perception in a field experiment on the Hong Kong Peak Tram as participants traveled on a slope ranging from 6° to 26°...
December 20, 2022: Multisensory Research
https://read.qxmd.com/read/36731531/dynamic-weighting-of-time-varying-visual-and-auditory-evidence-during-multisensory-decision-making
#29
JOURNAL ARTICLE
Rosanne R M Tuip, Wessel van der Ham, Jeannette A M Lorteije, Filip Van Opstal
Perceptual decision-making in a dynamic environment requires two integration processes: integration of sensory evidence from multiple modalities to form a coherent representation of the environment, and integration of evidence across time to accurately make a decision. Only recently have studies begun to unravel how evidence from two modalities is accumulated across time to form a perceptual decision. One important question is whether information from the individual senses contributes equally to multisensory decisions...
December 1, 2022: Multisensory Research
https://read.qxmd.com/read/36731532/prior-exposure-to-dynamic-visual-displays-reduces-vection-onset-latency
#30
JOURNAL ARTICLE
Jing Ni, Hiroyuki Ito, Masaki Ogawa, Shoji Sunaga, Stephen Palmisano
While compelling illusions of self-motion (vection) can be induced purely by visual motion, they are rarely experienced immediately. This vection onset latency is thought to represent the time required to resolve sensory conflicts between the stationary observer's visual and nonvisual information about self-motion. In this study, we investigated whether manipulations designed to increase the weightings assigned to vision (compared to the nonvisual senses) might reduce vection onset latency. We presented two different types of visual priming displays directly before our main vection-inducing displays: (1) 'random motion' priming displays - designed to pre-activate general, as opposed to self-motion-specific, visual motion processing systems; and (2) 'dynamic no-motion' priming displays - designed to stimulate vision, but not generate conscious motion perceptions...
November 16, 2022: Multisensory Research
https://read.qxmd.com/read/36731533/can-the-perceived-timing-of-multisensory-events-predict-cybersickness
#31
JOURNAL ARTICLE
Ogai Sadiq, Michael Barnett-Cowan
Humans are constantly presented with rich sensory information that the central nervous system (CNS) must process to form a coherent perception of the self and its relation to its surroundings. While the CNS is efficient at processing multisensory information in natural environments, virtual reality (VR) poses temporal discrepancies that the CNS must resolve. These temporal discrepancies between information from different sensory modalities lead to inconsistencies in perception of the virtual environment, which often cause cybersickness...
October 24, 2022: Multisensory Research
https://read.qxmd.com/read/36084933/relating-sound-and-sight-in-simulated-environments
#32
JOURNAL ARTICLE
Kevin Y Tsang, Damien J Mannion
The auditory signals at the ear can be affected by components arriving both directly from a sound source and indirectly via environmental reverberation. Previous studies have suggested that the perceptual separation of these contributions can be aided by expectations of likely reverberant qualities. Here, we investigated whether vision can provide information about the auditory properties of physical locations that could also be used to develop such expectations. We presented participants with audiovisual stimuli derived from 10 simulated real-world locations via a head-mounted display (HMD; n = 44) or a web-based (n = 60) delivery method...
September 8, 2022: Multisensory Research
https://read.qxmd.com/read/36057431/something-in-the-sway-effects-of-the-shepard-risset-glissando-on-postural-activity-and-vection
#33
JOURNAL ARTICLE
Rebecca A Mursic, Stephen Palmisano
This study investigated claims of disrupted equilibrium when listening to the Shepard-Risset glissando (which creates an auditory illusion of perpetually ascending/descending pitch). During each trial, 23 participants stood quietly on a force plate for 90 s with their eyes either open or closed (30 s pre-sound, 30 s of sound and 30 s post-sound). Their centre of foot pressure (CoP) was continuously recorded during the trial and a verbal measure of illusory self-motion (i.e., vection) was obtained directly afterwards...
September 1, 2022: Multisensory Research
https://read.qxmd.com/read/35998899/odor-induced-taste-enhancement-is-specific-to-naturally-occurring-temporal-order-and-the-respiration-phase
#34
JOURNAL ARTICLE
Shogo Amano, Takuji Narumi, Tatsu Kobayakawa, Masayoshi Kobayashi, Masahiko Tamura, Yuko Kusakabe, Yuji Wada
Interaction between odor and taste information creates flavor perception. There are many possible determinants of the interaction between odor and taste, one of which may be the somatic sensations associated with breathing. We assumed that a smell stimulus accompanied by inhaling or exhaling enhances taste intensity if the order is congruent with natural drinking. To present an olfactory stimulus from the identical location during inhalation and exhalation, we blocked the gap between the tube presenting the olfactory stimulus and the nostril...
August 23, 2022: Multisensory Research
https://read.qxmd.com/read/35985650/exploring-group-differences-in-the-crossmodal-correspondences
#35
REVIEW
Charles Spence
There has been a rapid growth of interest amongst researchers in the cross-modal correspondences in recent years. In part, this has resulted from the emerging realization of the important role that the correspondences can sometimes play in multisensory integration. In turn, this has led to an interest in the nature of any differences between individuals, or rather, between groups of individuals, in the strength and/or consensuality of cross-modal correspondences that may be observed in both neurotypically normal groups cross-culturally, developmentally, and across various special populations (including those who have lost a sense, as well as those with autistic tendencies)...
August 9, 2022: Multisensory Research
https://read.qxmd.com/read/35985651/size-and-quality-of-drawings-made-by-adults-under-visual-and-haptic-control
#36
JOURNAL ARTICLE
Magdalena Szubielska, Paweł Augustynowicz, Delphine Picard
The aim of this study was twofold. First, our objective was to test the influence of an object's actual size (size rank) on the drawn size of the depicted object. We tested the canonical size effect (i.e., the tendency to draw objects that are larger in the physical world as larger) in four drawing conditions: two perceptual conditions (blindfolded or sighted) crossed with two materials (paper or special foil for producing embossed drawings). Second, we investigated whether drawing quality (we analysed both local and global criteria of quality) depends on drawing conditions...
July 1, 2022: Multisensory Research
https://read.qxmd.com/read/35985652/crossmodal-correspondence-between-music-and-ambient-color-is-mediated-by-emotion
#37
JOURNAL ARTICLE
Pia Hauck, Christoph von Castell, Heiko Hecht
The quality of a concert hall primarily depends on its acoustics. But does visual input also have an impact on musical enjoyment? Does the color of ambient lighting modulate the perceived music quality? And are certain colors perceived to fit better than others with a given music piece? To address these questions, we performed three within-subjects experiments. We carried out two pretests to select four music pieces differing in tonality and genre, and 14 lighting conditions of varying hue, brightness, and saturation...
June 8, 2022: Multisensory Research
https://read.qxmd.com/read/35985653/investigating-the-crossmodal-influence-of-odour-on-the-visual-perception-of-facial-attractiveness-and-age
#38
JOURNAL ARTICLE
Yi-Chuan Chen, Charles Spence
We report two experiments designed to investigate whether the presentation of a range of pleasant fragrances, containing both floral and fruity notes, would modulate people's judgements of the facial attractiveness (Experiment 1) and age (Experiment 2) of a selection of typical female faces varying in age in the range 20-69 years. In Experiment 1, male participants rated the female faces as less attractive when presented with an unpleasant fragrance compared to clean air. The rated attractiveness of the female faces was lower when the participants rated the unpleasant odour as having a lower attractiveness and pleasantness, and a higher intensity...
May 31, 2022: Multisensory Research
https://read.qxmd.com/read/35523736/do-congruent-auditory-stimuli-facilitate-visual-search-in-dynamic-environments-an-experimental-study-based-on-multisensory-interaction
#39
JOURNAL ARTICLE
Xiaofang Sun, Pin-Hsuan Chen, Pei-Luen Patrick Rau
The purpose of this study was to investigate the cue congruency effect of auditory stimuli during visual search in dynamic environments. Twenty-eight participants were recruited to conduct a visual search experiment. The experiment applied auditory stimuli to understand whether they could facilitate visual search in different types of background. Additionally, target location and target orientation were manipulated to clarify their influences on visual search. Target location was related to horizontal visual search and target orientation was associated with visual search for an inverted target...
May 6, 2022: Multisensory Research
https://read.qxmd.com/read/35985654/multisensory-perception-and-learning-linking-pedagogy-psychophysics-and-human-computer-interaction
#40
REVIEW
Monica Gori, Sara Price, Fiona N Newell, Nadia Berthouze, Gualtiero Volpe
In this review, we discuss how specific sensory channels can mediate the learning of properties of the environment. In recent years, schools have increasingly used multisensory technology for teaching; however, this practice is not yet sufficiently grounded in neuroscientific and pedagogical evidence. Researchers have recently renewed understanding of the role of communication between sensory modalities during development. In the current review, we outline four principles, based on theoretical models of multisensory development and embodiment, that will aid technological development to foster in-depth perceptual and conceptual learning of mathematics...
April 19, 2022: Multisensory Research