journal
https://read.qxmd.com/read/38118461/spatial-sensory-references-for-vestibular-self-motion-perception
#1
JOURNAL ARTICLE
Silvia Zanchi, Luigi F Cuturi, Giulio Sandini, Monica Gori, Elisa R Ferrè
While navigating through the surroundings, we constantly rely on inertial vestibular signals for self-motion along with visual and acoustic spatial references from the environment. However, the interaction between inertial cues and environmental spatial references is not yet fully understood. Here we investigated whether vestibular self-motion sensitivity is influenced by sensory spatial references. Healthy participants were administered a Vestibular Self-Motion Detection Task in which they were asked to detect vestibular self-motion sensations induced by low-intensity Galvanic Vestibular Stimulation...
December 20, 2023: Multisensory Research
https://read.qxmd.com/read/38113917/cross-modal-contributions-to-episodic-memory-for-voices
#2
JOURNAL ARTICLE
Joshua R Tatz, Zehra F Peynircioğlu
Multisensory context often facilitates perception and memory. In fact, encoding items within a multisensory context can improve memory even on strictly unisensory tests (i.e., when the multisensory context is absent). Prior studies that have consistently found these multisensory facilitation effects have largely employed multisensory contexts in which the stimuli were meaningfully related to the items targeted for remembering (e.g., pairing canonical sounds and images). Other studies have used unrelated stimuli as multisensory context...
December 20, 2023: Multisensory Research
https://read.qxmd.com/read/38018137/stationary-haptic-stimuli-do-not-produce-ocular-accommodation-in-most-individuals
#3
JOURNAL ARTICLE
Lawrence R Stark, Kim Shiraishi, Tyler Sommerfeld
This study aimed to determine the extent to which haptic stimuli can influence ocular accommodation, either alone or in combination with vision. Accommodation was measured objectively in 15 young adults as they read stationary targets containing Braille letters. These cards were presented at four distances in the range 20-50 cm. In the Touch condition, the participant read by touch with their dominant hand in a dark room. Afterward, they estimated card distance with their non-dominant hand. In the Vision condition, they read by sight binocularly without touch in a lighted room...
November 28, 2023: Multisensory Research
https://read.qxmd.com/read/37963487/reflections-on-cross-modal-correspondences-current-understanding-and-issues-for-future-research
#4
REVIEW
Kosuke Motoki, Lawrence E Marks, Carlos Velasco
The past two decades have seen an explosion of research on cross-modal correspondences. Broadly speaking, this term has been used to encompass associations between and among features, dimensions, or attributes across the senses. There has been an increasing interest in this topic amongst researchers from multiple fields (psychology, neuroscience, music, painting, environmental design, etc.) and, importantly, an increasing breadth of the topic's scope. Here, this narrative review aims to reflect on what cross-modal correspondences are, where they come from, and what underlies them...
November 10, 2023: Multisensory Research
https://read.qxmd.com/read/37907070/joint-contributions-of-auditory-proprioceptive-and-visual-cues-on-human-balance
#5
JOURNAL ARTICLE
Max Teaford, Zachary J Mularczyk, Alannah Gernon, Shauntelle Cannon, Megan Kobel, Daniel M Merfeld
One's ability to maintain their center of mass within their base of support (i.e., balance) is believed to be the result of multisensory integration. Much of the research in this literature has focused on integration of visual, vestibular, and proprioceptive cues. However, several recent studies have found evidence that auditory cues can impact balance control metrics. In the present study, we sought to better characterize the impact of auditory cues on narrow stance balance task performance with different combinations of visual stimuli (virtual and real world) and support surfaces (firm and compliant)...
October 27, 2023: Multisensory Research
https://read.qxmd.com/read/37907066/beyond-the-eye-multisensory-contributions-to-the-sensation-of-illusory-self-motion-vection
#6
REVIEW
Bernhard E Riecke, Brandy Murovec, Jennifer L Campos, Behrang Keshavarz
Vection is typically defined as the embodied illusion of self-motion in the absence of real physical movement through space. Vection can occur in real-life situations (e.g., the 'train illusion') and in virtual environments and simulators. The vast majority of vection research focuses on vection caused by visual stimulation. Even though visually induced vection is arguably the most compelling type of vection, the role of nonvisual sensory inputs, such as auditory, biomechanical, tactile, and vestibular cues, has recently gained more attention...
October 27, 2023: Multisensory Research
https://read.qxmd.com/read/37903498/investigating-the-role-of-leading-sensory-modality-and-autistic-traits-in-the-visual-tactile-temporal-binding-window
#7
JOURNAL ARTICLE
Michelle K Huntley, An Nguyen, Matthew A Albrecht, Welber Marinovic
Our ability to integrate multisensory information depends on processes occurring during the temporal binding window. There is limited research investigating the temporal binding window for visual-tactile integration and its relationship with autistic traits, sensory sensitivity, and unusual sensory experiences. We measured the temporal binding window for visual-tactile integration in 27 neurotypical participants who completed a simultaneity judgement task and three questionnaires: the Autism Quotient, the Glasgow Sensory Questionnaire, and the Multi-Modality Unusual Sensory Experiences Questionnaire...
October 18, 2023: Multisensory Research
https://read.qxmd.com/read/37903493/motor-signals-mediate-stationarity-perception
#8
JOURNAL ARTICLE
Savannah Halow, James Liu, Eelke Folmer, Paul R MacNeilage
Head movement relative to the stationary environment gives rise to congruent vestibular and visual optic-flow signals. The resulting perception of a stationary visual environment, referred to herein as stationarity perception, depends on mechanisms that compare visual and vestibular signals to evaluate their congruence. Here we investigate the functioning of these mechanisms and their dependence on fixation behavior as well as on the active versus passive nature of the head movement. Stationarity perception was measured by modifying the gain on visual motion relative to head movement on individual trials and asking subjects to report whether the gain was too low or too high...
October 13, 2023: Multisensory Research
https://read.qxmd.com/read/37775097/subjective-audibility-modulates-the-susceptibility-to-sound-induced-flash-illusion-effect-of-loudness-and-auditory-masking
#9
JOURNAL ARTICLE
Yuki Ito, Hanaka Matsumoto, Kohta I Kobayasi
When a brief flash is presented along with two brief sounds, the single flash is often perceived as two flashes. This phenomenon is called a sound-induced flash illusion, in which the auditory sense, with its relatively higher reliability in providing temporal information, modifies the visual perception. Decline of audibility due to hearing impairment is known to make subjects less susceptible to the flash illusion. However, the effect of decline of audibility on susceptibility to the illusion has not been directly investigated in subjects with normal hearing...
September 29, 2023: Multisensory Research
https://read.qxmd.com/read/37758236/from-the-outside-in-asmr-is-characterised-by-reduced-interoceptive-accuracy-but-higher-sensation-seeking
#10
JOURNAL ARTICLE
Giulia L Poerio, Fatimah Osman, Jennifer Todd, Jasmeen Kaur, Lovell Jones, Flavia Cardini
Autonomous Sensory Meridian Response (ASMR) is a complex sensory-perceptual phenomenon characterised by relaxing and pleasurable scalp-tingling sensations. The ASMR trait is nonuniversal, is thought to have developmental origins, and has a prevalence rate of 20%. Previous theory and research suggest that trait ASMR may be underpinned by atypical multisensory perception from both interoceptive and exteroceptive modalities. In this study, we examined whether ASMR responders differed from nonresponders in interoceptive accuracy and multisensory processing style...
September 27, 2023: Multisensory Research
https://read.qxmd.com/read/37734735/exploring-crossmodal-associations-between-sound-and-the-chemical-senses-a-systematic-review-including-interactive-visualizations
#11
REVIEW
Brayan Rodríguez, Luis H Reyes, Felipe Reinoso-Carvalho
This is the first systematic review to focus on the influence of product-intrinsic and extrinsic sounds on the chemical senses, involving both food and aroma stimuli. The review pays particular attention to the methodological details (stimuli, experimental design, dependent variables, and data-analysis techniques) of 95 experiments, published in 83 publications from 2012 to 2023. A total of 329 distinct crossmodal auditory-chemosensory associations were uncovered in this analysis. What is more, instead of relying solely on static figures and tables, we created a first-of-its-kind comprehensive Power BI dashboard (Microsoft's interactive data-visualization tool) on methodologies and significant findings, incorporating various filters and visualizations that allow readers to explore statistics for specific subsets of experiments...
September 21, 2023: Multisensory Research
https://read.qxmd.com/read/37673794/the-audiovisual-mismatch-negativity-in-predictive-and-non-predictive-speech-stimuli-in-older-adults-with-and-without-hearing-loss
#12
JOURNAL ARTICLE
Melissa Randazzo, Paul J Smith, Ryan Priefer, Deborah R Senzer, Karen Froud
Adults with aging-related hearing loss (ARHL) experience adaptive neural changes to optimize their sensory experiences; for example, enhanced audiovisual (AV) and predictive processing during speech perception. The mismatch negativity (MMN) event-related potential is an index of central auditory processing; however, it has not been explored as an index of AV and predictive processing in adults with ARHL. In a pilot study we examined the AV MMN in two conditions of a passive oddball paradigm - one AV condition in which the visual aspect of the stimulus can predict the auditory percept and one AV control condition in which the visual aspect of the stimulus cannot predict the auditory percept...
September 6, 2023: Multisensory Research
https://read.qxmd.com/read/37582512/synergistic-combination-of-visual-features-in-vision-taste-crossmodal-correspondences
#13
REVIEW
Byron P Lee, Charles Spence
There has been rapid recent growth in academic attempts to summarise, understand, and predict the taste profiles that match complex images incorporating multiple visual design features. While there is now ample research documenting the patterns of vision-taste correspondences involving individual visual features (such as colour and shape curvilinearity in isolation), little is known about the taste associations that may be primed when multiple visual features are presented simultaneously. This narrative historical review therefore presents an overview of the research that has examined, or provided insights into, the interaction of graphic elements in taste correspondences involving colour, shape attributes, texture, and other visual features...
August 14, 2023: Multisensory Research
https://read.qxmd.com/read/37582513/motion-binding-property-contributes-to-accurate-temporal-order-perception-in-audiovisual-synchrony
#14
JOURNAL ARTICLE
Jinhwan Kwon, Yoshihiro Miyake
Temporal perception in multisensory processing is important for an accurate and efficient understanding of the physical world. In general, it is executed in a dynamic environment in our daily lives. In particular, the motion-binding property is important for correctly identifying moving objects in the external environment. However, how this property affects multisensory temporal perception remains unclear. We investigate whether the motion-binding property influences audiovisual temporal integration. The study subjects performed four types of temporal-order judgment (TOJ) task experiments using three types of perception...
August 3, 2023: Multisensory Research
https://read.qxmd.com/read/37582516/visuo-tactile-congruence-leads-to-stronger-illusion-than-visuo-proprioceptive-congruence-a-quantitative-and-qualitative-approach-to-explore-the-rubber-hand-illusion
#15
JOURNAL ARTICLE
Roxane L Bartoletti, Ambre Denis-Noël, Séraphin Boulvert, Marie Lopez, Sylvane Faure, Xavier Corveleyn
The Rubber Hand Illusion (RHI) arises through multisensory congruence and informative cues from the most relevant sensory channels. Some studies have explored the RHI phenomenon on the fingers, but none of them modulated the congruence of visuo-tactile and visuo-proprioceptive information by changing the posture of the fingers. This study hypothesizes that RHI induction is possible despite a partial visuo-proprioceptive or visuo-tactile incongruence. With quantitative and qualitative measures, we observed that gradual induction of the sense of body ownership depends on the congruence of multisensory information, with an emphasis on visuo-tactile information rather than visuo-proprioceptive signals...
June 20, 2023: Multisensory Research
https://read.qxmd.com/read/37582519/what-makes-the-detection-of-movement-different-within-the-autistic-traits-spectrum-evidence-from-the-audiovisual-depth-paradigm
#16
JOURNAL ARTICLE
Rachel Poulain, Magali Batty, Céline Cappe
Atypical sensory processing is now considered a diagnostic feature of autism. Although multisensory integration (MSI) may have cascading effects on the development of higher-level skills such as socio-communicative functioning, there is a clear lack of understanding of how autistic individuals integrate multiple sensory inputs. Multisensory dynamic information is a more ecological construct than static stimuli, reflecting naturalistic sensory experiences given that our environment involves moving stimulation of more than one sensory modality at a time...
June 6, 2023: Multisensory Research
https://read.qxmd.com/read/37080552/association-between-body-tilt-and-egocentric-estimates-near-upright
#17
JOURNAL ARTICLE
Keisuke Tani, Shintaro Uehara, Satoshi Tanaka
The mechanisms underlying geocentric (orientations of an object or the body relative to 'gravity') and egocentric estimates (object orientation relative to the 'body') have each been examined; however, little is known regarding the association between these estimates, especially when the body is nearly upright. To address this, we conducted two psychophysical experiments. In Experiment 1, participants estimated the direction of a visual line (subjective visual vertical; SVV) and their own body relative to gravity (subjective body tilt; SBT) and the direction of a visual line relative to the body longitudinal axis (subjective visual body axis; SVBA) during a small-range whole-body roll tilt...
April 7, 2023: Multisensory Research
https://read.qxmd.com/read/37080553/explaining-visual-shape-taste-crossmodal-correspondences
#18
REVIEW
Charles Spence
A growing body of experimental research now demonstrates that neurologically normal individuals associate different taste qualities with design features such as curvature, symmetry, orientation, texture and movement. The form of everything from the food itself through to the curvature of the plateware on which it happens to be served, and from glassware to typeface, not to mention the shapes of/on food product packaging, has been shown to influence people's taste expectations and, on occasion, their taste/food experiences...
March 20, 2023: Multisensory Research
https://read.qxmd.com/read/37080554/off-vertical-body-orientation-delays-the-perceived-onset-of-visual-motion
#19
JOURNAL ARTICLE
William Chung, Michael Barnett-Cowan
The integration of vestibular, visual and body cues is a fundamental process in the perception of self-motion and is commonly experienced in an upright posture. However, when the body is tilted in an off-vertical orientation these signals are no longer aligned relative to the influence of gravity. In this study, the perceived timing of visual motion was examined in the presence of sensory conflict introduced by manipulating the orientation of the body, generating a mismatch between body and vestibular cues due to gravity and creating an ambiguous vestibular signal of either head tilt or translation...
March 8, 2023: Multisensory Research
https://read.qxmd.com/read/37080555/metacognition-and-causal-inference-in-audiovisual-speech
#20
JOURNAL ARTICLE
Faith Kimmet, Samantha Pedersen, Victoria Cardenas, Camila Rubiera, Grey Johnson, Addison Sans, Matthew Baldwin, Brian Odegaard
In multisensory environments, our brains perform causal inference to estimate which sources produce specific sensory signals. Decades of research have revealed the dynamics which underlie this process of causal inference for multisensory (audiovisual) signals, including how temporal, spatial, and semantic relationships between stimuli influence the brain's decision about whether to integrate or segregate. However, presently, very little is known about the relationship between metacognition and multisensory integration, and the characteristics of perceptual confidence for audiovisual signals...
February 23, 2023: Multisensory Research