https://read.qxmd.com/read/35985650/exploring-group-differences-in-the-crossmodal-correspondences
#41
REVIEW
Charles Spence
There has been a rapid growth of interest amongst researchers in the cross-modal correspondences in recent years. In part, this has resulted from the emerging realization of the important role that the correspondences can sometimes play in multisensory integration. In turn, this has led to an interest in the nature of any differences between individuals, or rather between groups of individuals, in the strength and/or consensuality of cross-modal correspondences that may be observed in neurotypical groups (cross-culturally and developmentally) as well as across various special populations (including those who have lost a sense, as well as those with autistic tendencies)...
August 9, 2022: Multisensory Research
https://read.qxmd.com/read/35985651/size-and-quality-of-drawings-made-by-adults-under-visual-and-haptic-control
#42
JOURNAL ARTICLE
Magdalena Szubielska, Paweł Augustynowicz, Delphine Picard
The aim of this study was twofold. First, our objective was to test the influence of an object's actual size (size rank) on the drawn size of the depicted object. We tested the canonical size effect (i.e., drawing objects that are larger in the physical world as larger) in four drawing conditions - two perceptual conditions (blindfolded or sighted) crossed with two materials (paper or special foil for producing embossed drawings). Second, we investigated whether drawing quality (we analysed both local and global criteria of quality) depends on drawing conditions...
July 1, 2022: Multisensory Research
https://read.qxmd.com/read/35985652/crossmodal-correspondence-between-music-and-ambient-color-is-mediated-by-emotion
#43
JOURNAL ARTICLE
Pia Hauck, Christoph von Castell, Heiko Hecht
The quality of a concert hall primarily depends on its acoustics. But does visual input also have an impact on musical enjoyment? Does the color of ambient lighting modulate the perceived music quality? And are certain colors perceived to fit better than others with a given music piece? To address these questions, we performed three within-subjects experiments. We carried out two pretests to select four music pieces differing in tonality and genre, and 14 lighting conditions of varying hue, brightness, and saturation...
June 8, 2022: Multisensory Research
https://read.qxmd.com/read/35985653/investigating-the-crossmodal-influence-of-odour-on-the-visual-perception-of-facial-attractiveness-and-age
#44
JOURNAL ARTICLE
Yi-Chuan Chen, Charles Spence
We report two experiments designed to investigate whether the presentation of a range of pleasant fragrances, containing both floral and fruity notes, would modulate people's judgements of the facial attractiveness (Experiment 1) and age (Experiment 2) of a selection of typical female faces varying in age in the range 20-69 years. In Experiment 1, male participants rated the female faces as less attractive when presented with an unpleasant fragrance compared to clean air. The rated attractiveness of the female faces was lower when the participants rated the unpleasant odour as having a lower attractiveness and pleasantness, and a higher intensity...
May 31, 2022: Multisensory Research
https://read.qxmd.com/read/35523736/do-congruent-auditory-stimuli-facilitate-visual-search-in-dynamic-environments-an-experimental-study-based-on-multisensory-interaction
#45
JOURNAL ARTICLE
Xiaofang Sun, Pin-Hsuan Chen, Pei-Luen Patrick Rau
The purpose of this study was to investigate the cue congruency effect of auditory stimuli during visual search in dynamic environments. Twenty-eight participants were recruited to conduct a visual search experiment. The experiment applied auditory stimuli to understand whether they could facilitate visual search in different types of background. Additionally, target location and target orientation were manipulated to clarify their influences on visual search. Target location was related to horizontal visual search and target orientation was associated with visual search for an inverted target...
May 6, 2022: Multisensory Research
https://read.qxmd.com/read/35985654/multisensory-perception-and-learning-linking-pedagogy-psychophysics-and-human-computer-interaction
#46
REVIEW
Monica Gori, Sara Price, Fiona N Newell, Nadia Berthouze, Gualtiero Volpe
In this review, we discuss how specific sensory channels can mediate the learning of properties of the environment. In recent years, schools have increasingly been using multisensory technology for teaching. However, this use of technology is not yet sufficiently grounded in neuroscientific and pedagogical evidence. Recent research has renewed our understanding of the role of communication between sensory modalities during development. In the current review, we outline four principles that will aid technological development based on theoretical models of multisensory development and embodiment to foster in-depth, perceptual, and conceptual learning of mathematics...
April 19, 2022: Multisensory Research
https://read.qxmd.com/read/35393374/evaluating-the-effect-of-semantic-congruency-and-valence-on-multisensory-integration
#47
JOURNAL ARTICLE
Elyse Letts, Aysha Basharat, Michael Barnett-Cowan
Previous studies have found that semantics, the higher-level meaning of stimuli, can impact multisensory integration; however, less is known about the effect of valence, an affective response to stimuli. This study investigated the effects of both semantic congruency and valence of non-speech audiovisual stimuli on multisensory integration via response time (RT) and temporal-order judgement (TOJ) tasks [assessing processing speed (RT), Point of Subjective Simultaneity (PSS), and time window when multisensory stimuli are likely to be perceived as simultaneous (temporal binding window; TBW)]...
April 7, 2022: Multisensory Research
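The PSS and TBW measures named in the abstract above are typically derived from a psychometric function fitted to temporal-order judgement data. As a purely illustrative sketch (not taken from the paper, with made-up SOAs and response proportions), the PSS can be read off as the 50% point of a fitted cumulative Gaussian and a TBW-style width from its slope:

```python
# Illustrative sketch: estimating the Point of Subjective Simultaneity
# (PSS) and a temporal-binding-window-style width (TBW) by fitting a
# cumulative Gaussian to temporal-order judgement (TOJ) data.
# SOAs and response proportions below are invented example values.
import numpy as np
from math import erf, sqrt

def cumulative_gaussian(soa, pss, sigma):
    """P('visual first') as a function of SOA (ms)."""
    return 0.5 * (1 + erf((soa - pss) / (sigma * sqrt(2))))

# Hypothetical SOAs (ms; negative = auditory first) and observed
# proportions of 'visual first' responses.
soas = np.array([-200, -100, -50, 0, 50, 100, 200], dtype=float)
p_visual_first = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.98])

# Simple least-squares grid search, to keep the sketch dependency-light.
pss_grid = np.linspace(-100, 100, 201)
sigma_grid = np.linspace(10, 200, 191)
pss, sigma = min(
    ((p, s) for p in pss_grid for s in sigma_grid),
    key=lambda ps: sum((cumulative_gaussian(x, *ps) - y) ** 2
                       for x, y in zip(soas, p_visual_first)),
)

# One common convention summarizes the window as the SOA range between
# the 25% and 75% points of the fitted function (z(0.75) ~ 0.674).
tbw = 2 * 0.674 * sigma
print(f"PSS ~ {pss:.0f} ms, TBW ~ {tbw:.0f} ms")
```

The grid search stands in for a proper maximum-likelihood fit; in practice one would use `scipy.optimize.curve_fit` or a psychometric-fitting package instead.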
https://read.qxmd.com/read/35477696/influence-of-sensory-conflict-on-perceived-timing-of-passive-rotation-in-virtual-reality
#48
JOURNAL ARTICLE
William Chung, Michael Barnett-Cowan
Integration of incoming sensory signals from multiple modalities is central in the determination of self-motion perception. With the emergence of consumer virtual reality (VR), it is becoming increasingly common to experience a mismatch in sensory feedback regarding motion when using immersive displays. In this study, we explored whether introducing various discrepancies between the vestibular and visual motion would influence the perceived timing of self-motion. Participants performed a series of temporal-order judgements between an auditory tone and a passive whole-body rotation on a motion platform accompanied by visual feedback using a virtual environment generated through a head-mounted display...
April 5, 2022: Multisensory Research
https://read.qxmd.com/read/36734016/book-review
#49
REVIEW
Adam J Reeves
No abstract text is available yet for this article.
March 9, 2022: Multisensory Research
https://read.qxmd.com/read/35303717/book-review
#50
REVIEW
Adam J Reeves
No abstract text is available yet for this article.
March 9, 2022: Multisensory Research
https://read.qxmd.com/read/35276676/book-review
#51
REVIEW
Adam J Reeves
No abstract text is available yet for this article.
March 9, 2022: Multisensory Research
https://read.qxmd.com/read/35263712/influence-of-tactile-flow-on-visual-heading-perception
#52
JOURNAL ARTICLE
Lisa Rosenblum, Elisa Grewe, Jan Churan, Frank Bremmer
The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among others, self-motion induces distinct optic flow patterns on the retina, vestibular signals and tactile flow, which contribute to determine traveled distance (path integration) or movement direction (heading). While the processing of combined visual-vestibular information is subject to a growing body of literature, the processing of visuo-tactile signals in the context of self-motion has received comparatively little attention...
March 9, 2022: Multisensory Research
https://read.qxmd.com/read/35065535/impacts-of-rotation-axis-and-frequency-on-vestibular-perceptual-thresholds
#53
JOURNAL ARTICLE
Andrew R Wagner, Megan J Kobel, Daniel M Merfeld
In an effort to characterize the factors influencing the perception of self-motion rotational cues, vestibular self-motion perceptual thresholds were measured in 14 subjects for rotations in the roll and pitch planes, as well as in the planes aligned with the anatomic orientation of the vertical semicircular canals (i.e., left anterior, right posterior; LARP, and right anterior, left posterior; RALP). To determine the multisensory influence of concurrent otolith cues, within each plane of motion, thresholds were measured at four discrete frequencies for rotations about earth-horizontal (i...
January 5, 2022: Multisensory Research
https://read.qxmd.com/read/35065537/the-effects-of-mandarin-chinese-lexical-tones-in-sound-shape-and-sound-size-correspondences
#54
JOURNAL ARTICLE
Yen-Han Chang, Mingxue Zhao, Yi-Chuan Chen, Pi-Chun Huang
Crossmodal correspondences refer to the mappings between specific feature domains in different sensory modalities. We investigated how vowels and lexical tones drive sound-shape (rounded or angular) and sound-size (large or small) mappings among native Mandarin Chinese speakers. We used three vowels (/i/, /u/, and /a/), and each vowel was articulated in four lexical tones. In the sound-shape matching, the tendency to match the rounded shape decreased in the following order: /u/, /i/, and /a/. Tone 2 was more likely to be matched to the rounded pattern, whereas Tone 4 was more likely to be matched to the angular pattern...
December 30, 2021: Multisensory Research
https://read.qxmd.com/read/35065536/crossmodal-correspondence-between-auditory-timbre-and-visual-shape
#55
JOURNAL ARTICLE
Daniel Gurman, Colin R McCormick, Raymond M Klein
Crossmodal correspondences are defined as associations between stimuli in different modalities based on seemingly irrelevant stimulus features (e.g., bright shapes being associated with high-pitched sounds). There is a large body of research describing auditory crossmodal correspondences involving pitch and volume, but comparatively little involving auditory timbre, the character or quality of a sound. Adeli and colleagues (2014, Front. Hum. Neurosci. 8, 352) found evidence of correspondences between timbre and visual shape. The present study aimed to replicate Adeli et al...
December 30, 2021: Multisensory Research
https://read.qxmd.com/read/34936982/reducing-cybersickness-in-360-degree-virtual-reality
#56
JOURNAL ARTICLE
Iqra Arshad, Paulo De Mello, Martin Ender, Jason D McEwen, Elisa R Ferré
Despite the technological advancements in Virtual Reality (VR), users are constantly combating feelings of nausea and disorientation, the so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real world is not reflected in the virtual world, and therefore self-motion information is not corroborated by matching visual and vestibular cues, which may trigger symptoms of cybersickness...
December 16, 2021: Multisensory Research
https://read.qxmd.com/read/34690111/imagine-your-crossed-hands-as-uncrossed-visual-imagery-impacts-the-crossed-hands-deficit
#57
JOURNAL ARTICLE
Lisa Lorentz, Kaian Unwalla, David I Shore
Successful interaction with our environment requires accurate tactile localization. Although we seem to localize tactile stimuli effortlessly, the processes underlying this ability are complex. This is evidenced by the crossed-hands deficit, in which tactile localization performance suffers when the hands are crossed. The deficit results from the conflict between an internal reference frame, based in somatotopic coordinates, and an external reference frame, based in external spatial coordinates. Previous evidence in favour of the integration model employed manipulations to the external reference frame (e...
October 22, 2021: Multisensory Research
https://read.qxmd.com/read/34638103/temporal-alignment-but-not-complexity-of-audiovisual-stimuli-influences-crossmodal-duration-percepts
#58
JOURNAL ARTICLE
Alexandra N Scurry, Daniela M Lemus, Fang Jiang
Reliable duration perception is an integral aspect of daily life that impacts everyday perception, motor coordination, and subjective passage of time. The Scalar Expectancy Theory (SET) is a common model that explains how an internal pacemaker, gated by an external stimulus-driven switch, accumulates pulses during sensory events and compares these accumulated pulses to a reference memory duration for subsequent duration estimation. Second-order mechanisms, such as multisensory integration (MSI) and attention, can influence this model and affect duration perception...
October 8, 2021: Multisensory Research
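The Scalar Expectancy Theory model summarized in the abstract above — a pacemaker whose pulses are gated by a switch into an accumulator and compared against a reference memory — can be sketched computationally. This is our illustration, not the authors' implementation; the 50 Hz Poisson pulse rate is an arbitrary assumption:

```python
# Minimal sketch of the Scalar Expectancy Theory pacemaker-accumulator:
# a Poisson pacemaker emits pulses, a stimulus-gated switch routes them
# into an accumulator, and the accumulated count is compared to a
# reference memory to judge duration. Pulse rate is an assumed value.
import random

def accumulate(duration_ms, rate_hz=50.0, rng=None):
    """Count pacemaker pulses gated into the accumulator while the
    switch is closed for duration_ms (Poisson process, simulated via
    exponential inter-pulse intervals)."""
    rng = rng or random.Random()
    t, count = 0.0, 0
    while True:
        t += rng.expovariate(rate_hz) * 1000.0  # ms until the next pulse
        if t > duration_ms:
            return count
        count += 1

def judge(duration_ms, reference_count, rng=None):
    """Compare the probe's pulse count to the remembered reference."""
    return "long" if accumulate(duration_ms, rng=rng) > reference_count else "short"

rng = random.Random(1)
reference = accumulate(500, rng=rng)   # encode a 500-ms standard
print(judge(800, reference, rng=rng))  # a longer probe tends to read as 'long'
```

Second-order influences such as attention or multisensory integration are usually modeled as changes to the switch latency or pacemaker rate in this framework.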
https://read.qxmd.com/read/34592713/serial-dependence-of-emotion-within-and-between-stimulus-sensory-modalities
#59
JOURNAL ARTICLE
Erik Van der Burg, Alexander Toet, Anne-Marie Brouwer, Jan B F Van Erp
How we perceive the world is not solely determined by what we sense at a given moment in time, but also by what we processed recently. Here we investigated whether such serial dependencies for emotional stimuli transfer from one modality to another. Participants were presented a random sequence of emotional sounds and images and instructed to rate the valence and arousal of each stimulus (Experiment 1). For both ratings, we conducted an intertrial analysis, based on whether the rating on the previous trial was low or high...
September 29, 2021: Multisensory Research
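The intertrial analysis described in the abstract above — splitting trials by whether the previous trial's rating was low or high — can be sketched in a few lines. The ratings here are invented; only the grouping logic follows the description:

```python
# Hypothetical sketch of an intertrial (serial-dependence) analysis:
# group each trial's rating by whether the PREVIOUS trial's rating was
# low (<= 5) or high (> 5), then compare group means. Data are made up.
from statistics import mean

ratings = [3, 7, 6, 2, 8, 5, 4, 9, 3, 6]  # invented valence ratings (1-9)

after_low, after_high = [], []
for prev, curr in zip(ratings, ratings[1:]):
    (after_low if prev <= 5 else after_high).append(curr)

# An assimilative serial dependence would show higher current ratings
# after high-rated trials; a repulsive one would show the opposite.
print(mean(after_low), mean(after_high))
```

A full analysis would of course do this per participant and condition and test the difference statistically; the sketch only shows the trial-pairing step.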
https://read.qxmd.com/read/34534967/developmental-changes-in-gaze-behavior-and-the-effects-of-auditory-emotion-word-priming-in-emotional-face-categorization
#60
JOURNAL ARTICLE
Michael Vesker, Daniela Bahn, Christina Kauschke, Gudrun Schwarzer
Social interactions often require the simultaneous processing of emotions from facial expressions and speech. However, the development of the gaze behavior used for emotion recognition, and the effects of speech perception on the visual encoding of facial expressions, are less well understood. We therefore conducted a word-primed face categorization experiment, where participants from multiple age groups (six-year-olds, 12-year-olds, and adults) categorized target facial expressions as positive or negative after priming with valence-congruent or -incongruent auditory emotion words, or no words at all...
September 16, 2021: Multisensory Research