Extending the Architecture of Language From a Multimodal Perspective.
Topics in Cognitive Science 2024 March 18
Language is inherently multimodal. In spoken languages, combined spoken and visual signals (e.g., co-speech gestures) are an integral part of linguistic structure and language representation. This requires an extension of the parallel architecture, which needs to include the visual signals concomitant to speech. We present evidence for the multimodality of language. In addition, we propose that distributional semantics might provide a format for integrating speech and co-speech gestures in a common semantic representation.
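The proposal above rests on distributional semantics: representing meanings as vectors of context co-occurrence counts, so that speech and gesture can live in one shared space. The following is a minimal sketch of that idea, not the authors' implementation; the toy vectors, the `fuse` helper, and the equal weighting are all illustrative assumptions.

```python
import math

# Hypothetical toy co-occurrence vectors (not from the paper's data):
# each signal is represented by counts over shared context features.
speech_vec = {"grasp": 3, "hand": 2, "object": 1}
gesture_vec = {"grasp": 2, "hand": 3, "motion": 1}

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def fuse(u, v, weight=0.5):
    """Weighted sum of a speech vector and a co-speech gesture vector,
    yielding a single multimodal vector in the same feature space."""
    keys = set(u) | set(v)
    return {k: weight * u.get(k, 0) + (1 - weight) * v.get(k, 0) for k in keys}

# Because both modalities share one space, similarity and fusion are
# well-defined across them.
similarity = cosine(speech_vec, gesture_vec)
multimodal = fuse(speech_vec, gesture_vec)
print(round(similarity, 3))  # → 0.857
```

The key design point is that a common feature space makes cross-modal comparison (cosine) and integration (vector fusion) straightforward, which is what a shared semantic format for speech and gesture would require.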