The neural activity of speech sound processing (the N1 component of the auditory ERP) can be suppressed if a speech sound is accompanied by concordant lip movements. Here we demonstrate that this audiovisual interaction is neither speech-specific nor linked to human-like actions, but can be observed with artificial stimuli if their timing is made predictable. In Experiment 1, a pure tone synchronized with a deformation of a rectangle induced a smaller auditory N1 than auditory-only presentations if the temporal occurrence of this audiovisual event was made predictable by two moving disks that touched the rectangle. Local autoregressive average source estimation indicated that this audiovisual interaction may be related to integrative processing in auditory areas. When the moving disks did not precede the audiovisual stimulus—making its onset unpredictable—there was no N1 reduction. In Experiment 2, the predictability of the leading visual signal was manipulated by introducing a temporal asynchrony between the audiovisual event and the collision of the moving disks. Audiovisual events occurred either at the moment the disks collided on the rectangle, before that moment (too "early"), or after it (too "late"). When asynchronies varied from trial to trial—rendering the moving disks unreliable temporal predictors of the audiovisual event—the N1 reduction was abolished. These results demonstrate that the N1 suppression is induced by visual information that both precedes and reliably predicts audiovisual onset, without a necessary link to human action-related neural mechanisms.
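The core comparison in the abstract—a smaller N1 in the predictable audiovisual condition than in the auditory-only condition—can be illustrated with a toy analysis. The sketch below is not the authors' pipeline: the sampling rate, the N1 window (80–120 ms), the trial counts, and the synthetic ERPs are all assumptions chosen for illustration. It simulates epochs for the two conditions and quantifies "suppression" as the difference in mean amplitude within the N1 window.

```python
import numpy as np

FS = 1000                    # sampling rate in Hz (assumed)
N1_WINDOW = (0.080, 0.120)   # assumed N1 window, seconds post-stimulus

def n1_mean_amplitude(epochs, fs=FS, window=N1_WINDOW):
    """Mean amplitude in the N1 window, averaged over trials and samples.

    epochs: array of shape (n_trials, n_samples), time-locked to
    sound onset at sample 0.
    """
    start, stop = (int(t * fs) for t in window)
    return epochs[:, start:stop].mean()

rng = np.random.default_rng(0)
t = np.arange(0, 0.5, 1 / FS)

# Toy ERPs: a negative deflection peaking ~100 ms after sound onset,
# smaller (i.e., suppressed) in the predictable audiovisual condition,
# plus independent trial-by-trial noise. Amplitudes are arbitrary.
n1_shape = -np.exp(-((t - 0.1) ** 2) / (2 * 0.015 ** 2))
auditory_only = 3.0 * n1_shape + rng.normal(0, 0.2, (60, t.size))
audiovisual = 2.0 * n1_shape + rng.normal(0, 0.2, (60, t.size))

# The N1 is a negative deflection, so a suppressed (less negative)
# audiovisual response makes this difference negative.
suppression = n1_mean_amplitude(auditory_only) - n1_mean_amplitude(audiovisual)
print(f"N1 suppression index: {suppression:.3f}")
```

In the real experiment the interesting contrast is this same amplitude difference computed separately for predictable versus unpredictable visual lead-ins: the abstract reports that the difference is present only when the moving disks reliably predict sound onset.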
Journal of Cognitive Neuroscience – MIT Press
Published: Jul 1, 2010