Language switching is an important issue in the study of bilinguals. However, how the nontarget language affects production of the target language under language switching conditions remains unclear, especially for bilinguals who speak two dialects (bidialectals). In the present study, we investigated this issue with a picture-naming task using the event-related potential (ERP) technique. Two groups of proficient bidialectals, Mandarin (L1)–Cantonese (L2) and Cantonese (L1)–Mandarin (L2), participated in the study. They were required to name pictures in Mandarin or Cantonese under language switching and language nonswitching conditions. Participants in both groups showed significantly longer reaction times, a larger P200, and a smaller N400 under language switching conditions than under language nonswitching conditions, reflecting language switching costs. Moreover, the two groups showed different P200 and N400 patterns between the Mandarin–Cantonese (MC) and Cantonese–Mandarin (CM) switching conditions. Specifically, MC bidialectals showed a larger P200 under the CM condition than under the MC condition, whereas CM bidialectals showed the opposite P200 pattern (CM condition < MC condition). For the N400, MC bidialectals showed a smaller N400 under the MC condition than under the CM condition, whereas CM bidialectals showed the reverse N400 pattern (MC condition > CM condition). These findings support the language-unspecific selection theory. Overall, our study is the first to provide electrophysiological evidence of language switching between two dialects.
Neuroreport – Wolters Kluwer Health
Published: Feb 7, 2018