Sensory experience modulates the reorganization of auditory regions for executive processing

https://doi.org/10.1093/brain/awac205 · BRAIN 2022: 145; 3698–3710

Barbara Manini,1,† Valeria Vinogradova,2,† Bencie Woll,1 Donnie Cameron,3 Martin Eimer4 and Velia Cardin1

†These authors contributed equally to this work.

Abstract

Crossmodal plasticity refers to the reorganization of sensory cortices in the absence of their typical main sensory input. Understanding this phenomenon provides insights into brain function and its potential for change and enhancement. Using functional MRI, we investigated how early deafness influences crossmodal plasticity and the organization of executive functions in the adult human brain. Deaf (n = 25; age: mean = 41.68, range = 19–66, SD = 14.38; 16 female, 9 male) and hearing (n = 20; age: mean = 37.50, range = 18–66, SD = 16.85; 15 female, 5 male) participants performed four visual tasks tapping into different components of executive processing: task switching, working memory, planning and inhibition. Our results show that deaf individuals specifically recruit 'auditory' regions during task switching. Neural activity in superior temporal regions, most significantly in the right hemisphere, is a good predictor of behavioural performance during task switching in the group of deaf individuals, highlighting the functional relevance of the observed cortical reorganization. Our results show executive processing in typically sensory regions, suggesting that the development and ultimate role of brain regions are influenced by perceptual environmental experience.
1 Deafness, Cognition and Language Research Centre and Department of Experimental Psychology, UCL, London WC1H 0PD, UK
2 School of Psychology, University of East Anglia, Norwich NR4 7TJ, UK
3 Norwich Medical School, University of East Anglia, Norwich NR4 7TJ, UK
4 Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK

Correspondence to: Velia Cardin
Deafness, Cognition and Language Research Centre and Department of Experimental Psychology
UCL, London WC1H 0PD, UK
E-mail: velia.cardin@ucl.ac.uk

Keywords: deafness; executive function; auditory cortex

Abbreviations: BSL = British Sign Language; BSLGJT = BSL grammaticality judgement task; EGJT = English grammaticality judgement task; HEF = higher executive function; HG = Heschl's gyrus; LEF = lower executive function; pSTC = posterior superior temporal cortex; PT = planum temporale; ROI = region of interest

Received January 26, 2022. Revised April 20, 2022. Accepted May 20, 2022. Advance access publication June 2, 2022.
© The Author(s) 2022. Published by Oxford University Press on behalf of the Guarantors of Brain.

Introduction

Sensory systems feed and interact with all aspects of cognition. As such, it is likely that developmental sensory experience will affect the organization of higher-order cognitive processes such as executive functions. Here we studied executive processing in early deaf individuals to understand the influence of early sensory experience on higher-order cognition and neural reorganization.

Executive functions are higher-order cognitive processes responsible for flexible and goal-directed behaviours, which have been associated with activity in frontoparietal areas of the brain. However, studies on deafness have shown reorganization for visual
working memory in regions typically considered to be part of the auditory cortex.2–5 These working memory responses in auditory regions suggest that, in the absence of early sensory stimulation, a sensory region can change its function as well as the perceptual modality to which it responds.6,7 The adaptation of sensory brain regions to processing information from a different sensory modality is known as crossmodal plasticity.7–19 In deaf individuals, crossmodal plasticity often refers to responses to visual or somatosensory stimuli in regions of the superior temporal cortex that in hearing individuals are typically involved in processing sounds.7–11,14–19 The common assumption here, and in general when referring to crossmodal plasticity, is that the auditory cortex will preserve its sensory processing function, but process a different type of sensory input.

The presence of working memory responses in the auditory regions of deaf individuals takes the concept of crossmodal plasticity further, suggesting that, in the absence of early auditory stimulation, there is a shift from sensory to cognitive processing in such regions. If this is the case, it would suggest that cortical functional specialization for sensory or cognitive processing is partially driven by environmental sensory experience. The aim of our study is to elucidate the role of the auditory cortex of deaf individuals in executive functions, to understand how sensory experience impacts cognitive processing in the brain. Specifically, we tested whether the auditory regions of deaf individuals are involved in cognitive control or whether they have a role in specific subcomponents of executive functions.

To address our aims, we conducted a functional MRI experiment in deaf and hearing individuals. Participants performed tasks tapping into different executive functions: switching, working memory, planning and inhibition. If the auditory cortex of deaf individuals has a role in cognitive control, we would expect all tasks to recruit this region. However, if the auditory areas of deaf individuals are involved in specific subcomponents of executive functioning, these regions will be differentially activated by each of the tasks. If neural activity in the reorganized auditory cortex can predict behavioural performance in deaf individuals, this will corroborate the functional significance of such plasticity effects.20,21

Materials and methods

Participants

There were two groups of participants (see demographics in Supplementary Tables 1 and 2):

(i) Twenty-nine congenitally or early (before 3 years of age) severely-to-profoundly deaf individuals whose first language is British Sign Language (BSL) and/or English (Supplementary Table 3). We recruited a larger number of deaf participants to reflect the language variability of the deaf population in the UK, as discussed in the 'Language assessment' section. Datasets from three deaf participants were excluded from all analyses due to excessive motion in the scanner. One participant was excluded because they had only a mild hearing loss in their best ear (pure-tone average <25 dB). In total, 25 deaf participants were included in the analysis of at least one executive function task (see Supplementary Table 4 for details on exclusions).

(ii) Twenty hearing individuals who are native speakers of English with no knowledge of any sign language.

Deaf and hearing participants were matched on age, gender, reasoning and visuospatial working memory span (Supplementary Table 2).

All participants gave written informed consent. All procedures followed the standards set by the Declaration of Helsinki and were approved by the ethics committee of the School of Psychology at the University of East Anglia and the Norfolk and Norwich University Hospital Research and Development department.

Participants were recruited through public events, social media and participant databases of the University College London Deafness, Cognition and Language Research Centre and the University of East Anglia School of Psychology. Participants were all right-handed (self-reported), had full or corrected vision and no history of neurological conditions. All participants were compensated for their time, travel and accommodation expenses.

General procedure

Participants took part in one behavioural and one scanning session. The sessions took place on the same or different days. The behavioural session included:

(i) Standardized reasoning and working memory tests: the Block Design subtest of the Wechsler Abbreviated Scale of Intelligence and the Corsi Block-tapping test implemented in Psychology Experiment Building Language software (PEBL, http://pebl.sourceforge.net/).
(ii) Language tasks: four tasks were administered to assess language proficiency in English and BSL in deaf participants (see the 'Language assessment' section).
(iii) Pre-scanning training: the training session ensured that participants understood the tasks and reached an accuracy of at least 75%. The tasks were explained in the participant's preferred language (English or BSL). A written description of all the tasks was provided to all participants (deaf and hearing) to support the experimenter's explanation.
(iv) Audiogram screening: pure-tone averages were used to measure the degree of deafness in deaf participants. Copies of audiograms were provided by the participants from their audiology clinics or were collected at the time of testing using a Resonance R17 screening portable audiometer. Participants included in the study had a mean pure-tone average >75 dB across the speech frequency range (0.5, 1, 2 kHz) in both ears (mean = 93.66 ± 7.79 dB; range: 78.33–102.5 dB). Four participants did not provide their audiograms, but they were all congenitally severely or profoundly deaf and communicated with the researchers using BSL or relying on lipreading.

During the scanning session, functional MRI data were acquired while participants performed four visual executive function tasks on switching, working memory, planning and inhibition (see details next). The order of the tasks was counterbalanced across participants.

Experimental design

All tasks were designed so that each had one condition with higher executive demands (higher executive function, HEF) and one with lower demands (lower executive function, LEF) (Fig. 1).

For the switching task, participants had to respond to the shape of geometric objects, i.e. a rectangle and a triangle (Fig. 1).25,26 At the beginning of the run, participants were instructed to press a key with their left hand when they saw a rectangle and with their right hand when they saw a triangle. Each block started with a cue indicating that the task was either to keep the rule used in the previous block ('stay' trials; LEF) or to switch it ('switch' trials; HEF). In the switch trials, participants had to apply the opposite mapping between the shape and the response hand. Each block included the presentation of the instruction cue (200 ms), a fixation cross (500 ms) and two to five task trials. During each trial, a geometrical shape (either a blue rectangle or a blue triangle) was shown at the centre of the screen until the participant responded, for a maximum of 1500 ms. Visual feedback (500 ms) followed the participant's response. There were 230 trials in 80 blocks of either the LEF (40) or the HEF (40) condition. The analysis of the HEF condition only included the first trial of each switch block (see the 'Statistical analysis of behavioural performance' section).

Figure 1 Executive function tasks. Each task had a higher executive demands condition (HEF, purple) and a lower executive demands condition (LEF, peach). See the 'Materials and methods' section for details of the design.

For the working memory task,27,28 we used a visuospatial task contrasted with a perceptual control task (Fig. 1). A visual cue (1500 ms) indicated which task participants should perform. The cue was followed by a 3 × 4 grid. Black squares were displayed two at a time at random locations on the grid for 1000 ms, three times. In the HEF condition, participants were asked to memorize the six locations. They then indicated their cumulative memory for these locations by choosing between two grids in a two-alternative forced-choice paradigm via a button press. The response grids were displayed until the participant responded, or for a maximum of 3750 ms. In the control condition (LEF), participants indicated whether a blue square was present in any of the grids, ignoring the configuration of the highlighted squares. Trials were separated by an inter-trial interval with a duration jittered between 2000 and 3500 ms. Each experimental run had 30 working memory trials and 30 control trials.

For the planning task, we used a computer version of the classic Tower of London task29,30 (Fig. 1). In each trial, two configurations of coloured beads placed on three vertical rods appeared on a grey screen, with the tallest rod containing up to three beads, the middle rod containing up to two beads and the shortest rod containing up to one bead. In the Tower of London condition (HEF), participants had to determine the minimum number of moves needed to transform the starting configuration into the goal configuration following two rules: (i) only one bead can be moved at a time; and (ii) a bead cannot be moved when another bead is on top of it. There were four levels of complexity, depending on the number of moves required (2, 3, 4 and 5). In the control condition (LEF), participants were asked to count the number of yellow and blue beads in both displays. For both conditions, two numbers were displayed at the bottom of the screen: one was the correct response and the other was incorrect by +1 or −1. Participants answered with their left hand when they chose the number on the left side of the screen, and with their right hand when their choice was on the right. The maximum display time for each stimulus was 30 s. The duration of the inter-trial interval was jittered between 2000 and 3500 ms. There were 30 trials in the Tower of London condition and 30 trials in the control condition.

To study inhibitory control, we used Kelly and Milham's version31 of the classic Simon task (https://exhibits.stanford.edu/data/catalog/zs514nn4996). A square appeared on the left or the right side of the fixation cross. The colour of the squares was the relevant aspect of the stimuli, with their position irrelevant to the task. Participants were instructed to respond to the red square with the left hand and the green square with the right hand. In the congruent condition (LEF), the button press response was spatially congruent with the location of the stimulus (e.g. a right-hand response for a square appearing on the right side of the screen) (Fig. 1). In the incongruent condition (HEF), the correct answer was in the opposite location with respect to the stimulus. Half of the trials were congruent, and half were incongruent. Each stimulus was displayed for 700 ms, with a response window of up to 1500 ms. The inter-trial interval was 2500 ms for most trials, with additional blank intervals of 7.5 s (20), 12.5 s (2) and 30 s (1). Participants completed one or two runs of this task, each consisting of a maximum of 200 trials.

Statistical analysis of behavioural performance

Average accuracy (%correct) and reaction times were calculated. For each participant's set of reaction times, we excluded outlier values where participants responded too quickly or took a long time to respond. We did this by calculating each participant's interquartile range separately, and then removing values that were >1.5 interquartile ranges below the first quartile or above the third quartile of the data series. Differences between groups in accuracy or reaction time were investigated with repeated-measures ANOVAs with the between-subjects factor group (hearing, deaf) and the within-subjects factor condition (LEF, HEF).

In the switching task, the accuracy switch cost (SwitchCost_ACC) was calculated as the difference in the percentage of errors (%errors) between the first switch trial of a switch block and all stay trials. The reaction time switch cost (SwitchCost_RT) was calculated as the difference in reaction time between the first switch trial of a switch block and all stay trials.

In the inhibition task, the Simon effect was calculated as the difference in %errors or reaction time between the incongruent and congruent trials.

Image acquisition

Images were acquired at the Norfolk and Norwich University Hospital in Norwich, UK, using a 3 T wide-bore GE 750W MRI scanner and a 64-channel head coil. Communication with the deaf participants occurred in BSL through a closed-circuit camera, or in written English through the screen. An intercom was used for communication with hearing participants. All volunteers were given ear protectors. Stimuli were presented with PsychoPy software (https://psychopy.org) through a laptop (MacBook Pro, Retina, 15-inch, Mid 2015). All stimuli were projected by an AVOTEC Silent Vision projector (https://www.avotecinc.com/high-resolution-projector) onto a screen located at the back of the magnet's bore. Participants watched the screen through a mirror mounted on the head coil. Button responses were recorded via fibre-optic response pad button boxes (https://www.crsltd.com/tools-for-functional-imaging/mr-safe-response-devices/forp/).

Functional imaging data were acquired using a gradient-recalled echo planar imaging sequence [50 slices, repetition time (TR) = 3000 ms, echo time (TE) = 50 ms, field of view (FOV) = 192 × 192 mm, 2 mm slice thickness, distance factor 50%] with an in-plane resolution of 3 × 3 mm. The protocol included six functional scans: five task-based functional MRI scans (switching: 10.5 min, 210 volumes; working memory: 11 min, 220 volumes; planning: 11.5 min, 230 volumes; inhibition: two runs of 10 min, 200 volumes each) and one resting-state scan (part of a different project, to be reported in a different paper). Some participants did not complete all functional scans (Supplementary Table 4). An anatomical T1-weighted scan [IR-FSPGR, inversion time = 400 ms, 1 mm slice thickness] with an in-plane resolution of 1 × 1 mm was acquired during the session.

Raw B0 field map data were acquired using a 2D multi-echo gradient-recalled echo sequence with the following parameters: TR = 700 ms, TE = 4.4 and 6.9 ms, flip angle = 50°, matrix size = 128 × 128, FOV = 240 mm × 240 mm, number of slices = 59, thickness = 2.5 mm and gap = 2.5 mm. Real and imaginary images were reconstructed for each TE to permit calculation of B0 field maps in Hz.33–35

Functional MRI preprocessing

Functional MRI data were analysed with MATLAB 2018a (MathWorks, MA, USA) and Statistical Parametric Mapping software (SPM12; Wellcome Trust Centre for Neuroimaging, London, UK). The anatomical scans were segmented into different tissue classes: grey matter, white matter and CSF. Skull-stripped anatomical images were created by combining the segmented images using the Image Calculation function in SPM (ImCalc, http://tools.robjellis.net). The expression used was: {[i1.*(i2 + i3 + i4)] > threshold}, where i1 was the bias-corrected anatomical scan and i2, i3 and i4 were the tissue images (grey matter, white matter and CSF, respectively). The threshold was adjusted between 0.5 and 0.9 to achieve adequate brain extraction for each participant. Each participant's skull-stripped image was normalized to the standard MNI space (Montreal Neurological Institute), and the deformation field obtained during this step was used for normalization of the functional scans. Susceptibility distortions in the echo-planar images were estimated using a field map that was coregistered to the blood oxygenation level-dependent (BOLD) reference.33,34 Images were realigned using the precalculated phase map, coregistered, slice-time corrected, normalized and smoothed (using an 8-mm full-width at half-maximum Gaussian kernel). All functional scans were checked for motion and artefacts using the ART toolbox (https://www.nitrc.org/projects/artifact_detect).

Functional MRI first-level analysis

The first-level analysis was conducted by fitting a general linear model with regressors of interest for each task (see details next). All events were modelled as a boxcar and convolved with SPM's canonical haemodynamic response function. The motion parameters derived from the realignment of the images were added as regressors of no interest. Regressors were entered into a multiple regression analysis to generate parameter estimates for each regressor at every voxel.

For the switching task, the first trial of each switch block (HEF) and all stay trials (LEF) were modelled as regressors of interest, separately for the left- and right-hand responses. The cues and the remaining switch trials were included as regressors of no interest.

For the working memory task, the conditions of interest were working memory (HEF) and control (LEF). The onset was set at the presentation of the first grid, with the duration set at 3.5 s (i.e. the duration of the three grids plus a 500 ms blank screen before the appearance of the response screen; Fig. 1). Button responses were included separately for each hand and condition as regressors of no interest.

For the planning task, the Tower of London (HEF) and control (LEF) conditions were included in the model as regressors of interest, with onsets at the beginning of each trial and duration set to the trial-specific reaction time. Button responses were modelled separately for each hand as regressors of no interest.

For the inhibition task, four regressors of interest were obtained by combining the visual hemifield where the stimulus appeared with the response hand: (i) right visual hemifield—left hand; (ii) left visual hemifield—right hand; (iii) right visual hemifield—right hand; and (iv) left visual hemifield—left hand. Right visual hemifield—left hand and left visual hemifield—right hand were the incongruent conditions (HEF), whereas right visual hemifield—right hand and left visual hemifield—left hand were the congruent conditions (LEF).

Region of interest analysis

We conducted a region of interest (ROI) analysis to investigate crossmodal plasticity and differences between groups in the auditory cortex. Three auditory regions of the superior temporal cortex were included in this analysis: Heschl's gyrus (HG), the planum temporale (PT) and the posterior superior temporal cortex (pSTC) (Fig. 2). HG and the PT were defined anatomically, using FreeSurfer software (https://surfer.nmr.mgh.harvard.edu). Full descriptions of these procedures can be found elsewhere,38,39 but, in short, each participant's bias-corrected anatomical scan was parcellated and segmented, and voxels with the HG label and the PT label were exported using SPM's ImCalc function (http://robjellis.net/tools/imcalc_documentation.pdf). Participant-specific ROIs were then normalized to the standard MNI space using the deformation field from the normalization step of the preprocessing.

The pSTC was specified following findings from Cardin et al., where a visual working memory crossmodal plasticity effect was found in right and left pSTC in deaf individuals [left: −59, −37, 10; right: 56, −28, −1]. Right and left functional pSTC ROIs were defined using data from Cardin et al., with the contrast [deaf (working memory > control task) > hearing (working memory > control task)] (P < 0.005, uncorrected).

There was an average partial overlap of 8.2 voxels (SD = 6.86) between left PT and left pSTC, with no significant difference in overlap between groups (deaf: mean = 9.92, SD = 7.02; hearing: mean = 6.05, SD = 6.17). To ensure that the two ROIs were independent, common voxels were removed from left PT in a subject-specific manner. Removing the overlapping voxels did not qualitatively change the results.

Parameter estimates for each participant were extracted from each ROI using MarsBaR v.0.44 (http://marsbar.sourceforge.net). The data were analysed using JASP (https://jasp-stats.org) and entered into separate mixed repeated-measures ANOVAs for each task and set of ROIs. Factors in the ANOVAs on the temporal ROIs included: the between-subjects factor group (hearing, deaf) and the within-subjects factors ROI (HG, PT, pSTC), hemisphere (left, right) and condition (LEF, HEF). The Greenhouse–Geisser correction was applied when the assumption of sphericity was violated. Significant interactions and effects of interest were explored with Student's t-tests, or with Mann–Whitney U-tests when the equal variance assumption was violated.

Neural switch cost (BOLD_switch − BOLD_stay) was calculated in ROIs with significant differences between the switch and stay conditions in the deaf group. The average neural activity in all stay trials (BOLD_stay) was subtracted from the average activity in the first switch trials (BOLD_switch), and then averaged across ROIs separately in the right and left hemispheres.

For the multiple linear regression analyses, the data were analysed using a backward data entry method in JASP. The default stepping method criteria were used, where predictors with P < 0.05 are entered into the model and those with P > 0.1 are removed until all predictors fall within these criteria. SwitchCost_RT and SwitchCost_ACC were entered as dependent variables in separate analyses. Each regression analysis had three covariates: neural switch cost in the right hemisphere, neural switch cost in the left hemisphere and language.

Data availability

The data and analysis files for this paper can be found at https://osf.io/uh2ap/.

Results

Behavioural results

Deaf (n = 25) and hearing (n = 20) individuals were scanned while performing four executive function tasks: switching, working memory, planning and inhibition (Fig. 1). Behavioural results from all tasks are shown in Fig. 3.
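The reaction-time trimming and switch-cost measures described in the 'Statistical analysis of behavioural performance' section can be sketched as follows. This is a minimal illustration of the stated rules (values more than 1.5 interquartile ranges beyond the first or third quartile are excluded; costs are first-switch-trial minus stay-trial differences); the function and variable names are our own and are not taken from the authors' analysis code.

```python
import numpy as np

def iqr_filter(rts):
    """Drop reaction times >1.5 IQRs below Q1 or above Q3, per participant."""
    q1, q3 = np.percentile(rts, [25, 75])
    iqr = q3 - q1
    keep = (rts >= q1 - 1.5 * iqr) & (rts <= q3 + 1.5 * iqr)
    return rts[keep]

def switch_cost_rt(first_switch_rts, stay_rts):
    """SwitchCost_RT: mean RT on first switch trials minus mean RT on stay trials."""
    return float(np.mean(iqr_filter(first_switch_rts)) - np.mean(iqr_filter(stay_rts)))

def switch_cost_acc(pct_errors_switch, pct_errors_stay):
    """SwitchCost_ACC: %errors on first switch trials minus %errors on stay trials."""
    return pct_errors_switch - pct_errors_stay
```

The Simon effect follows the same pattern, with incongruent and congruent trials taking the place of switch and stay trials.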
To explore differences in performance between groups, we conducted 2 × 2 repeated-measures ANOVAs for each task, with either accuracy or reaction time as the dependent variable, the between-subjects factor group (hearing, deaf) and the within-subjects factor condition (HEF, LEF). Results show a significant main effect of condition for both accuracy and reaction time in all tasks, confirming that the HEF condition was more difficult and demanding than the LEF condition (Supplementary Table 5).

The group of deaf individuals had significantly slower reaction times in all tasks (Supplementary Table 5). Switching was the only task where there was a significant main effect of group on accuracy [F(1,41) = 4.32, P = 0.04, η² = 0.09], as well as a Condition × Group interaction [F(1,41) = 4.98, P = 0.03, η² = 0.11]. A post hoc t-test revealed a significant between-groups difference, where the group of deaf individuals was significantly less accurate than the group of hearing individuals in the switch condition [t(41) = −2.22, P = 0.03, Cohen's d = 0.68]. The difference in SwitchCost_ACC (%errors_switch − %errors_stay) reflects the significant interaction, with the deaf group [mean = 10.24, SD = 9.89, t(22) = 4.96, P < 0.001, d = 1.03] having a larger SwitchCost_ACC than the hearing group [mean = 4.18, SD = 7.53, t(19) = 2.49, P = 0.02, d = 0.56; Fig. 3A].

Language assessment

We recruited a representative group of the British deaf population, who usually have different levels of proficiency in sign and spoken language. This was: (i) to study plasticity in a representative group of deaf individuals; and (ii) to study the relationship between language experience and the organization of cognitive networks of the brain, which will be reported in a separate paper.

To assess the language proficiency of deaf participants, we chose grammaticality judgement tests measuring language skills in English and BSL. The BSL grammaticality judgement task (BSLGJT) is described in Cormier et al., and the English grammaticality judgement task (EGJT) was designed based on examples from Linebarger et al. The BSLGJT and the EGJT use a single method of assessing grammaticality judgements of different syntactic structures in English and BSL. Grammaticality judgement tests have been used with deaf participants before and have proved to be efficient in detecting differences in language proficiency among participants with varying ages of acquisition.42,44 Deaf participants performed both the BSL and English tests if they knew both languages, or only the English tests if they did not know BSL.

To control for potential language proficiency effects, we combined results from the EGJT and BSLGJT to create a single, modality-independent measure of language proficiency in the deaf group. Accuracy scores in the EGJT (%correct; mean = 83.51, SD = 11.4, n = 25) and BSLGJT (mean = 77.88, SD = 13.1, n = 21) were transformed into z-scores separately for each test. For each participant, the EGJT and BSLGJT z-scores were then compared, and the higher one was chosen as a combined modality-independent language proficiency score (Supplementary Fig. 1).

Multiple linear regression

Multiple linear regression analyses were conducted to investigate whether neural activity in the superior temporal cortex of deaf individuals can predict performance in the switching task. The data were analysed using a backward data entry method in JASP.

Functional MRI results

Functional MRI results show that all executive function tasks activated typical frontoparietal regions in both groups of participants (Supplementary Fig. 2). There were significantly stronger activations in the HEF condition in the switching, working memory and planning tasks. These included commonly found activations in frontoparietal areas, such as the dorsolateral prefrontal cortex, frontal eye fields, presupplementary motor area and intraparietal sulcus. In the inhibition task, the HEF incongruent condition resulted in stronger activation in the intraparietal sulcus and left frontal eye fields, but there were no significant differences between conditions.

Figure 2 Temporal ROIs. Temporal regions included in the analysis: HG, PT and pSTC. HG and PT were defined anatomically, in a subject-specific manner, using the FreeSurfer software package. The figure shows the overlap of all subject-specific ROIs. Common voxels between left PT and left pSTC have been subtracted from left PT (see the 'Materials and methods' section). The pSTC was defined functionally, based on the findings of Cardin et al. (see the 'Materials and methods' section).

To investigate crossmodal plasticity and executive processing in the auditory cortex of deaf individuals, we conducted a ROI analysis on superior temporal auditory ROIs. These included: HG, the PT and the pSTC (Fig. 2). Differences and interactions between groups are discussed next; we first present results from the switching task, where we observed the strongest activations of temporal ROIs in the deaf group (Fig. 4). Results from all other tasks are discussed in the following subsection.

We conducted two separate multiple linear regression analyses, one with SwitchCost_RT and one with SwitchCost_ACC as dependent variables (Table 2). The covariates included in the model were: right hemisphere neural switch cost, left hemisphere neural switch cost and language z-scores. For the neural switch cost covariates, data were averaged from ROIs in the right and left hemispheres to reduce the number of dimensions in the multiple linear regression models.
To do this, we calculated the neural switch cost (BOLD_switch − BOLD_stay) for each ROI with significant differences in activity between the switch and stay conditions in the deaf group (Fig. 4 and Supplementary Table 6), and we then averaged the neural switch cost separately for ROIs in the right and left hemispheres. We also included language as a covariate in our models, because language proficiency has been shown to modulate performance in executive function tasks in deaf individuals.45–48

Task switching activates auditory areas in deaf individuals and this activation predicts behaviour

Of the four tasks that we tested, only in the switching task did we find both a significant main effect of group [F(1,41) = 15.48, P < 0.001, η² = 0.27] and a significant Group × Condition interaction [F(1,41) = 4.75, P = 0.03, η² = 0.10] (Table 1). The interaction was driven by a significant difference between conditions in the deaf group, but not in the hearing group [deaf_HEFvLEF: t(22) = 4.06, P < 0.001, d = 0.85; hearing_HEFvLEF: t(19) = 0.26, P = 0.79, d = 0.06]. To test whether differences between the switch and stay conditions were significant in all ROIs, we conducted post hoc t-tests in each ROI and group. This accounted for a total of 12 separate t-tests, and to correct for multiple comparisons, we only considered significant those results with P < 0.004 (P < 0.05/12 = 0.004; corrected P < 0.05). We found significant differences between the switch and stay conditions in all the left hemisphere ROIs, and in the right PT and right pSTC, in the deaf group (Fig. 4 and Supplementary Table 6).

To investigate the behavioural relevance of the observed crossmodal plasticity, we evaluated whether neural activity in the superior temporal cortex of deaf individuals can predict performance. Results from the multiple linear regression analysis using backward data entry show that neural activity in temporal ROIs can significantly predict SwitchCost_RT in the deaf group (Table 2). The most significant model included both right and left hemisphere neural switch cost as covariates, and explained 40.6% of the variance [F(2,18) = 6.15, P = 0.009, R² = 0.41, adjusted R² = 0.34; Table 2, top section]. There was a positive association between SwitchCost_RT and neural switch cost in right hemisphere temporal areas (B = 0.04, SE = 0.01, β = 0.99; P = 0.003). This means that for every unit increase in neural switch cost in right temporal areas, there is an increase of 40 ms in SwitchCost_RT. In standardized terms, as neural switch cost increases by 1 SD, SwitchCost_RT increases by 0.99 SDs. On the other hand, there was a negative association between left hemisphere neural switch cost and SwitchCost_RT. However, this was only significant in the full model (P = 0.031, B = −0.02, SE = 0.01, β = −0.69), but not in the best model (P = 0.05, B = −0.02, SE = 0.01, β = −0.61; Table 2). There was no significant association between SwitchCost_RT and language (B = −0.06, SE = 0.05, β = −0.23; P = 0.22).

Figure 3 Behavioural performance. The figure shows average accuracy (%correct) and reaction time (RT, in seconds) for each task and condition in the hearing and the deaf groups. It also shows the average switch costs and Simon effects for both accuracy and reaction time in each group. The SwitchCost_ACC and Simon effect are calculated and plotted using %error instead of %correct, so that larger values indicate an increase in cost. Only the first trials of the switch blocks were included in the HEF condition. The bold lines in the box plots indicate the median. The lower and upper hinges correspond to the first and third quartiles. Statistically significant (P < 0.05) differences between conditions are not shown in the figure, but were found for all tasks in both groups (Supplementary Table 5). **P < 0.01; *P < 0.05.

When evaluating whether neural switch cost could also predict SwitchCost_ACC, we found no significant association between these variables (Table 2, bottom section). Instead, the most significant model included only language as a regressor (Table 2), explaining 20.7% of the variance [F(1,19) = 4.96, P = 0.04, R² = 0.21, adjusted R² = 0.16]. For every unit increase in language z-scores, there is a decrease of 12.6 units in SwitchCost_ACC. In standardized terms, as language z-scores increased by 1 SD, SwitchCost_ACC decreased by 0.45 SDs.

Recruitment of auditory areas in deaf individuals is not ubiquitous across executive function tasks

Results from the working memory, planning and inhibition tasks are shown in Fig. 5. In the working memory task, there was a

Discussion

We investigated how early sensory experience impacts the organization of executive processing in the brain. We found that, in deaf individuals, primary and secondary auditory areas are recruited during a visual switching task. These results indicate that the sensory or cognitive specialization of cortical regions in the adult brain can be influenced by developmental sensory experience. It is possible that an early absence of auditory inputs results in a shift of functions in regions typically involved in auditory processing, with these regions then adopting a role in specific components of executive processing. Neural activity in temporal regions during the switching task predicted performance in deaf individuals, highlighting the behavioural relevance of this functional shift. Our design allowed us to thoroughly examine the role of auditory regions in different executive function tasks and determine whether these regions are involved in cognitive control.
Previous studies have suggested an involvement of auditory cortex during 4,5 higher-order cognitive tasks in deaf individuals, but given the fo- cus on a single task, with an experimental and control condition, they cannot inform whether plasticity effects are specific to the de- mands of the task. Our design included four different visuospatial executive function tasks, all with an experimental (HEF) and control (LEF) condition, probing a variety of executive processes. We found that the HEF condition in all tasks recruited frontoparietal areas Figure 4 Switching task analysis. (A) Neural activity in temporal ROIs. typically involved in executive functioning and cognitive control. ***P< 0.005; ****P < 0.001. (B) Partial correlation plot between However, only switching resulted in significant activations in tem- SwitchCost and neural switch cost in right temporal ROIs in the group RT of deaf individuals. Partial correlation from a multiple linear model with poral auditory regions in the deaf group. This finding demonstrates SwitchCost as the dependent variable and the following covariates: RT that the auditory cortex of deaf individuals serves a specific sub- right hemisphere neural switch cost, left hemisphere neural switch component of executive functioning during switching, and not a cost, and language. shared computation across tasks, such as cognitive control. This was not only found in higher-order auditory areas, but also in the left HG, showing that a functional shift towards cognition can in- deed occur in primary sensory regions. A significant activation dur- significant Condition× Group interaction [Table 1, F(1,41) = 6.41, P= ing the switching condition in the left, but not the right HG, 0.01, η = 0.13], but differences between conditions within each provides further evidence for different roles of left and right tem- group were not significant [hearing : t(18) =−1.74, P = 0.10, d HEFvLEF poral regions in deaf individuals (see Cardin et al. 
for a review). =−0.40; deaf : t(23) = 1.81, P= 0.08, d = 0.37]. In the planning HEFvLEF Differences in the recruitment of the left and right HG in this study task, there was a significant main effect of group [F(1,38)= 5.85, P may be linked to the specialization of these regions for sound pro- = 0.02, η = 0.13], but this was driven by significant deactivations cessing in hearing individuals. In this group, left HG is specialized in the hearing group [t(18)=−4.47, P < 0.001, d =−1.00], with no sig- for the temporal processing of auditory signals, whereas the right nificant difference in activity from baseline in the deaf group HG shows stronger sensitivity to spectral components. The [t(20)=−1.31, P = 0.21, d = −0.29]. In the inhibition task, there was switching task in this study requires tracking a sequence of stimuli a significant interaction between ROI and Group [F(1.89,66.05= in time, while the extraction of spectral or frequency information is 3.92, P = 0.03, η = 0.10]. However, there were no significant differ- not needed in this task, which could explain the different recruit- ences between groups in any ROI (https://osf.io/9fuec). Instead, ment of HG across hemispheres. The fact that the right HG was the ROI × Group interaction was driven by a main effect of ROI in not recruited during the switching task, while right PT and pSTC the deaf group (higher activations for PT and pSTC than HG, were, also suggests a functional difference in crossmodal plasticity https://osf.io/2z35e/), which was not present in the hearing group between primary and secondary auditory regions. Primary auditory (https://osf.io/gmy6v/). Table 1 Group main effects and group interactions for all tasks in the ROI analysis Switching Working memory Planning Inhibition F (d.f.) PF (d.f.) PF (d.f.) PF (d.f.) 
Table 1 Group main effects and group interactions for all tasks in the ROI analysis

Effect               Switching                Working memory           Planning                 Inhibition
                     F (d.f.)          P      F (d.f.)          P      F (d.f.)          P      F (d.f.)          P
Group                15.48 (1,41)      <0.001 0.04 (1,41)       0.85   5.85 (1,38)       0.02   0.03 (1,35)       0.87
Condition × Group    4.75 (1,41)       0.03   6.40 (1,41)       0.01   0.56 (1,38)       0.46   0.18 (1,35)       0.67
ROI × Group          3.42 (1.9,79.1)   0.04   1.18 (1.7,68.4)   0.30   0.73 (1.7,64.6)   0.46   3.92 (1.9,66.1)   0.03
Hemisphere × Group   0.009 (1,41)      0.92   0.01 (1,41)       0.93   0.46 (1,38)       0.50   0.30 (1,35)       0.59

Significant results are indicated in bold. Full results for each ANOVA can be found in OSF: https://osf.io/dt827/.

Table 2 Multiple linear regression predicting behavioural performance in the switching task

SwitchCost_RT

Model summary
Model   R²     Adjusted R²   F      P
1       0.46   0.36          4.78   0.01
2       0.41   0.34          6.15   0.009

Coefficients
Model   Term                    Unstandardized   SE     Standardized   t       P
1       (Intercept)             0.03             0.04                  0.63    0.53
        Language score          −0.06            0.05   −0.23          −1.27   0.22
        LH neural switch cost   −0.02            0.01   −0.69          −2.35   0.03
        RH neural switch cost   0.04             0.01   1.05           3.60    0.002
2       (Intercept)             −0.01            0.03                  −0.51   0.62
        LH neural switch cost   −0.02            0.01   −0.61          −2.09   0.05
        RH neural switch cost   0.04             0.01   0.99           3.39    0.003

SwitchCost_ACC

Model summary
Model   R²     Adjusted R²   F      P
1       0.28   0.16          2.26   0.12
2       0.28   0.20          3.55   0.05
3       0.21   0.16          4.96   0.04

Coefficients
Model   Term                    Unstandardized   SE     Standardized   t       P
1       (Intercept)             15.84            4.82                  3.28    0.004
        Language score          −12.85           5.83   −0.46          −2.20   0.04
        LH neural switch cost   −0.28            1.25   −0.08          −0.22   0.82
        RH neural switch cost   1.41             1.41   0.33           1.00    0.33
2       (Intercept)             15.78            4.69                  3.37    0.003
        Language score          −12.57           5.55   −0.45          −2.27   0.04
        RH neural switch cost   1.16             0.84   0.27           1.38    0.18
3       (Intercept)             18.90            4.20                  4.50    <0.001
        Language score          −12.64           5.68   −0.45          −2.23   0.04

Significant results are indicated in bold. LH = left hemisphere; RH = right hemisphere; SE = standard error.
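To make the relationship between the unstandardized (B) and standardized (β) coefficients in Table 2 concrete, the sketch below fits an ordinary least-squares model to simulated data. The covariate values and effect sizes are invented, and plain OLS on a fixed set of covariates stands in for the backward-entry procedure used in the paper; only the coefficient conversions mirror the text.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 21  # illustrative sample size, chosen to match df = (2, 18) + intercept

# Invented covariates (not the study's data): hemisphere-averaged neural
# switch costs (arbitrary units) and a language z-score.
rh_cost = rng.normal(2.0, 1.5, n)
lh_cost = rng.normal(2.0, 1.5, n)
language = rng.normal(0.0, 1.0, n)

# Simulate SwitchCost_RT in seconds with a positive right-hemisphere
# effect of 0.04 s per unit, mirroring the reported B = 0.04.
y = 0.04 * rh_cost - 0.02 * lh_cost + rng.normal(0.0, 0.03, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), rh_cost, lh_cost, language])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b_rh = coef[1]  # recovered unstandardized slope for the RH covariate

# An unstandardized B of 0.04 s per unit is 40 ms per unit of switch cost.
ms_per_unit = 0.04 * 1000
print(int(ms_per_unit))  # 40

# The standardized beta rescales B by the SD ratio of predictor and outcome,
# which is how "1 SD in neural switch cost -> 0.99 SD in SwitchCost_RT"
# relates to B = 0.04 in the text.
beta_rh = b_rh * rh_cost.std(ddof=1) / y.std(ddof=1)
```

With these simulated data, `b_rh` recovers a value close to the generating 0.04 s per unit; on the real data the analogous rescaling links the B and β columns of Table 2.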
Primary auditory regions are the first cortical relay of auditory inputs and have stronger subcortical inputs from the thalamus, while secondary regions might be more likely to be modulated by top-down influences, potentially driving plastic reorganization in different directions. Further studies focusing on finer-grain mapping of crossmodal plasticity effects in the auditory cortex of deaf individuals are needed to elucidate these processes.

Task switching requires cognitive flexibility and shifting between different sets of rules.51,52 Shifting is considered one of the core components of executive control. It is defined as the ability to flexibly shift 'back and forth between multiple tasks, operations or mental sets'. Shifting is also an important component of working memory tasks previously shown to recruit posterior superior temporal regions in deaf individuals (e.g. two-back working memory, visuospatial delayed recognition4,5). In the present study, the working memory task did not significantly activate any temporal ROIs. The working memory task used in this study requires updating of information and incremental storage, but no shifting between targets or internal representations of stimuli, as required in an n-back task. Together, these results suggest that previous working memory effects in superior temporal regions are not necessarily linked to storage, updating or control, but are more likely linked to shifting between tasks or mental states.

A change of function in the auditory cortex, specifically in the right hemisphere, could be explained by the anatomical proximity to the middle temporal lobe or to the parietal lobe, specifically the temporoparietal junction (TPJ).7,54 Right TPJ is a multisensory associative region involved in the reorientation of attention to task-relevant information, such as contextual cues or target stimuli.55,56 Regions of the right middle temporal gyrus have also been shown to be involved in task switching and to encode task-set representations. In the absence of auditory inputs throughout development, the proximity to the TPJ and the middle temporal gyrus may result in changes in the microcircuitry or in the computations performed by the adjacent auditory cortices, where these regions now perform computations that allow switching between tasks.7,54,58 This is particularly relevant for the right hemisphere, where activity in auditory regions was more strongly linked to behavioural outcomes in the switching task in the group of deaf individuals.

Another possibility is that the recruitment of 'auditory' temporal regions for switching observed in deaf adults reflects vestigial functional organization present in early stages of development. Research on hearing children has found activations in bilateral occipital and superior temporal cortices during task switching, with a similar anatomical distribution to the one we find here. Our findings in deaf individuals suggest that executive processing in temporal cortices could be 'displaced' by persistent auditory inputs that, as the individual develops, may require more refined processing or demanding computations. Thus, an alternative view is that regions considered to be 'sensory' have mixed functions in infants and become more specialized in adults. These regions could follow different developmental pathways influenced by environmental sensory experience. As such, the temporal regions of hearing individuals will become progressively more specialized for sound processing, whereas, in deaf individuals, they will become more specialized for subcomponents of executive processing.

Figure 5 ROI results from the working memory (A), planning (B) and inhibition (C) tasks. Ctr = control; WM = working memory; ToL = Tower of London; Con = congruent; Inc = incongruent.

The direct relationship between behavioural outcomes and activity in reorganized cortical areas is robust evidence of the functional importance of the observed crossmodal plasticity. We found that neural activity, specifically in the right temporal ROI, predicted reaction times in the switching task in the deaf group. Specifically, higher neural switch cost was linked to higher reaction time switch cost (SwitchCost_RT), which suggests effortful processing, as previously described in other cognitive tasks with different levels of complexity.60,61 It is important to highlight that there were no differences in SwitchCost_RT between the groups, showing that the potential reliance on different neural substrates to solve the switching task does not translate into differences in performance. In fact, significant interactions between group and condition for the switching task were only found in accuracy (SwitchCost_ACC), which in our analysis was not predicted by neural activity, but rather by language proficiency. Executive performance has been previously associated with language proficiency in deaf children.47,48,62–64 While in our study language z-scores predict only 20.7% of the variance in SwitchCost_ACC and the model was only significant at P < 0.05, our findings suggest that language development can have long-lasting effects on executive processing throughout the lifespan. Different theories propose that language can provide the necessary framework for higher-order (if–if–then) rules to develop and be used in a dynamic task in the most efficient way.65,66 These hierarchical 'if–then' rules could be implemented, in an automatic way, to solve the arbitrary link between stimulus and response during switching. Although participants are not required to use linguistic strategies during switching, we speculate that those who have benefited from the efficiency associated with developing such frameworks can invest fewer cognitive resources into solving this task. While the role of language in executive processing needs further investigation, it is important to consider that the timely development of a first language may boost the overall efficiency of a cognitive task, in this case switching, regardless of whether the task itself allows implementation of purely linguistic mechanisms.

It is important to take into account that all signers of BSL are bilingual to a greater or lesser degree, depending on their early language background, degrees of deafness and educational experiences. Bilinguals who frequently change languages have generally been shown to have an advantage in executive function switching tasks.68–70 However, it is unlikely that differences in bilingualism can explain our findings in this study. If different results between deaf and hearing participants were due to the presence or absence of bilingualism, we would have expected the group of deaf individuals to have a behavioural advantage in the switching task, but we found the opposite. In addition, we have previously shown that working memory responses in the superior temporal cortex of deaf individuals cannot be explained by bilingualism. In our previous study, we compared deaf native signers to two groups of hearing individuals: (i) hearing native signers, who were bilingual in English and BSL (bimodal bilinguals); and (ii) hearing non-signers, who were bilingual in English and another spoken language (unimodal bilinguals). These three populations were comparably proficient in both their languages. We found differences in the recruitment of superior temporal regions between deaf individuals and both groups of hearing participants during a working memory task, suggesting a crossmodal plasticity effect driven by different sensory experience. These effects in the superior temporal cortex could not be explained by bilingualism, because this was controlled across groups. In the present study, significant activations during the switching condition were found in the same areas where we previously found working memory activations in deaf individuals (left and right pSTC, which were defined functionally based on our previous findings; see the 'Materials and methods' section), suggesting that these regions are involved in specific subcomponents of executive processing as a consequence of early deafness.

In addition, as a group, deaf participants had significantly longer reaction times in all tasks. This is at odds with behavioural results from studies of deaf native signers, where the performance of this group in executive function tasks is comparable to, or faster than, that of typically hearing individuals (e.g. Hauser et al.,46 Marshall et al.48 and Cardin et al.). Native signers achieve language development milestones at the same rate as hearing individuals learning a spoken language, highlighting again the importance of early language access, not only for communication but also for executive processing. Deaf individuals also have faster reaction times in studies of visual reactivity,21,71 suggesting critical differences in performance between purely perceptual tasks and those which weigh more strongly on executive demands, where language experience and early language acquisition could have a longer-lasting effect throughout the lifespan.

In conclusion, we show that components of executive processing, such as switching, can be influenced by early sensory experience. Our results suggest that, in the absence of auditory inputs, superior temporal regions can take on functions other than sensory processing. This could be either by preserving a function these areas performed early in childhood or by taking on new functions driven by top-down projections from frontoparietal areas or adjacent temporal and parietal regions.

Acknowledgements

The authors would like to specially thank all the deaf and hearing participants who took part in this study.

Funding

This work was funded by a grant from the Biotechnology and Biological Sciences Research Council (BBSRC; BB/P019994). V.V. is funded by a scholarship from the University of East Anglia.

Competing interests

The authors report no competing interests.

Supplementary material

Supplementary material is available at Brain online.

References

1. D'Esposito M, Grossman M. The physiological basis of executive function and working memory. The Neuroscientist. 1996;2:345–352.
2. Andin J, Holmer E, Schönström K, Rudner M. Working memory for signs with poor visual resolution: fMRI evidence of reorganization of auditory cortex in deaf signers. Cereb Cortex. 2021;31:3165–3176.
3. Buchsbaum B, Pickell B, Love T, Hatrak M, Bellugi U, Hickok G. Neural substrates for verbal working memory in deaf signers: fMRI study and lesion case report. Brain Language. 2005;95:265–.
4. Cardin V, Rudner M, de Oliveira RF, et al. The organization of working memory networks is shaped by early sensory experience. Cerebral Cortex. 2018;28:3540–3554.
5. Ding H, Qin W, Liang M, et al. Cross-modal activation of auditory regions during visuo-spatial working memory in early deafness. Brain. 2015;138:2750–2765.
6. Bedny M. Evidence from blindness for a cognitively pluripotent cortex. Trends Cogn Sci. 2017;21:637–648.
7. Cardin V, Grin K, Vinogradova V, Manini B. Crossmodal reorganisation in deafness: Mechanisms for functional preservation and functional change. Neurosci Biobehav Rev. 2020;113:227–237.
8. Bola Ł, Zimmermann M, Mostowski P, et al. Task-specific reorganization of the auditory cortex in deaf humans. Proc Natl Acad Sci USA. 2017;114:E600–E609.
9. Corina DP, Blau S, LaMarr T, Lawyer LA, Coffey-Corina S. Auditory and visual electrophysiology of deaf children with cochlear implants: Implications for cross-modal plasticity. Front Psychol. 2017;8:59.
10. Simon M, Campbell E, Genest F, MacLean MW, Champoux F, Lepore F. The impact of early deafness on brain plasticity: A systematic review of the white and gray matter changes. Front Neurosci. 2020;14:206.
11. Lomber SG, Meredith MA, Kral A. Cross-modal plasticity in specific auditory cortices underlies visual compensations in the deaf. Nat Neurosci. 2010;13:1421.
12. Twomey T, Waters D, Price CJ, Evans S, MacSweeney M. How auditory experience differentially influences the function of left and right superior temporal cortices. J Neurosci. 2017;37:9564–9573.
13. Cardin V, Campbell R, MacSweeney M, Holmer E, Rönnberg J, Rudner M. Neurobiological insights from the study of deafness and sign language. In: Morgan G, ed. Understanding deafness, language and cognitive development. Essays in Honour of Bencie Woll. Vol. 25. John Benjamins Publishing Company; 2020:159–181.
14. Frasnelli J, Collignon O, Voss P, Lepore F. Crossmodal plasticity in sensory loss. Progr Brain Res. 2011;191:233–249.
15. Heimler B, Striem-Amit E, Amedi A. Origins of task-specific sensory-independent organization in the visual and auditory brain: Neuroscience evidence, open questions and clinical implications. Curr Opin Neurobiol. 2015;35:169–177.
16. Kral A. Unimodal and cross-modal plasticity in the 'deaf' auditory cortex. Int J Audiol. 2007;46:479–493.
17. Merabet LB, Pascual-Leone A. Neural reorganization following sensory loss: The opportunity of change. Nat Rev Neurosci. 2010;11:44–52.
18. Ricciardi E, Bottari D, Ptito M, Roder B, Pietrini P. The sensory-deprived brain as a unique tool to understand brain development and function. Neurosci Biobehav Rev. 2020;108:78–82.
19. Land R, Baumhoff P, Tillein J, Lomber SG, Hubka P, Kral A. Cross-modal plasticity in higher-order auditory cortex of congenitally deaf cats does not limit auditory responsiveness to cochlear implants. J Neurosci. 2016;36:6175–6185.
20. Bottari D, Caclin A, Giard MH, Pavani F. Changes in early cortical visual processing predict enhanced reactivity in deaf individuals. PLoS ONE. 2011;6:e25607.
21. Pavani F, Bottari D. Visual abilities in individuals with profound deafness: A critical review. In: Murray MM, Wallace MT, eds. The neural bases of multisensory processes. CRC Press/Taylor & Francis; 2012:421–446.
22. Wechsler D. WASI: Wechsler abbreviated scale of intelligence. The Psychological Corporation; 1999.
23. Corsi P. Memory and the medial temporal region of the brain. Unpublished doctoral dissertation. McGill University; 1972.
24. Mueller ST, Piper BJ. The psychology experiment building language (PEBL) and PEBL test battery. J Neurosci Methods. 2014;222:250–259.
25. Rubinstein JS, Meyer DE, Evans JE. Executive control of cognitive processes in task switching. J Exp Psychol: Hum Percept Perform. 2001;27:763–797.
26. Rushworth MFS, Hadland KA, Paus T, Sipila PK. Role of the human medial frontal cortex in task switching: A combined fMRI and TMS study. J Neurophysiol. 2002;87:2577–2592.
27. Fedorenko E, Behr MK, Kanwisher N. Functional specificity for high-level linguistic processing in the human brain. Proc Natl Acad Sci USA. 2011;108:16428–16433.
28. Fedorenko E, Duncan J, Kanwisher N. Broad domain generality in focal regions of frontal and parietal cortex. Proc Natl Acad Sci USA. 2013;110:16616–16621.
29. Morris RG, Ahmed S, Syed GM, Toone BK. Neural correlates of planning ability: Frontal lobe activation during the Tower of London test. Neuropsychologia. 1993;31:1367–1378.
30. van den Heuvel OA, Groenewegen HJ, Barkhof F, Lazeron RHC, van Dyck R, Veltman DJ. Frontostriatal system in planning complexity: A parametric functional magnetic resonance version of Tower of London task. NeuroImage. 2003;18:367–374.
31. Kelly A, Milham M. Simon task. Stanford Digital Repository. Available at: http://purl.stanford.edu/zs514nn4996 and https://openfmri.org/dataset/ds000101/. Published online 2016.
32. Peirce JW. PsychoPy—Psychophysics software in Python. J Neurosci Methods. 2007;162:8–13.
33. Fessler JA, Lee S, Olafsson VT, Shi HR, Noll DC. Toeplitz-based iterative image reconstruction for MRI with correction for magnetic field inhomogeneity. IEEE Trans Signal Process. 2005;53:3393–3402.
34. Funai AK, Fessler JA, Yeo DTB, Olafsson VT, Noll DC. Regularized field map estimation in MRI. IEEE Trans Med Imaging. 2008;27:1484–1494.
35. Jezzard P, Balaban RS. Correction for geometric distortion in echo planar images from B0 field variations. Magn Reson Med. 1995;34:65–73.
36. Penny WD, Friston KJ, Ashburner JT, Kiebel SJ, Nichols TE. Statistical parametric mapping: The analysis of functional brain images. Academic Press; 2011.
37. Fischl B. FreeSurfer. NeuroImage. 2012;62:774–781.
38. Fischl B, Salat DH, Busa E, et al. Whole brain segmentation: Automated labeling of neuroanatomical structures in the human brain. Neuron. 2002;33:341–355.
39. Dale AM, Fischl B, Sereno MI. Cortical surface-based analysis: I. Segmentation and surface reconstruction. NeuroImage. 1999;9:179–194.
40. Brett M, Anton J-L, Valabregue R, Poline J-B. Region of interest analysis using the MarsBar toolbox for SPM 99. NeuroImage. 2002;16:S497.
41. JASP Team. JASP (Version 0.14.1) [Computer software]. Published online 2020.
42. Cormier K, Schembri A, Vinson D, Orfanidou E. First language acquisition differs from second language acquisition in prelingually deaf signers: Evidence from sensitivity to grammaticality judgement in British Sign Language. Cognition. 2012;124:50–65.
43. Linebarger MC, Schwartz MF, Saffran EM. Sensitivity to grammatical structure in so-called agrammatic aphasics. Cognition. 1983;13:361–392.
44. Boudreault P, Mayberry RI. Grammatical processing in American Sign Language: Age of first-language acquisition effects in relation to syntactic structure. Language Cogn Process. 2006;21:608–635.
45. Emmorey K. Language, cognition, and the brain: Insights from sign language research. Lawrence Erlbaum Associates; 2002.
46. Hauser PC, Lukomski J, Hillman T. Development of deaf and hard-of-hearing students' executive function. Deaf Cognition: Foundations and Outcomes. 2008:286–308.
47. Botting N, Jones A, Marshall C, Denmark T, Atkinson J, Morgan G. Nonverbal executive function is mediated by language: A study of deaf and hearing children. Child Develop. 2017;88:1689–1700.
48. Marshall C, Jones A, Denmark T, et al. Deaf children's non-verbal working memory is impacted by their language experience. Front Psychol. 2015;6:527.
49. Zatorre R, Belin P, Penhune VB. Structure and function of auditory cortex: Music and speech. Trends Cogn Sci. 2002;6:37–46.
50. Kaas JH, Hackett TA, Tramo MJ. Auditory processing in primate cerebral cortex. Curr Opin Neurobiol. 1999;9:164–170.
51. Monsell S. Task switching. Trends Cogn Sci. 2003;7:134–140.
52. Ravizza SM, Carter CS. Shifting set about task switching: Behavioral and neural evidence for distinct forms of cognitive flexibility. Neuropsychologia. 2008;46:2924–2935.
53. Miyake A, Friedman NP, Emerson MJ, Witzki AH, Howerter A, Wager TD. The unity and diversity of executive functions and their contributions to complex "frontal lobe" tasks: A latent variable analysis. Cogn Psychol. 2000;41:49–100.
54. Shiell MM, Champoux F, Zatorre RJ. The right hemisphere planum temporale supports enhanced visual motion detection ability in deaf people: Evidence from cortical thickness. Neural Plasticity. 2016;2016:7217630.
55. Corbetta M, Shulman GL. Control of goal-directed and stimulus-driven attention in the brain. Nat Rev Neurosci. 2002;3:201–215.
56. Geng JJ, Mangun GR. Right temporoparietal junction activation by a salient contextual cue facilitates target discrimination. NeuroImage. 2011;54:594–601.
57. Lemire-Rodger S, Lam J, Viviano JD, Stevens WD, Spreng RN, Turner GR. Inhibit, switch, and update: A within-subject fMRI investigation of executive control. Neuropsychologia. 2019;132:107134.
58. Qiao L, Zhang L, Chen A, Egner T. Dynamic trial-by-trial recoding of task-set representations in the frontoparietal cortex mediates behavioral flexibility. J Neurosci. 2017;37:11037–11050.
59. Engelhardt LE, Harden KP, Tucker-Drob EM, Church JA. The neural architecture of executive functions is established by middle childhood. NeuroImage. 2019;185:479–489.
60. Cazalis F, Valabregue R, Pélégrini-Issac M, Asloun S, Robbins TW, Granon S. Individual differences in prefrontal cortical activation on the Tower of London planning task: Implication for effortful processing. Eur J Neurosci. 2003;17:2219–2225.
61. Just MA, Carpenter PA, Keller TA, Eddy WF, Thulborn KR. Brain activation modulated by sentence comprehension. Science. 1996;274:114–116.
62. Figueras B, Edwards L, Langdon D. Executive function and language in deaf children. J Deaf Stud Deaf Educ. 2008;13:362–377.
63. Hall ML, Eigsti IM, Bortfeld H, Lillo-Martin D. Executive function in deaf children: Auditory access and language access. J Speech, Language, Hearing Res. 2018;61:1970–1988.
64. Hall ML, Eigsti IM, Bortfeld H, Lillo-Martin D. Auditory deprivation does not impair executive function, but language deprivation might: Evidence from a parent-report measure in deaf native signing children. The J Deaf Stud Deaf Educ. 2017;22:9–21.
65. Pellicano E. The development of executive function in autism. Autism Res Treat. 2012;2012:146132.
66. Zelazo PD, Müller U, Frye D, et al. The development of executive function in early childhood. Monogr Soc Res Child Dev. 2003;68:vii–137.
67. Freel BL, Clark MD, Anderson ML, Gilbert GL, Musyoka MM, Hauser PC. Deaf individuals' bilingual abilities: American Sign Language proficiency, reading skills, and family characteristics. Psychology. 2011;2:18–23.
68. Blanco-Elorrieta E, Pylkkänen L. Ecological validity in bilingualism research and the bilingual advantage. Trends Cogn Sci. 2018;22:1117–1126.
69. Prior A, Gollan TH. Good language-switchers are good task-switchers: Evidence from Spanish-English and Mandarin-English bilinguals. J Int Neuropsychol Soc. 2011;17:682–691.
70. Prior A, MacWhinney B. A bilingual advantage in task switching. Bilingualism: Language Cogn. 2010;13:253–262.
71. Nava E, Bottari D, Zampini M, Pavani F. Visual temporal order judgment in profoundly deaf individuals. Exp Brain Res. 2008;190:179–188.

Sensory experience modulates the reorganization of auditory regions for executive processing

Loading next page...
 
/lp/oxford-university-press/sensory-experience-modulates-the-reorganization-of-auditory-regions-Lo8sKZaA6k

References (84)

Publisher
Oxford University Press
Copyright
© The Author(s) 2022. Published by Oxford University Press on behalf of the Guarantors of Brain.
ISSN
0006-8950
eISSN
1460-2156
DOI
10.1093/brain/awac205

Abstract

https://doi.org/10.1093/brain/awac205 BRAIN 2022: 145; 3698–3710 | 3698

Sensory experience modulates the reorganization of auditory regions for executive processing

Barbara Manini,† Valeria Vinogradova,† Bencie Woll, Donnie Cameron, Martin Eimer and Velia Cardin

†These authors contributed equally to this work.

Crossmodal plasticity refers to the reorganization of sensory cortices in the absence of their typical main sensory input. Understanding this phenomenon provides insights into brain function and its potential for change and enhancement. Using functional MRI, we investigated how early deafness influences crossmodal plasticity and the organization of executive functions in the adult human brain. Deaf (n = 25; age: mean = 41.68, range = 19–66, SD = 14.38; 16 female, 9 male) and hearing (n = 20; age: mean = 37.50, range = 18–66, SD = 16.85; 15 female, 5 male) participants performed four visual tasks tapping into different components of executive processing: task switching, working memory, planning and inhibition. Our results show that deaf individuals specifically recruit 'auditory' regions during task switching. Neural activity in superior temporal regions, most significantly in the right hemisphere, is a good predictor of behavioural performance during task switching in the group of deaf individuals, highlighting the functional relevance of the observed cortical reorganization. Our results show executive processing in typically sensory regions, suggesting that the development and ultimate role of brain regions are influenced by perceptual environmental experience.
1 Deafness, Cognition and Language Research Centre and Department of Experimental Psychology, UCL, London WC1H 0PD, UK 2 School of Psychology, University of East Anglia, Norwich NR4 7TJ, UK 3 Norwich Medical School, University of East Anglia, Norwich NR4 7TJ, UK 4 Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK Correspondence to: Velia Cardin Deafness, Cognition and Language Research Centre and Department of Experimental Psychology UCL, London WC1H 0PD, UK E-mail: velia.cardin@ucl.ac.uk Keywords: deafness; executive function; auditory cortex Abbreviations: BSL= British Sign Language; BSLGJT= BSL grammaticality judgement task; EGJT= English grammaticality judgement task; HEF= higher executive function; HG= Heschl’s gyrus; LEF= lower executive function; pSTC= posterior superior temporal cortex; PT= planum temporale; ROI= region of interest individuals to understand the influence of early sensory experience Introduction on higher-order cognition and neural reorganization. Sensory systems feed and interact with all aspects of cognition. As Executive functions are higher-order cognitive processes re- such, it is likely that developmental sensory experience will affect sponsible for flexible and goal-directed behaviours, which have the organization of higher-order cognitive processes such as execu- been associated with activity in frontoparietal areas of the brain. tive functions. Here we studied executive processing in early deaf However, studies on deafness have shown reorganization for visual Received January 26, 2022. Revised April 20, 2022. Accepted May 20, 2022. Advance access publication June 2, 2022 © The Author(s) 2022. Published by Oxford University Press on behalf of the Guarantors of Brain. 
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which per- mits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited. Crossmodal plasticity in deaf adults BRAIN 2022: 145; 3698–3710 | 3699 working memory in regions typically considered to be part of the were approved by the ethics committee of the School of 2–5 auditory cortex. These working memory responses in auditory Psychology at the University of East Anglia and the Norfolk and regions suggest that, in the absence of early sensory stimulation, Norwich University Hospital Research and Development a sensory region can change its function as well as the perceptual department. 6,7 modality to which it responds. The adaptation of sensory brain Participants were recruited through public events, social media regions to processing information from a different sensory modal- and participant databases of the University College London 7–19 ity is known as crossmodal plasticity. In deaf individuals, cross- Deafness, Cognition and Language Research Centre and the modal plasticity often refers to responses to visual or University of East Anglia School of Psychology. Participants were somatosensory stimuli in regions of the superior temporal cortex all right-handed (self-reported), had full or corrected vision and that in hearing individuals are typically involved in processing no history of neurological conditions. All participants were com- 7–11,14–19 sounds. The common assumption here, and in general pensated for their time, travel and accommodation expenses. when referring to crossmodal plasticity, is that the auditory cortex will preserve its sensory processing function, but process a differ- General procedure ent type of sensory input. 
The presence of working memory re- sponses in the auditory regions of deaf individuals takes the Participants took part in one behavioural and one scanning session. concept of crossmodal plasticity further, suggesting that, in the ab- The sessions took place on the same or different days. sence of early auditory stimulation, there is a shift from sensory to The behavioural session included: (i) Standardized reasoning and working memory tests: the Block Design cognitive processing in such regions. If this is the case, it would sug- subtest of the Wechsler Abbreviated Scale of Intelligence and the gest that cortical functional specialization for sensory or cognitive Corsi Block-tapping test implemented in psychology experiment build- processing is partially driven by environmental sensory experi- ing language software (PEBL, http://pebl.sourceforge.net/). ence. The aim of our study is to elucidate the role of the auditory (ii) Language tasks: four tasks were administered to assess language profi- cortex of deaf individuals in executive functions, to understand ciency in English and BSL in deaf participants (see the ‘Language assess- how sensory experience impacts cognitive processing in the brain. ment’ section). Specifically, we tested whether the auditory regions of deaf indivi- (iii) Pre-scanning training: the training session ensured that participants duals are involved in cognitive control or whether they have a role understood the tasks and reached accuracy of at least 75%. The tasks in specific subcomponents of executive functions. were explained in the participant’s preferred language (English or BSL). To address our aims, we conducted a functional MRI experiment A written description of all the tasks was provided to all participants (deaf and hearing) to support the experimenter’s explanation. in deaf and hearing individuals. 
Participants performed tasks tap- (iv) Audiogram screening: pure-tone averages were used to measure the de- ping into different executive functions: switching, working mem- gree of deafness in deaf participants. Copies of audiograms were pro- ory, planning and inhibition. If the auditory cortex of deaf vided by the participants from their audiology clinics or were collected individuals has a role in cognitive control, we would expect all tasks at the time of testing using a Resonance R17 screening portable audiom- to recruit this region. However, if the auditory areas of deaf indivi- eter. Participants included in the study had a mean pure-tone average duals are involved in specific subcomponents of executive func- >75 dB averaged across the speech frequency range (0.5, 1, 2 kHz) in tioning, these regions will be differentially activated by each of both ears (mean= 93.66± 7.79 dB; range: 78.33–102.5 dB). Four partici- the tasks. If neural activity in the reorganized auditory cortex can pants did not provide their audiograms, but they were all congenitally se- predict behavioural performance in deaf individuals, this will cor- verely or profoundly deaf and communicated with the researchers using 20,21 roborate the functional significance of such plasticity effect. BSL or relying on lipreading. During the scanning session, functional MRI data were acquired while participants performed four visual executive function tasks Materials and methods on switching, working memory, planning and inhibition (see de- Participants tails next). The order of the tasks was counterbalanced across participants. There were two groups of participants (see demographics in Supplementary Tables 1 and 2). (i) Twenty-nine congenitally or early (before 3 years of age) Experimental design severely-to-profoundly deaf individuals whose first language is British Sign Language (BSL) and/or English (Supplementary Table 3). 
We recruited All tasks were designed so that each had one condition with higher a larger number of deaf participants to reflect the language variability of executive demands (higher executive function, HEF) and one with the deaf population in the UK, as discussed in the ‘Language assessment’ lower demands (lower executive function; LEF) (Fig. 1). section. Datasets from three deaf participants were excluded from all For the switching task, participants had to respond to the shape analyses due to excessive motion in the scanner. One participant was ex- 25,26 of geometric objects, i.e. a rectangle and a triangle (Fig. 1). At the cluded because they only had a mild hearing loss in their best ear (pure- beginning of the run, participants were instructed to press a key tone average <25 dB). In total, 25 deaf participants were included in the with their left hand when they saw a rectangle and with their right analysis of at least one executive function task (see Supplementary hand when they saw a triangle. Each block started with a cue indi- Table 4 for details on exclusion). cating that the task was to either keep the rule they used in the pre- (ii) Twenty hearing individuals who are native speakers of English with no knowledge of any sign language. vious block (‘stay’ trials; LEF) or to switch it (‘switch’ trials; HEF). In the switch trials, participants had to apply the opposite mapping Deaf and hearing participants were matched on age, between the shape and the response hand. Each block included gender, reasoning and visuospatial working memory span the presentation of the instruction cue (200 ms), a fixation cross (Supplementary Table 2). (500 ms) and two to five task trials. During each trial, a geometrical All participants gave written informed consent. 
All procedures shape (either a blue rectangle or a blue triangle) was shown at the followed the standards set by the Declaration of Helsinki and centre of the screen until the participant responded for a max of 3700 | BRAIN 2022: 145; 3698–3710 B. Manini et al. Figure 1 Executive function tasks. Each task had a higher executive demands condition (HEF, purple) and a lower executive demands condition (LEF, peach). See the ‘Materials and methods’ section for details of the design. 1500 ms. Visual feedback (500 ms) followed the participant’s re- hand when their choice was on the right. The maximum display sponse. There were 230 trials in 80 blocks of either the LEF (40) or time for each stimulus was 30 s. The duration of the inter-trial HEF (40) condition. The analysis for the HEF condition only included interval was jittered between 2000 and 3500 ms. There were 30 trials the first trial of the switch block (see ‘Statistical analysis of behav- in the Tower of London condition and 30 trials in the control ioural performance’ section). condition. 27,28 31 For the working memory task, we used a visuospatial task con- To study inhibitory control, we used Kelly and Milham’s ver- trasted with a perceptual control task (Fig. 1). A visual cue (1500 ms) in- sion of the classic Simon task (https://exhibits.stanford.edu/data/ dicated which task participants should perform. The cue was followed catalog/zs514nn4996). A square appeared on the left or the right bya3× 4 grid. Black squares were displayed two at a time at random side of the fixation cross. The colour of the squares was the relevant locations on the grid for 1000 ms, three times. In the HEF condition, aspect of the stimuli, with their position irrelevant for the task. participants were asked to memorize the six locations. Then they indi- Participants were instructed to respond to the red square with the cated their cumulative memory for these locations by choosing be- left hand and the green square with the right hand. 
In the congruent tween two grids in a two-alternative, forced-choice paradigm via a condition (LEF), the button press response was spatially congruent button press. The response grids were displayed until the participant with the location of the stimuli (e.g. the right-hand response for a responded, or for a maximum of 3750 ms. In the control condition square appearing on the right side of the screen) (Fig. 1). In the incon- (LEF), participants indicated whether a blue square was present in gruent condition (HEF), the correct answer was in the opposite loca- anyofthe grids, ignoring theconfiguration of the highlighted squares. tion in respect to the stimulus. Half of the trials were congruent, and Trials were separated by an inter-trial interval with duration jittered half were incongruent. Each stimulus was displayed for 700 ms, between 2000 and 3500 ms. Each experimental run had 30 working with a response window of up to 1500 ms. The inter-trial memory trials and 30 control trials. interval was 2500 ms for most trials, with additional blank intervals For the planning task, we used a computer version of the classic of 7.5 s (20), 12.5 s (2) and 30 s (1). Participants completed one or two 29,30 Tower of London task (Fig. 1). In each trial, two configurations of runs of this task, each consisting of a maximum of 200 trials. coloured beads placed on three vertical rods appeared on a grey screen, with the tallest rod containing up to three beads, the middle Statistical analysis of behavioural performance rod containing up to two beads and the shortest rod containing up to one bead. In the Tower of London condition (HEF), participants had Averaged accuracy (%correct) and reaction time were calculated. 
to determine the minimum number of moves needed to transform For each participants’ set of reaction times, we excluded outlier va- the starting configuration into the goal configuration following lues where participants responded too quickly or where they took a two rules: (i) only one bead can be moved at a time; and (ii) a bead long time to respond. We did this by calculating each participant’s cannot be moved when another bead is on top. There were four le- interquartile range separately, and then removing values that were vels of complexity, depending on the number of moves required >1.5 interquartile ranges below the first quartile or above the third (2, 3, 4 and 5). In the control condition (LEF), participants were asked quartile of the data series. Differences between groups on accuracy to count the number of yellow and blue beads in both displays. For or reaction time were investigated with repeated-measures both conditions, two numbers were displayed at the bottom of the ANOVAs with between-subjects factor group (hearing, deaf) and screen: one was the correct response and the other was incorrect within-subjects factor condition (LEF, HEF). by +1 or −1. Participants answered with their left hand when they In the switching task, the accuracy switch cost (SwitchCost ) ACC chose the number on the left side of the screen, and with their right was calculated as the difference in the percentage of errors (% Crossmodal plasticity in deaf adults BRAIN 2022: 145; 3698–3710 | 3701 errors) between the first switch trial of a switch block and all stay obtained during this step was used for normalization of the function- trials. reaction time switch cost (SwitchCost ) was calculated as al scans. Susceptibility distortions in the echo-planar images were RT the difference in reaction time between the first switch trial of a estimated using a field map that was coregistered to the blood oxy- 33,34 switch block and all stay trials. genation level-dependent (BOLD) reference. 
Images were rea- In the inhibition task, the Simon effect was calculated as the dif- ligned using the precalculated phase map, coregistered, slice-time ference in %errors or reaction time between the incongruent and corrected, normalized and smoothed (using an 8-mm full-width at congruent trials. half-maximum Gaussian kernel). All functional scans were checked for motion and artefacts using the ART toolbox (https://www.nitrc. org/projects/artifact_detect). Image acquisition Images were acquired at the Norfolk and Norwich University Functional MRI first-level analysis Hospital in Norwich, UK, using a 3 T wide bore GE 750W MRI scanner and a 64-channel head coil. Communication with the deaf partici- The first-level analysis was conducted by fitting a general linear pants occurred in BSL through a close-circuit camera, or through model with regressors of interest for each task (see details next). written English through the screen. An intercom was used for com- All the events were modelled as a boxcar and convolved with munication with hearing participants. All volunteers were given SPM’s canonical haemodynamic response function. The motion ear protectors. Stimuli were presented with PsychoPy software parameters, derived from the realignment of the images, were (https://psychopy.org) through a laptop (MacBook Pro, Retina, added as regressors of no interest. Regressors were entered into a 15-inch, Mid 2015). All stimuli were projected by an AVOTEC’s multiple regression analysis to generate parameter estimates for Silent Vision projector (https://www.avotecinc.com/high- each regressor at every voxel. resolution-projector) onto a screen located at the back of the mag- For the switching task, the first trial of each switch block (HEF) net’s bore. Participants watched the screen through a mirror and all stay trials (LEF) were modelled as regressors of interest sep- mounted on the head coil. Button responses were recorded via arately for the left- and right-hand responses. 
The cues and the re- fibre-optic response pad button boxes (https://www.crsltd.com/ maining switch trials were included as regressors of no interest. tools-for-functional-imaging/mr-safe-response-devices/forp/). For the working memory task, the conditions of interest were work- Functional imaging data were acquired using a gradient-recalled ing memory (HEF) and control (LEF). The onset was set at the presenta- echo planar imaging sequence [50 slices, repetition time (TR)= tion of the first grid, with the duration set at 3.5 s (i.e. the duration of 3000 ms, echo time (TE)= 50 ms, field of view (FOV)= 192× the three grids plus a 500 ms blank screen before the appearance of 192 mm, 2 mm slice thickness, distance factor 50%] with an in- the response screen; Fig. 1). Button responses were included separately plane resolution of 3 × 3 mm. The protocol included six functional for each hand and condition as regressors of no interest. scans: five task-based functional MRI scans (switching: 10.5 min, For the planning task, the Tower of London (HEF) and control 210 volumes; working memory: 11 min, 220 volumes; planning: (LEF) conditions were included in the model as regressors of inter- 11.5 min, 230 volumes; inhibition: two runs of 10 min, 200 volumes est, with onsets at the beginning of each trial and duration set to the each) and one resting state scan (part of a different project, and to trial-specific reaction time. Button responses were modelled separ- be reported in a different paper). Some participants did not com- ately for each hand as regressors of no interest. plete all functional scans (Supplementary Table 4). An anatomical For the inhibition task, four regressors of interest were obtained T -weighted scan [IR-FSPGR, inversion time= 400 ms, 1 mm slice by combining the visual hemifield where the stimulus appeared thickness] with an in-plane resolution of 1 × 1 mm was acquired with the response hand [(i) right visual hemifield—left hand; (ii) during the session. 
left visual hemifield—right hand; (iii) right visual hemifield—right Raw B0 field map data were acquired using a 2D multi-echo hand; (iv) left visual hemifield—left hand]. Right visual hemifield— gradient-recalled echo sequence with the following parameters: TR = left hand and left visual hemifield—right hand were the incongruent 700 ms, TE= 4.4 and 6.9 ms, flip angle= 50°, matrix size= 128 × 128, conditions (HEF), whereas the right visual hemifield-right hand and FOV= 240 mm × 240 mm, number of slices= 59, thickness= 2.5 mm left visual hemifield-left hand were the congruent conditions (LEF). and gap= 2.5 mm. Real and imaginary images were reconstructed 33–35 for each TE to permit calculation of B0 field maps in Hz. Region of interest analysis We conducted a region of interest (ROI) analysis to investigate Functional MRI preprocessing crossmodal plasticity and differences between groups in the audi- Functional MRI data were analysed with MATLAB 2018a tory cortex. Three auditory regions of the superior temporal cortex (MathWorks, MA, USA) and Statistical Parametric Mapping soft- were included in this analysis: Heschl’s gyrus (HG), the planum ware (SPM12; Wellcome Trust Centre for Neuroimaging, London, temporale (PT) and the posterior superior temporal cortex (pSTC) UK). The anatomical scans were segmented into different tissue (Fig. 2). HG and the PT were defined anatomically, using classes: grey matter, white matter and CSF. Skull-stripped anatom- FreeSurfer software (https://surger.nmr.mgh.harvard.edu). Full 38,39 ical images were created by combining the segmented images using descriptions of these procedures can be found elsewhere, but, the Image Calculation function in SPM (ImCalc, http://tools. in short, each participant’s bias-corrected anatomical scan was robjellis.net). 
The expression used was: {[i1.*(i2 + i3 + i4)]> thresh- parcellated and segmented, and voxels with the HG label and the old}, where i1 was the bias-corrected anatomical scan and i2, i3 PT label were exported using SPM’s ImCalc function (http:// and i4 were the tissue images (grey matter, white matter and CSF, robjellis.net/tools/imcalc_documentation.pdf). Participant-specific respectively). The threshold was adjusted between 0.5 and 0.9 to ROIs were then normalized to the standard MNI space using the de- achieve adequate brain extraction for each participant. Each parti- formation field from the normalization step of the preprocessing. cipant’s skull-stripped image was normalized to the standard MNI The pSTC was specified following findings from Cardin et al. space (Montreal Neurological Institute) and the deformation field where a visual working memory crossmodal plasticity effect was 3702 | BRAIN 2022: 145; 3698–3710 B. Manini et al. found in right and left pSTC in deaf individuals [left: −59, −37, 10; were analysed using a backward data entry method in JASP. The right: 56, −28, −1]. Right and left functional pSTC ROIs were defined default stepping method criteria were used, where predictors with using data from Cardin et al., with the contrast [deaf (working mem- P < 0.05 are entered into the model and those with P> 0.1 are re- ory> control task)> hearing (working memory> control task)] (P< moved until all predictors fall within these criteria. SwitchCost RT 0.005, uncorrected). and SwitchCost were entered as dependent variables in separate ACC There was an average partial overlap of 8.2 voxels (SD = 6.86) be- analyses. Each regression analysis had three covariates: neural tween left PT and left pSTC, with no significant difference in overlap switch cost in the right hemisphere, neural switch cost in the left between groups (deaf: mean= 9.92, SD= 7.02; hearing: mean= 6.05, hemisphere and language. SD= 6.17). 
To ensure that the two ROIs were independent, common Neural switch cost (BOLD − BOLD ) was calculated in switch stay voxels were removed from left PT in a subject-specific manner. ROIs with significant differences between the switch and stay con- Removing the overlapping voxels did not qualitatively change the dition in the deaf group. The average neural activity in all stay trials results. (BOLD ) was subtracted from the average activity in the first stay Parameter estimates for each participant were extracted from switch trials (BOLD ), and then averaged across ROIs separately switch each ROI using MarsBaR v.0.44 (http://marsbar.sourceforge.net). in the right and left hemisphere. The data were analysed using JASP (https://jasp-stats.org) and en- tered into separate mixed repeated measures ANOVAs for each Data availability task and set of ROIs. Factors in the ANOVAs on the temporal ROIs The data and analysis files for this paper can be found at https://osf. included: the between-subjects factor group (hearing, deaf) and io/uh2ap/. the within-subjects factors ROI (HG, PT, pSTC), hemisphere (left, right) and condition (LEF, HEF). The Greenhouse–Geisser correction was applied when the Results assumption of sphericity was violated. Significant interactions Behavioural results and effects of interest were explored with Student’s t-tests or Mann–Whitney U-tests when the equal variance assumption Deaf (n= 25) and hearing (n= 20) individuals were scanned while was violated. performing four executive function tasks: switching, working memory, planning and inhibition (Fig. 1). Behavioural results from all tasks are shown in Fig. 3. 
To explore differences in perform- Language assessment ance between groups, we conducted 2 × 2 repeated-measures We recruited a representative group of the British deaf population, ANOVAs for each task, with either accuracy or reaction time as who usually have different levels of proficiency in sign and spoken the dependent variable, between-subjects factor group (hearing, language. This was: (i) to study plasticity in a representative group deaf) and within-subjects factor condition (HEF, LEF). Results of deaf individuals; and (ii) to study the relationship between lan- show a significant main effect of condition for both accuracy and guage experience and the organization of cognitive networks of reaction time in all tasks, confirming that the HEF condition was the brain, which will be reported in a separate paper. more difficult and demanding than the LEF condition To assess the language proficiency of deaf participants, we chose (Supplementary Table 5). grammaticality judgement tests measuring language skills in The group of deaf individuals had significantly slower reaction English and BSL. The BSL grammaticality judgement task (BSLGJT) times in all tasks (Supplementary Table 5). Switching was the is described in Cormier et al. and the English grammaticality only task where there was a significant main effect of group on ac- judgement task (EGJT) was designed based on examples from curacy [F(1,41) = 4.32, P= 0.04, η = 0.09], as well as a Condition× Linebarger et al. The BSLGJT and the EGJT use a single method of Group interaction [F(1,41) = 4.98, P= 0.03, η = 0.11]. A post hoc t-test assessing grammaticality judgements of different syntactic struc- revealed a significant between-groups difference, where the group tures in English and BSL. 
Grammaticality judgement tests have of deaf individuals was significantly less accurate than the group of been used in deaf participants before and have proved to be efficient hearing individuals in the switch condition [t(41)=−2.22, P= in detecting differences in language proficiency among participants 0.03, Cohen’s d = 0.68]. The difference in SwitchCost (% ACC 42,44 with varying ages of acquisition. Deaf participants performed errors − %errors )reflects the significant interaction, with switch stay both the BSL and English tests if they knew both languages, or the deaf group [mean = 10.24, SD= 9.89, t(22) = 4.96, P < 0.001, d= only the English tests if they did not know BSL. 1.03] having a larger SwitchCost than the hearing group [mean ACC To control for potential language proficiency effects, we com- = 4.18; SD= 7.53, t(19) = 2.49, P = 0.02, d = 0.56; Fig. 3A]. bined results from the EGJT and BSLGJT to create a single, modality-independent measure of language proficiency in the Functional MRI results deaf group. Accuracy scores in the EGJT (%correct; mean= 83.51, SD= 11.4, n= 25) and BSLGJT (mean = 77.88, SD= 13.1, n= 21) were Functional MRI results show that all executive function tasks ac- transformed into z-scores separately for each test. For each partici- tivated typical frontoparietal regionsinbothgroupsofpartici- pant, the EGJT and BSLGJT z-scores were then compared, and the pants (Supplementary Fig. 2). Thereweresignificantly stronger higher one was chosen for a combined modality-independent lan- activations in the HEF condition in the switching, working mem- guage proficiency score (Supplementary Fig. 1). ory and planning tasks. These included commonly found activa- tions in frontoparietal areas, such as dorsolateral prefrontal cortex, frontal eye fields, presupplementary motor area and in- Multiple linear regression traparietal sulcus. 
In the inhibition task, the HEF incongruent Multiple linear regression analyses were conducted to investigate condition resulted in stronger activation in intraparietal sulcus whether neural activity in the superior temporal cortex of deaf indi- and left frontal eye fields, but there were no significant differ- viduals can predict performance in the switching task. The data ences between conditions. Crossmodal plasticity in deaf adults BRAIN 2022: 145; 3698–3710 | 3703 Figure 2 Temporal ROIs. Temporal regions included in the analysis: HG, PT and superior temporal cortex (pSTC). HG and PT were defined anatomically, in a subject-specific manner, using the FreeSurfer software package. The figure shows the overlap of all subject-specific ROIs. Common voxels be- tween left PT and left pSTC have been subtracted from left PT (see ‘Materials and methods’ section). The pSTC was defined functionally, based on the findings of Cardin et al. (see the ‘Materials and methods’ section). To investigate crossmodal plasticity and executive processing in during the switching task. We conducted two separate multiple lin- the auditory cortex of deaf individuals, we conducted a ROI analysis ear regression analyses, one with SwitchCost and one with RT on superior temporal auditory ROIs. These included: HG, the PT and SwitchCost as dependent variables (Table 2). The covariates in- ACC the pSTC (Fig. 2). Differences and interactions between groups are cluded in the model were: right hemisphere neural switch cost, left discussed next, and we first present results from the switching hemisphere neural switch cost and language z-scores. For the neur- task where we observed the strongest activations of temporal al swich cost covariates, data was averaged from ROIs in the right ROIs in the deaf group (Fig. 4). Results from all other tasks are dis- and left hemisphere to reduce the number of dimensions in the cussed in the following subsection. multiple linear regression models. 
To do this, we calculated the neural switch cost (BOLD − BOLD ) for each ROI with signifi- switch stay cant differences in activity between the switch and stay conditions Task switching activates auditory areas in deaf in the deaf group (Fig. 4 and Supplementary Table 6), and we then individuals and this activation predicts behaviour averaged neural switch cost separately for ROIs in the right and Of the four tasks that we tested, only in the switching task we found left hemisphere. We also included language as a covariate in our both a significant main effect of group [F(1,41)= 15.48, P < 0.001, η = models because language proficiency has been shown to modulate 45–48 0.27] and a significant interaction between Group × Condition performance in executive function tasks in deaf individuals. [F(1,41) = 4.75, P= 0.03, η = 0.10] (Table 1). The interaction was dri- Results from the multiple linear regression analysis using back- ven by a significant difference between conditions in the deaf ward data entry show that neural activity in temporal ROIs can sig- group, but not in the hearing group [deaf : t(22) = 4.06, P= nificantly predict SwitchCost in the deaf group (Table 2). The HEFvLEF RT <0.001, d = 0.85; hearing : t(19)= 0.26, P = 0.79, d = 0.06]. To most significant model included both right and left hemisphere HEFvLEF test whether differences between conditions were significant be- neural switch cost as covariates, and explained 40.6% of the 2 2 tween the switch and stay condition in all ROIs, we conducted variance [F(2,18) = 6.15, P= 0.009, R = 0.41, adjusted R = 0.34; post hoc t-tests in each ROI and group. This accounted for a total Table 2, top section]. There was a positive association between of 12 separate t-tests, and to correct for multiple comparisons, we SwitchCost and neural switch cost in right hemisphere temporal RT only considered significant those results with P < 0.004 (P < 0.05/12 areas (B= 0.04, SE= 0.01, β= 0.99; P= 0.003). 
This means that for every unit increase in neural switch cost in right temporal areas, there is an increase of 40 ms in SwitchCost_RT. In standardized terms, as neural switch cost increases by 1 SD, SwitchCost_RT increases by 0.99 SDs. On the other hand, there was a negative association between left-hemisphere neural switch cost and SwitchCost_RT; however, this was only significant in the full model (B = −0.02, SE = 0.01, β = −0.69; P = 0.031), not in the best model (B = −0.02, SE = 0.01, β = −0.61; P = 0.05; Table 2). There was no significant association between SwitchCost_RT and language (B = −0.06, SE = 0.05, β = −0.23; P = 0.22).

When evaluating whether neural switch cost could also predict SwitchCost_ACC, we found no significant association between these variables (Table 2, bottom section). Instead, the most significant model included only language as a regressor (Table 2), explaining 20.7% of the variance [F(1,19) = 4.96, P = 0.04, R² = 0.21, adjusted R² = 0.16]. For every unit increase in language z-scores, there is a decrease of 12.6 units in SwitchCost_ACC. In standardized terms, as language z-scores increased by 1 SD, SwitchCost_ACC decreased by 0.45 SDs.

Figure 3 Behavioural performance. The figure shows average accuracy (%correct) and reaction time (RT, in seconds) for each task and condition in the hearing and the deaf groups. It also shows the average switch costs and Simon effects for both accuracy and reaction time in each group. The SwitchCost_ACC and Simon effect are calculated and plotted using %error instead of %correct, so that larger values indicate an increase in cost. Only the first trials of the switch blocks were included in the HEF condition. The bold lines in the box plots indicate the median. The lower and upper hinges correspond to the first and third quartiles. Statistically significant (P < 0.05) differences between conditions are not shown in the figure, but were found for all tasks in both groups (Supplementary Table 5). **P < 0.01; *P < 0.05.

Recruitment of auditory areas in deaf individuals is not ubiquitous across executive function tasks

Results from the working memory, planning and inhibition tasks are shown in Fig. 5. In the working memory task, there was a significant Condition × Group interaction [Table 1; F(1,41) = 6.41, P = 0.01, η² = 0.13], but differences between conditions within each group were not significant [hearing_HEFvLEF: t(18) = −1.74, P = 0.10, d = −0.40; deaf_HEFvLEF: t(23) = 1.81, P = 0.08, d = 0.37]. In the planning task, there was a significant main effect of group [F(1,38) = 5.85, P = 0.02, η² = 0.13], but this was driven by significant deactivations in the hearing group [t(18) = −4.47, P < 0.001, d = −1.00], with no significant difference in activity from baseline in the deaf group [t(20) = −1.31, P = 0.21, d = −0.29]. In the inhibition task, there was a significant ROI × Group interaction [F(1.89,66.05) = 3.92, P = 0.03, η² = 0.10]. However, there were no significant differences between groups in any ROI (https://osf.io/9fuec). Instead, the ROI × Group interaction was driven by a main effect of ROI in the deaf group (higher activations for PT and pSTC than HG, https://osf.io/2z35e/), which was not present in the hearing group (https://osf.io/gmy6v/).

Figure 4 Switching task analysis. (A) Neural activity in temporal ROIs. ***P < 0.005; ****P < 0.001. (B) Partial correlation plot between SwitchCost_RT and neural switch cost in right temporal ROIs in the group of deaf individuals. Partial correlation from a multiple linear model with SwitchCost_RT as the dependent variable and the following covariates: right-hemisphere neural switch cost, left-hemisphere neural switch cost, and language.

Discussion

We investigated how early sensory experience impacts the organization of executive processing in the brain. We found that, in deaf individuals, primary and secondary auditory areas are recruited during a visual switching task. These results indicate that the sensory or cognitive specialization of cortical regions in the adult brain can be influenced by developmental sensory experience. It is possible that an early absence of auditory inputs results in a shift of functions in regions typically involved in auditory processing, with these regions then adopting a role in specific components of executive processing. Neural activity in temporal regions during the switching task predicted performance in deaf individuals, highlighting the behavioural relevance of this functional shift.

Our design allowed us to thoroughly examine the role of auditory regions in different executive function tasks and determine whether these regions are involved in cognitive control. Previous studies have suggested an involvement of auditory cortex during higher-order cognitive tasks in deaf individuals,^4,5 but given the focus on a single task, with an experimental and a control condition, they cannot inform whether plasticity effects are specific to the demands of the task. Our design included four different visuospatial executive function tasks, all with an experimental (HEF) and a control (LEF) condition, probing a variety of executive processes. We found that the HEF condition in all tasks recruited frontoparietal areas typically involved in executive functioning and cognitive control. However, only switching resulted in significant activations in temporal auditory regions in the deaf group. This finding demonstrates that the auditory cortex of deaf individuals serves a specific subcomponent of executive functioning during switching, and not a computation shared across tasks, such as cognitive control. This was not only found in higher-order auditory areas, but also in the left HG, showing that a functional shift towards cognition can indeed occur in primary sensory regions. A significant activation during the switching condition in the left, but not the right, HG provides further evidence for different roles of left and right temporal regions in deaf individuals (see Cardin et al. for a review). Differences in the recruitment of the left and right HG in this study may be linked to the specialization of these regions for sound processing in hearing individuals. In this group, left HG is specialized for the temporal processing of auditory signals, whereas the right HG shows stronger sensitivity to spectral components. The switching task in this study requires tracking a sequence of stimuli in time, while the extraction of spectral or frequency information is not needed, which could explain the different recruitment of HG across hemispheres. The fact that the right HG was not recruited during the switching task, while right PT and pSTC were, also suggests a functional difference in crossmodal plasticity between primary and secondary auditory regions.

Table 1 Group main effects and group interactions for all tasks in the ROI analysis

                      Switching               Working memory          Planning                Inhibition
                      F (d.f.)           P    F (d.f.)           P    F (d.f.)           P    F (d.f.)           P
Group                 15.48 (1,41)  <0.001    0.04 (1,41)     0.85    5.85 (1,38)     0.02    0.03 (1,35)     0.87
Condition × Group     4.75 (1,41)     0.03    6.40 (1,41)     0.01    0.56 (1,38)     0.46    0.18 (1,35)     0.67
ROI × Group           3.42 (1.9,79.1) 0.04    1.18 (1.7,68.4) 0.30    0.73 (1.7,64.6) 0.46    3.92 (1.9,66.1) 0.03
Hemisphere × Group    0.009 (1,41)    0.92    0.01 (1,41)     0.93    0.46 (1,38)     0.50    0.30 (1,35)     0.59

Significant results are indicated in bold. Full results for each ANOVA can be found in OSF: https://osf.io/dt827/.

Table 2 Multiple linear regression predicting behavioural performance in the switching task

SwitchCost_RT

Model summary
Model   R²     Adjusted R²   F      P
1       0.46   0.36          4.78   0.01
2       0.41   0.34          6.15   0.009

Coefficients
Model                         B (unstandardized)   SE     β (standardized)   t       P
1   (Intercept)               0.03                 0.04                      0.63    0.53
    Language score            −0.06                0.05   −0.23              −1.27   0.22
    LH neural switch cost     −0.02                0.01   −0.69              −2.35   0.03
    RH neural switch cost     0.04                 0.01   1.05               3.60    0.002
2   (Intercept)               −0.01                0.03                      −0.51   0.62
    LH neural switch cost     −0.02                0.01   −0.61              −2.09   0.05
    RH neural switch cost     0.04                 0.01   0.99               3.39    0.003

SwitchCost_ACC

Model summary
Model   R²     Adjusted R²   F      P
1       0.28   0.16          2.26   0.12
2       0.28   0.20          3.55   0.05
3       0.21   0.16          4.96   0.04

Coefficients
Model                         B (unstandardized)   SE     β (standardized)   t       P
1   (Intercept)               15.84                4.82                      3.28    0.004
    Language score            −12.85               5.83   −0.46              −2.20   0.04
    LH neural switch cost     −0.28                1.25   −0.08              −0.22   0.82
    RH neural switch cost     1.41                 1.41   0.33               1.00    0.33
2   (Intercept)               15.78                4.69                      3.37    0.003
    Language score            −12.57               5.55   −0.45              −2.27   0.04
    RH neural switch cost     1.16                 0.84   0.27               1.38    0.18
3   (Intercept)               18.90                4.20                      4.50    <0.001
    Language score            −12.64               5.68   −0.45              −2.23   0.04

Significant results are indicated in bold. LH = left hemisphere; RH = right hemisphere; SE = standard error.
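Table 2 reports both unstandardized (B) and standardized (β) coefficients, e.g. B = 0.04 (40 ms of SwitchCost_RT per unit of neural switch cost) alongside β = 0.99 (SDs per SD). For OLS the two are related by a simple rescaling, β = B · SD(x)/SD(y). A minimal sketch with synthetic, illustrative data (names and effect sizes are assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(0.0, 2.0, n)              # predictor, e.g. neural switch cost (arbitrary units)
y = 0.04 * x + rng.normal(0.0, 0.05, n)  # outcome, e.g. SwitchCost_RT in seconds

def slope(x, y):
    """OLS slope (single predictor, with intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

B = slope(x, y)                          # unstandardized: outcome units per predictor unit
z = lambda v: (v - v.mean()) / v.std()
beta = slope(z(x), z(y))                 # standardized: SDs of y per SD of x
# beta equals B rescaled by the SD ratio of predictor to outcome
assert np.isclose(beta, B * x.std() / y.std())
```

The same per-coefficient rescaling holds in the multiple-predictor case, which is why a single fit yields both coefficient columns of Table 2.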
Primary auditory regions are the first cortical relay of auditory inputs and have stronger subcortical inputs from the thalamus, while secondary regions might be more likely to be modulated by top-down influences, potentially driving plastic reorganization in different directions. Further studies focusing on finer-grain mapping of crossmodal plasticity effects in the auditory cortex of deaf individuals are needed to elucidate these processes.

Task switching requires cognitive flexibility and shifting between different sets of rules.^51,52 Shifting is considered one of the core components of executive control. It is defined as the ability to flexibly shift 'back and forth between multiple tasks, operations or mental sets'. Shifting is also an important component of working memory tasks previously shown to recruit posterior superior temporal regions in deaf individuals (e.g. two-back working memory, visuospatial delayed recognition^4,5). In the present study, the working memory task did not significantly activate any temporal ROIs. The working memory task used in this study requires updating of information and incremental storage, but no shifting between targets or internal representations of stimuli, as required in an n-back task. Together, these results suggest that previous working memory effects in superior temporal regions are not necessarily linked to storage, updating or control, but are more likely linked to shifting between tasks or mental states.

Figure 5 ROI results from the working memory (A), planning (B) and inhibition (C) tasks. Ctr = control; WM = working memory; ToL = Tower of London; Con = congruent; Inc = incongruent.

A change of function in the auditory cortex, specifically in the right hemisphere, could be explained by the anatomical proximity to the middle temporal lobe or to the parietal lobe, specifically the temporoparietal junction (TPJ).^7,54 Right TPJ is a multisensory associative region involved in reorientation of attention to task-relevant information, such as contextual cues or target stimuli.^55,56 Regions of the right middle temporal gyrus have also been shown to be involved in task switching and to encode task-set representations. In the absence of auditory inputs throughout development, the proximity to the TPJ and the middle temporal gyrus may result in changes in the microcircuitry or in the computations performed by the adjacent auditory cortices, where these regions now perform computations that allow switching between tasks.^7,54,58 This is particularly relevant for the right hemisphere, where activity in auditory regions was more strongly linked to behavioural outcomes in the switching task in the group of deaf individuals.

Another possibility is that the recruitment of 'auditory' temporal regions for switching observed in deaf adults reflects vestigial functional organization present in early stages of development. Research on hearing children has found activations in bilateral occipital and superior temporal cortices during task switching, with a similar anatomical distribution to the one we find here. Our findings in deaf individuals suggest that executive processing in temporal cortices could be 'displaced' by persistent auditory inputs that, as the individual develops, may require more refined processing or demanding computations. Thus, an alternative view is that regions considered to be 'sensory' have mixed functions in infants and become more specialized in adults. These regions could follow different developmental pathways influenced by environmental sensory experience. As such, the temporal regions of hearing individuals will become progressively more specialized for sound processing, whereas, in deaf individuals, they will become more specialized for subcomponents of executive processing.

The direct relationship between behavioural outcomes and activity in reorganized cortical areas is robust evidence of the functional importance of the observed crossmodal plasticity. We found that neural activity, specifically in the right temporal ROI, predicted reaction times in the switching task in the deaf group. Specifically, higher neural switch cost was linked to higher reaction time switch cost (SwitchCost_RT), which suggests effortful processing, as previously described in other cognitive tasks with different levels of complexity.^60,61 It is important to highlight that there were no differences in SwitchCost_RT between the groups, showing that the potential reliance on different neural substrates to solve the switching task does not translate into differences in performance. In fact, significant interactions between group and condition for the switching task were only found in accuracy (SwitchCost_ACC), which in our analysis was not predicted by neural activity, but rather by language proficiency. Executive performance has been previously associated with language proficiency in deaf children.^47,48,62–64 While in our study language z-scores predict only 20.7% of the variance in SwitchCost_ACC and the model was only significant at P < 0.05, our findings suggest that language development can have long-lasting effects on executive processing throughout the lifespan. Different theories propose that language can provide the necessary framework for higher-order (if–if–then) rules to develop and be used in a dynamic task in the most efficient way.^65,66 These hierarchical 'if–then' rules could be implemented, in an automatic way, to solve the arbitrary link between stimulus and response during switching. Although participants are not required to use linguistic strategies during switching, we speculate that those who have benefited from the efficiency associated with developing such frameworks can invest fewer cognitive resources into solving this task. While the role of language in executive processing needs further investigation, it is important to consider that the timely development of a first language may boost the overall efficiency of a cognitive task, in this case switching, regardless of whether the task itself allows implementation of purely linguistic mechanisms.

It is important to take into account that all signers of BSL are bilingual to a greater or lesser degree, depending on their early language background, degrees of deafness and educational experiences. Bilinguals who frequently change languages have generally been shown to have an advantage in executive function switching tasks.^68–70 However, it is unlikely that differences in bilingualism can explain our findings in this study. If different results between deaf and hearing participants were due to the presence or not of bilingualism, we would have expected the group of deaf individuals to have a behavioural advantage in the switching task, but that was the opposite of what we found. In addition, we have previously shown that working memory responses in the superior temporal cortex of deaf individuals cannot be explained by bilingualism. In our previous study, we compared deaf native signers to two groups of hearing individuals: (i) hearing native signers, who were bilingual in English and BSL (bimodal bilinguals); and (ii) hearing non-signers, who were bilingual in English and another spoken language (unimodal bilinguals). These three populations were comparably proficient in both their languages. We found differences in the recruitment of superior temporal regions between deaf individuals and both groups of hearing participants during a working memory task, suggesting a crossmodal plasticity effect driven by different sensory experience. These effects in the superior temporal cortex could not be explained by bilingualism, because this was controlled across groups. In the present study, significant activations during the switching condition were found in the same areas where we previously found working memory activations in deaf individuals (left and right pSTC, which were defined functionally based on our previous findings; see the 'Materials and methods' section), suggesting that these regions are involved in specific subcomponents of executive processing as a consequence of early deafness.

In addition, as a group, deaf participants had significantly longer reaction times in all tasks. This is at odds with behavioural results from studies of deaf native signers, where the performance of this group in executive function tasks is comparable to or faster than that of typically hearing individuals (e.g. Hauser et al.,^46 Marshall et al.^48 and Cardin et al.). Native signers achieve language development milestones at the same rate as that of hearing individuals learning a spoken language, highlighting again the importance of early language access, not only for communication but also for executive processing. Deaf individuals also have faster reaction times in studies of visual reactivity,^21,71 suggesting critical differences in performance between purely perceptual tasks and those which weigh more strongly on executive demands, where language experience and early language acquisition could have a longer-lasting effect throughout the lifespan.

In conclusion, we show that components of executive processing, such as switching, can be influenced by early sensory experience. Our results suggest that, in the absence of auditory inputs, superior temporal regions can take on functions other than sensory processing. This could be either by preserving a function these areas performed early in childhood or by taking on new functions driven by influences from top-down projections from frontoparietal areas or adjacent temporal and parietal regions.

Acknowledgements

The authors would like to specially thank all the deaf and hearing participants who took part in this study.

Funding

This work was funded by a grant from the Biotechnology and Biological Sciences Research Council (BBSRC; BB/P019994). V.V. is funded by a scholarship from the University of East Anglia.

Competing interests

The authors report no competing interests.

Supplementary material

Supplementary material is available at Brain online.

References

1. D'Esposito M, Grossman M. The physiological basis of executive function and working memory. The Neuroscientist. 1996;2:345–352.
2. Andin J, Holmer E, Schönström K, Rudner M. Working memory for signs with poor visual resolution: fMRI evidence of reorganization of auditory cortex in deaf signers. Cereb Cortex. 2021;31:3165–3176.
3. Buchsbaum B, Pickell B, Love T, Hatrak M, Bellugi U, Hickok G. Neural substrates for verbal working memory in deaf signers: fMRI study and lesion case report. Brain Language. 2005;95:265–.
4. Cardin V, Rudner M, de Oliveira RF, et al. The organization of working memory networks is shaped by early sensory experience. Cerebral Cortex. 2018;28:3540–3554.
5. Ding H, Qin W, Liang M, et al. Cross-modal activation of auditory regions during visuo-spatial working memory in early deafness. Brain. 2015;138:2750–2765.
6. Bedny M. Evidence from blindness for a cognitively pluripotent cortex. Trends Cogn Sci. 2017;21:637–648.
7. Cardin V, Grin K, Vinogradova V, Manini B. Crossmodal reorganisation in deafness: Mechanisms for functional preservation and functional change. Neurosci Biobehav Rev. 2020;113:227–237.
8. Bola Ł, Zimmermann M, Mostowski P, et al. Task-specific reorganization of the auditory cortex in deaf humans. Proc Natl Acad Sci USA. 2017;114:E600–E609.
9. Corina DP, Blau S, LaMarr T, Lawyer LA, Coffey-Corina S. Auditory and visual electrophysiology of deaf children with cochlear implants: Implications for cross-modal plasticity. Front Psychol. 2017;8:59.
10. Simon M, Campbell E, Genest F, MacLean MW, Champoux F, Lepore F. The impact of early deafness on brain plasticity: A systematic review of the white and gray matter changes. Front Neurosci. 2020;14:206.
11. Lomber SG, Meredith MA, Kral A. Cross-modal plasticity in specific auditory cortices underlies visual compensations in the deaf. Nat Neurosci. 2010;13:1421.
12. Twomey T, Waters D, Price CJ, Evans S, MacSweeney M. How auditory experience differentially influences the function of left and right superior temporal cortices. J Neurosci. 2017;37:9564–9573.
13. Cardin V, Campbell R, MacSweeney M, Holmer E, Rönnberg J, Rudner M. Neurobiological insights from the study of deafness and sign language. In: Morgan G, ed. Understanding deafness, language and cognitive development. Essays in Honour of Bencie Woll. Vol. 25. John Benjamins Publishing Company; 2020:159–181.
14. Frasnelli J, Collignon O, Voss P, Lepore F. Crossmodal plasticity in sensory loss. Progr Brain Res. 2011;191:233–249.
15. Heimler B, Striem-Amit E, Amedi A. Origins of task-specific sensory-independent organization in the visual and auditory brain: Neuroscience evidence, open questions and clinical implications. Curr Opin Neurobiol. 2015;35:169–177.
16. Kral A. Unimodal and cross-modal plasticity in the 'deaf' auditory cortex. Int J Audiol. 2007;46:479–493.
17. Merabet LB, Pascual-Leone A. Neural reorganization following sensory loss: The opportunity of change. Nat Rev Neurosci. 2010;11:44–52.
18. Ricciardi E, Bottari D, Ptito M, Roder B, Pietrini P. The sensory-deprived brain as a unique tool to understand brain development and function. Neurosci Biobehav Rev. 2020;108:78–82.
19. Land R, Baumhoff P, Tillein J, Lomber SG, Hubka P, Kral A. Cross-modal plasticity in higher-order auditory cortex of congenitally deaf cats does not limit auditory responsiveness to cochlear implants. J Neurosci. 2016;36:6175–6185.
20. Bottari D, Caclin A, Giard MH, Pavani F. Changes in early cortical visual processing predict enhanced reactivity in deaf individuals. PloS ONE. 2011;6:e25607.
21. Pavani F, Bottari D. Visual abilities in individuals with profound deafness: A critical review. In: Murray MM, Wallace MT, eds. The neural bases of multisensory processes. CRC Press/Taylor & Francis; 2012:421–446.
22. Wechsler D. WASI: Wechsler abbreviated scale of intelligence. The Psychological Corporation; 1999.
23. Corsi P. Memory and the medial temporal region of the brain. Unpublished doctoral dissertation. McGill University; 1972.
24. Mueller ST, Piper BJ. The psychology experiment building language (PEBL) and PEBL test battery. J Neurosci Methods. 2014;222:250–259.
25. Rubinstein JS, Meyer DE, Evans JE. Executive control of cognitive processes in task switching. J Exp Psychol: Hum Percept Perform. 2001;27:763–797.
26. Rushworth MFS, Hadland KA, Paus T, Sipila PK. Role of the human medial frontal cortex in task switching: A combined fMRI and TMS study. J Neurophysiol. 2002;87:2577–2592.
27. Fedorenko E, Behr MK, Kanwisher N. Functional specificity for high-level linguistic processing in the human brain. Proc Natl Acad Sci USA. 2011;108:16428–16433.
28. Fedorenko E, Duncan J, Kanwisher N. Broad domain generality in focal regions of frontal and parietal cortex. Proc Natl Acad Sci USA. 2013;110:16616–16621.
29. Morris RG, Ahmed S, Syed GM, Toone BK. Neural correlates of planning ability: Frontal lobe activation during the Tower of London test. Neuropsychologia. 1993;31:1367–1378.
30. van den Heuvel OA, Groenewegen HJ, Barkhof F, Lazeron RHC, van Dyck R, Veltman DJ. Frontostriatal system in planning complexity: A parametric functional magnetic resonance version of Tower of London task. NeuroImage. 2003;18:367–374.
31. Kelly A, Milham M. Simon task. Stanford Digital Repository. Available at: http://purl.stanford.edu/zs514nn4996 and https://openfmri.org/dataset/ds000101/. Published online 2016.
32. Peirce JW. PsychoPy—Psychophysics software in Python. J Neurosci Methods. 2007;162:8–13.
33. Fessler JA, Lee S, Olafsson VT, Shi HR, Noll DC. Toeplitz-based iterative image reconstruction for MRI with correction for magnetic field inhomogeneity. IEEE Trans Signal Process. 2005;53:3393–3402.
34. Funai AK, Fessler JA, Yeo DTB, Olafsson VT, Noll DC. Regularized field map estimation in MRI. IEEE Trans Med Imaging. 2008;27:1484–1494.
35. Jezzard P, Balaban RS. Correction for geometric distortion in echo planar images from B0 field variations. Magn Reson Med. 1995;34:65–73.
36. Penny WD, Friston KJ, Ashburner JT, Kiebel SJ, Nichols TE. Statistical parametric mapping: The analysis of functional brain images. Academic Press; 2011.
37. Fischl B. FreeSurfer. NeuroImage. 2012;62:774–781.
38. Fischl B, Salat DH, Busa E, et al. Whole brain segmentation: Automated labeling of neuroanatomical structures in the human brain. Neuron. 2002;33:341–355.
39. Dale AM, Fischl B, Sereno MI. Cortical surface-based analysis: I. Segmentation and surface reconstruction. NeuroImage. 1999;9:179–194.
40. Brett M, Anton J-L, Valabregue R, Poline J-B. Region of interest analysis using the MarsBar toolbox for SPM 99. NeuroImage. 2002;16:S497.
41. JASP Team. JASP (Version 0.14.1) [Computer software]. Published online 2020.
42. Cormier K, Schembri A, Vinson D, Orfanidou E. First language acquisition differs from second language acquisition in prelingually deaf signers: Evidence from sensitivity to grammaticality judgement in British Sign Language. Cognition. 2012;124:50–65.
43. Linebarger MC, Schwartz MF, Saffran EM. Sensitivity to grammatical structure in so-called agrammatic aphasics. Cognition. 1983;13:361–392.
44. Boudreault P, Mayberry RI. Grammatical processing in American Sign Language: Age of first-language acquisition effects in relation to syntactic structure. Language Cogn Process. 2006;21:608–635.
45. Emmorey K. Language, cognition, and the brain: Insights from sign language research. Lawrence Erlbaum Associates; 2002.
46. Hauser PC, Lukomski J, Hillman T. Development of deaf and hard-of-hearing students' executive function. Deaf Cogn: Found Outcomes. 2008:286–308.
47. Botting N, Jones A, Marshall C, Denmark T, Atkinson J, Morgan G. Nonverbal executive function is mediated by language: A study of deaf and hearing children. Child Develop. 2017;88:1689–1700.
48. Marshall C, Jones A, Denmark T, et al. Deaf children's non-verbal working memory is impacted by their language experience. Front Psychol. 2015;6:527.
49. Zatorre R, Belin P, Penhune VB. Structure and function of auditory cortex: Music and speech. Trends Cogn Sci. 2002;6:37–46.
50. Kaas JH, Hackett TA, Tramo MJ. Auditory processing in primate cerebral cortex. Curr Opin Neurobiol. 1999;9:164–170.
51. Monsell S. Task switching. Trends Cogn Sci. 2003;7:134–140.
52. Ravizza SM, Carter CS. Shifting set about task switching: Behavioral and neural evidence for distinct forms of cognitive flexibility. Neuropsychologia. 2008;46:2924–2935.
53. Miyake A, Friedman NP, Emerson MJ, Witzki AH, Howerter A, Wager TD. The unity and diversity of executive functions and their contributions to complex "frontal lobe" tasks: A latent variable analysis. Cogn Psychol. 2000;41:49–100.
54. Shiell MM, Champoux F, Zatorre RJ. The right hemisphere planum temporale supports enhanced visual motion detection ability in deaf people: Evidence from cortical thickness. Neural Plasticity. 2016;2016:7217630.
55. Corbetta M, Shulman GL. Control of goal-directed and stimulus-driven attention in the brain. Nat Rev Neurosci. 2002;3:201–215.
56. Geng JJ, Mangun GR. Right temporoparietal junction activation by a salient contextual cue facilitates target discrimination. NeuroImage. 2011;54:594–601.
57. Lemire-Rodger S, Lam J, Viviano JD, Stevens WD, Spreng RN, Turner GR. Inhibit, switch, and update: A within-subject fMRI investigation of executive control. Neuropsychologia. 2019;132:107134.
58. Qiao L, Zhang L, Chen A, Egner T. Dynamic trial-by-trial recoding of task-set representations in the frontoparietal cortex mediates behavioral flexibility. J Neurosci. 2017;37:11037–11050.
59. Engelhardt LE, Harden KP, Tucker-Drob EM, Church JA. The neural architecture of executive functions is established by middle childhood. NeuroImage. 2019;185:479–489.
60. Cazalis F, Valabregue R, Pélégrini-Issac M, Asloun S, Robbins TW, Granon S. Individual differences in prefrontal cortical activation on the Tower of London planning task: Implication for effortful processing. Eur J Neurosci. 2003;17:2219–2225.
61. Just MA, Carpenter PA, Keller TA, Eddy WF, Thulborn KR. Brain activation modulated by sentence comprehension. Science. 1996;274:114–116.
62. Figueras B, Edwards L, Langdon D. Executive function and language in deaf children. J Deaf Stud Deaf Educ. 2008;13:362–377.
63. Hall ML, Eigsti IM, Bortfeld H, Lillo-Martin D. Executive function in deaf children: Auditory access and language access. J Speech, Language, Hearing Res. 2018;61:1970–1988.
64. Hall ML, Eigsti IM, Bortfeld H, Lillo-Martin D. Auditory deprivation does not impair executive function, but language deprivation might: Evidence from a parent-report measure in deaf native signing children. The J Deaf Stud Deaf Educ. 2017;22:9–21.
65. Pellicano E. The development of executive function in autism. Autism Res Treat. 2012;2012:146132.
66. Zelazo PD, Müller U, Frye D, et al. The development of executive function in early childhood. Monogr Soc Res Child Dev. 2003;68:vii–137.
67. Freel BL, Clark MD, Anderson ML, Gilbert GL, Musyoka MM, Hauser PC. Deaf individuals' bilingual abilities: American Sign Language proficiency, reading skills, and family characteristics. Psychology. 2011;2:18–23.
68. Blanco-Elorrieta E, Pylkkänen L. Ecological validity in bilingualism research and the bilingual advantage. Trends Cogn Sci. 2018;22:1117–1126.
69. Prior A, Gollan TH. Good language-switchers are good task-switchers: Evidence from Spanish-English and Mandarin-English bilinguals. J Int Neuropsychol Soc. 2011;17:682–691.
70. Prior A, MacWhinney B. A bilingual advantage in task switching. Bilingualism: Language Cogn. 2010;13:253–262.
71. Nava E, Bottari D, Zampini M, Pavani F. Visual temporal order judgment in profoundly deaf individuals. Exp Brain Res. 2008;190:179–188.
