
Child Development, July/August 2009, Volume 80, Number 4, Pages 952–967

Developmental Social Cognitive Neuroscience: Insights From Deafness


David Corina
University of California, Davis

Jenny Singleton
University of Illinois at Urbana-Champaign

The condition of deafness presents a developmental context that provides insight into the biological, cultural, and linguistic factors underlying the development of neural systems that impact social cognition. Studies of visual attention, behavioral regulation, language development, and face and human action perception are discussed. Visually based culture and language provide deaf children with affordances that promote resiliency and optimization in their development of visual engagement, executive functions, and theory of mind. These experiences promote neural adaptations permitting nuanced perception of classes of linguistic and emotional-social behaviors. Studies of deafness provide examples of how interactions and contributions of biological predispositions and genetic phenotypes with environmental and cultural factors, including childhood experiences and the actions of caregivers, shape developmental trajectories (M. I. Posner & M. K. Rothbart, 2007).

The field of social cognitive neuroscience has been instrumental in bridging formal psychological constructs derived from social psychology with emerging understanding of neurofunctional properties of the human brain. Growing evidence from this work supports the notion of specialized neural regions that are preferentially involved in the processing of social information. As social phenomena inherently cut a broad path through human experience, ranging from understanding oneself to the appreciation of and interactions with others, it should not be surprising that the neural systems implicated in the mediation of social behaviors are multipurpose and highly intertwined with more basic sensory-perceptual, emotional, linguistic, and cognitive components (Beer & Ochsner, 2006; Lieberman, 2006). In the quest to understand these neural systems, recent work has begun to explore their development, striving to explicate the interactions and contributions of biological predispositions and genetic phenotypes with environmental and cultural factors, including childhood experiences and the actions of caregivers that may shape developmental trajectories (Posner & Rothbart, 2007).

This work is supported by National Science Foundation (NSF) Grant SBE-0541953 awarded to Gallaudet University, which supports Drs. Corina and Singleton, and by National Institutes of Health, National Institute on Deafness and Other Communication Disorders Grant 2ROI-DC03099 to D. Corina. We thank Mary C. Mendoza for her help in the preparation of this manuscript. Correspondence concerning this article should be addressed to David Corina, NSF Science of Learning Center for Visual Language and Visual Learning (VL2), Gallaudet University, Washington, DC 20002. Electronic mail may be sent to corina@ucdavis.edu.

Both adult and developmental psychological research literatures have benefited from understanding how early childhood experiences may alter the typical developmental time course of sociocognitive functions, as in cases of acquired brain damage (Adolphs, Tranel, & Damasio, 1998; Damasio, 1995; Koenigs et al., 2007), developmental disorders such as autism (Amaral, Schumann, & Nordahl, 2008; Bachevalier & Loveland, 2006; Baron-Cohen & Belmonte, 2005; Frith, 2001; Oberman & Ramachandran, 2007) and Williams syndrome (Bellugi, Lichtenberger, Jones, Lai, & St. George, 2000; Karmiloff-Smith, 2007; Meyer-Lindenberg et al., 2005), or language deprivation (Curtiss, 1977; Newport, 1990). These accidents of nature often put into relief seams within and between complex interacting systems, and thus provide clues to neural differentiation of processing systems as a function of early experience. In the present case, we consider aspects of social cognition within a relatively understudied domain, childhood deafness. As we hope to show, the context of deafness presents unique developmental situations that offer great potential for providing critical insights into the biological, cultural, and linguistic factors that underlie the development of neural systems that impact social cognition.



While we cannot hope to cover all issues in an exhaustive fashion, we consider several pertinent examples in hopes that our modest efforts may encourage others to consider this unique human condition. The present review considers deafness within a broad perspective. We hope to demonstrate that insights regarding the development of social cognition may be gleaned not only from understanding the adaptations of a cognitive system in the face of sensory deprivation, but also from understanding deafness itself as a social experience, in which cultural and linguistic traditions are adaptive and provide deaf children with unique developmental niches that could perhaps contribute to differentiated processing. We hope to show how the multifaceted conditions of deafness put into relief the differential contributions of sensory experience, cultural practices, and linguistic influences that factor into social cognitive development. Our review will discuss emerging findings from cognitive and neural studies that highlight three main areas of development (visual attention, high-level visual processing, and language and communication) that are known to contribute to social cognitive development (Grossman & Johnson, 2007). In these areas, deaf individuals have shown particular behavioral or neural organizational patterns that are distinguishable from their hearing counterparts. Most research in this area has focused on how factors of sensory and/or linguistic deprivation have contributed to these differences. In the present review, we further consider how behavioral and neural changes may also be shaped by environmental experiences, such as the cultural and linguistic practices of families and communities. A review of these data highlights how dorsal-parietal, frontal-executive, and temporal ventral cortical systems may be differentially sensitive to these factors. We begin with a brief overview of childhood deafness, American Sign Language (ASL), and the American Deaf community.

Deafness, ASL and Deaf Culture

Acquired deafness may result from infection or drugs administered during pregnancy. Hereditary deafness is associated with over 350 genetic conditions (Martini, Mazzoli, & Kimberling, 1997), and about one third of these genetic conditions are associated with syndromes (Petit, 1996). The prevalence of hearing loss (from mild to profound) is

approximately 9% of the population (Ries, 1994; this figure includes those who lose their hearing due to advancing age). The prevalence of profound congenital deafness is in the range of 0.5–1 per 1,000 births, and these infants experience reduced or no access to the spoken language used by their family. An even smaller percentage of these deaf individuals are born to deaf parents and acquire a signed language, such as ASL, as their native language (Mitchell & Karchmer, 2004). By some estimates, 96% (Mitchell & Karchmer, 2004) of deaf children are raised in households with parents who are hearing. There is continuing controversy over the best strategies to promote linguistic competencies in children who are unable to process spoken language as efficiently as their normally hearing counterparts. Parents must weigh a wide range of factors in their decisions, in the face of often conflicting information from medical professionals, social services, educators, and community advocates. While technological advances in assistive hearing devices, including digital hearing aids and cochlear implants, may provide improved access to auditory information, linguistic development in deaf children raised in hearing households is often compromised. Language deficits are often present regardless of whether such children are learning ASL (Schick & Hoffmeister, 2001; Strong & Prinz, 1997), English-based signing (Geers, Moog, & Schick, 1984; Schick & Moeller, 1992), or spoken English (Geers et al., 1984; de Villiers, 2003). In contrast, deaf children in deaf signing families, who are exposed to sign language from birth, acquire sign language just as effortlessly as hearing children acquire spoken languages, along a similar developmental time course and influenced by linguistic complexity and cognitive development (Lillo-Martin, 1999; Newport & Meier, 1985). As we will discuss, the linguistic and sociocultural context of deaf children raised by deaf caregivers (in comparison to those raised by hearing caregivers) has important implications for social cognitive development. ASL is an indigenous language that evolved within North America and is used by members who self-identify with the Deaf community and who are part of Deaf Culture (Ladd, 2003; Padden & Humphries, 2005). The capitalization of the word deaf in this context serves to signify recognition of deaf individuals as a distinct cultural group, who are considered a minority community and whose primary language is signed (for discussions, see Lane, 1984; Padden & Humphries, 1988).


With a structure distinct from English, ASL is a fully complex, natural human language. Although ASL is only one of many autonomous sign languages of the world, it is the one that has been studied most extensively. Its linguistic history involves a creolization of Old French Sign Language with local signed languages and homesign forms that existed in the United States in the early 1800s (Woodward, 1976). In the 1960s, linguists began to document the distinct grammatical structure of ASL (Stokoe, 1960/2005). Since then, the field of signed language linguistics has prospered, including studies of other signed languages of the world. This linguistic research has demonstrated patterns that are shared with spoken languages and revealed unique linguistic structures that capitalize on the visual-gestural modality (for recent discussions, see Emmorey, 2002; Sandler & Lillo-Martin, 2006). With the emergence of Deaf Studies and the legitimatization of ASL as a U.S. minority language in its own right, with an estimated 100,000–300,000 users (Padden & Humphries, 2005), the language and culture of the Deaf community have thrived and gained increased attention from multiple research disciplines (e.g., anthropology, psychology, and education).

Visual Attention: Biological and Social Implications

There is a common assumption that deficits in one sensory modality lead to increased sensitivities in the remaining modalities: in the case of individuals with early blindness, for example, that their hearing is superior to that of sighted individuals. Similarly, in the case of early profound deafness, enhancements in visual processing could be presumed. However, careful research in this domain has demonstrated that primary sensory abilities such as auditory detection thresholds, intensity discrimination (Niemeyer & Starlinger, 1981), and auditory gap detection (Weaver & Stevens, 2006) in the case of the blind, and brightness discrimination (Bross, 1979), visual contrast sensitivity (Finney & Dobkins, 2001), temporal discrimination thresholds (Mills, 1985), temporal resolution (Bross & Sauerwein, 1980; Poizner & Tallal, 1987), and discrimination thresholds for motion direction (Bosworth & Dobkins, 1999) in the case of the deaf, are not significantly altered as a result of loss of a competing modality (see Bavelier & Neville, 2002; Niemeyer & Starlinger, 1981, for discussion).

Yet the evidence does indicate that for higher level processing tasks, compensatory differences may be observed in restricted domains. Research indicates specific enhancements for processing of the visual periphery and the detection of movement in peripheral vision in the deaf (Bavelier et al., 2001; Neville & Lawson, 1987a, 1987b, 1987c; Parasnis & Samar, 1985; Proksch & Bavelier, 2002). There is growing consensus that the visual skills for which deaf individuals exhibit heightened performance (compared with hearing individuals) appear under specific conditions, namely, those that engage spatial attention (for recent views, see Bavelier, Dye, & Hauser, 2006). Moreover, these studies suggest that it is auditory deprivation, rather than sign language experience per se, that brings about these observed attention differences (Bavelier et al., 2001; Bosworth & Dobkins, 2002; Neville & Lawson, 1987a, 1987b, 1987c; Proksch & Bavelier, 2002). Specifically, several studies have compared the performance of deaf native signers with that of hearing native signers, the latter group being composed of normally hearing individuals raised in deaf signing households. The comparison of these groups provides an opportunity to systematically dissociate the contributions of language experience (shared between the two groups) from those of auditory experience (which differs between them). In these studies, it was only deaf subjects who showed evidence of enhancement, suggesting that the lack of auditory input may be more important than early sign language experience in altering spatial attention. Neuroimaging and electrophysiology data indicate functional changes in dorsal visual pathways among deaf subjects in response to visual attention tasks (Armstrong, Neville, Hillyard, & Mitchell, 2002; Bavelier et al., 2000, 2001; Neville & Lawson, 1987a, 1987b, 1987c). These data, coupled with animal studies of sensory deprivation, provide a compelling case for changes to dorsal visual processing streams as a function of auditory deprivation (Bavelier & Neville, 2002; Bavelier et al., 2006; Neville & Bavelier, 2002). Differences in visual attention that develop during a deaf person's childhood may impact aspects of social interaction. There is a controversial record of research suggesting that deafness is associated with social-emotional disturbances. However, many of these early studies of deaf children's behavior were flawed by a failure to appreciate the heterogeneity within the deaf population being tested. A common claim is that deaf children's behavioral problems are related to impulsivity and distractibility.


Reivich and Rothrock (1972) suggested that impulsivity and lack of inhibition accounted for a significant amount of the problem behavior in deaf pupils reported by teachers. As reported in Dye, Hauser, and Bavelier (2008), mothers rate deaf children as having greater distractibility-hyperactivity problems than hearing children (Quittner, Glueckauf, & Jackson, 1990), while other studies suggest little evidence of deaf–hearing differences in attention span (Meadow, 1976). More recently, researchers have used manipulations of continuous performance tasks (CPT) with deaf subjects to differentially assess variables such as impulsivity, persistence, and distractibility (Mitchell & Quittner, 1996; Quittner, Smith, Osberger, Mitchell, & Katz, 1994; Smith, Quittner, Osberger, & Miyamoto, 1998). According to these studies, deaf children exhibit more impulsivity and experience increased distractibility. In a different CPT study, Parasnis, Samar, and Berent (2003) reported that deaf college students with hearing parents showed increased impulsivity when selecting the appropriate response, accompanied by decreased perceptual sensitivity (they found it harder to distinguish between targets and nontargets), compared with hearing college students. Some have interpreted such differences to reflect a deficit in visual selective attention stemming from poor multimodal sensory integration as a result of early, profound hearing loss. Smith et al. (1998), for example, suggest that sound is an enabling condition for the development of selective attention. In this deficiency hypothesis view (Smith et al., 1998), the integration of information from the different senses is proposed to be an essential part of the development of normal attention functioning within each individual sensory modality. Thus, the lack of access to sound results in underdeveloped selective attention capacities in deaf children. Hearing individuals can attend selectively to a narrow visual field and still monitor the broader environment through sounds. This multimodal contact with the world permits an efficient division of labor. Deaf individuals, by contrast, must simultaneously use vision both to accomplish specific tasks and to monitor the broader environment (Mitchell, 1996; Smith et al., 1998). An alternative interpretation of the behavior is offered by Parasnis et al. (2003), who argue that the apparent impulsivity exhibited by deaf subjects may be an adaptive strategy. In the absence of auditory input, deaf individuals must rely more heavily on vision for being alerted.

The decreased perceptual sensitivity on the CPT task, they argue, results from a redistribution of attention away from the center and toward peripheral vision. A similar reallocation view was proposed earlier by Neville and collaborators (Neville & Lawson, 1987a, 1987b, 1987c; Neville, Schmidt, & Kutas, 1983; see Dye et al., 2008, for a more extensive discussion of these issues). A different perspective is offered by Hauser, Lukomski, and Hillman (2008), who consider executive function (EF) skills, rather than selective visual attention, as a central factor in the reports of impulsivity and distractibility in deaf children. They note that response inhibition is the prerequisite to self-regulation, which is instrumental for purposive intentional behavior (Barkley, 2001). Hauser et al. (2008) note that while several studies (e.g., Gioia, Isquith, Guy, & Kentworth, 2000; Roth, Isquith, & Gioia, 2005) have reported deaf and hearing differences on clinical tests and rating scales of EF such as the Behavior Rating Inventory of Executive Functions, other studies have failed to find such differences (Everhart & Marschark, 1997). Studies comparing the impulsivity rates of deaf children born to deaf parents versus those born to hearing parents have found that deaf-of-deaf children are rated more positively, indicating lower impulsivity rates (Harris, 1978; Hauser, Lukomski, & Isquith, 2009; Hauser, Wills, & Isquith, 2006; Oberg, 2007). Taken together, these studies reveal an intriguing interplay between brain adaptations resulting from altered sensory experience and potential behavioral sequelae that, in particular contexts, may be considered maladaptive. Reports of increased impulsivity and distractibility have been considered in relation to visual attention and EF. Researchers such as Smith et al. (1998) view the lack of hearing as a causal factor resulting in deficient selective attention processes, whereas Parasnis et al. (2003) suggest the behavior is not deficient per se but rather an adaptive response in the face of increased reliance on visual information monitoring. In this view, there is no explicit claim regarding the role of audition in the development of selective visual attention. Instead, these researchers attribute the behavior to adaptations in the distribution of attention and focus on functional-anatomical accounts that implicate parietal attention networks in this population (Dye, Baril, & Bavelier, 2007; Dye et al., 2008; Parasnis et al., 2003). Alternatively, Hauser et al. (2008) implicate differences in EF, which may manifest in terms of differences in response inhibition. Functional-anatomical models that implicate the dorsolateral prefrontal area in EF and working memory and the ventromedial prefrontal areas in emotional and social decision making (Anderson, 1998; Eslinger & Grattan, 1991; MacPherson, Phillips, & Della Sala, 2002) factor significantly in this account.


At this time, it is difficult to adjudicate between these accounts. One notes, for example, that the measures that have been used to establish these effects range widely in methodology and ecological validity, from well-controlled psychophysical measures with highly constrained visual stimuli to subjective rating scales of human behaviors. While researchers are increasingly cognizant of the heterogeneity of deafness and the importance of evaluating these variables in the interpretation of results, further attention to this issue is greatly needed. Additional work is needed to explicate the causal and adaptive factors that may underlie the reported behavioral differences in impulsivity, distractibility, and self-regulation in this population. One factor to be considered is a deaf child's early socialization experiences. These practices may build resilience to the possible effects of sound deprivation. The development of visual attention and gaze following is seen as an important milestone for all children (Brooks & Meltzoff, 2008; Meltzoff & Brooks, 2009). Brooks and Meltzoff (2002) explain that gaze following is an "entrée into the psychological world in which things are important not only because of their physical properties, but because they are referred to by others" (p. 965). In hearing caregiver–child dyads, the infant follows the gaze of their caregiver and subsequently engages in joint visual attention toward an object while their caregiver simultaneously produces a linguistic label (or an emotional reaction) that they can hear while visually exploring the object. Young infants, below 12 months, may use head turning by the caregiver as the primary directional cue to elicit directional attention, but by 12–18 months, infants grant special status to human eyes accompanied by head movement and treat the adult's gaze as object directed (Brooks & Meltzoff, 2002, p. 965). The processes that begin with joint visual attention skills in infancy are thought to be important to the subsequent capacity for social learning throughout the lifespan (Vaughan van Hecke et al., 2007, p. 54). If one adopts this standard model for the development of joint visual attention, a deaf child could be seen as facing a tremendous perceptual and cognitive challenge. A deaf child needs to look at their caregiver to get linguistic input, but they also need to look at objects to learn about the world.

Across a number of studies observing deaf caregiver–deaf infant dyads (based in several different countries and using natural signed languages), a repertoire of visually responsive, culturally embedded behaviors has been documented among caregivers (Erting, Prezioso, & Hynes, 1990; Harris, 2000; Koester, Brooks, & Traci, 2000; Waxman & Spencer, 1997). To summarize, deaf caregivers: (a) use visual or tactile signals to elicit their child's visual attention (through a hand-wave or physical touch), (b) displace their signing to be within the child's line of sight (as they look at the object), (c) use longer wait times before producing child-directed language (i.e., waiting until the child independently looks to the caregiver before starting to sign), and (d) use greater persistence to successfully gain and redirect the child's attention (see Loots & Devise, 2003, for a more detailed summary of deaf caregiver–infant dyad behaviors). Deaf caregivers' intuitive communicative interaction strategies appear to socialize deaf infants and toddlers into coordinated and complex visual engagement. Indeed, Harris and Mohay (1997) and Koester (1995) observed that deaf infants of deaf caregivers consistently look to their caregivers, and their caregivers intuitively respond with appropriate linguistic and affective input. In contrast, the deaf infants of hearing caregivers in their studies demonstrated low rates of spontaneously looking to their caregivers. Loots, Devise, and Jacquet (2005) suggest that "once the attention-switching strategy and sustained looking are developed, the deaf child's spontaneous and active use of visual attention becomes more important in initiating and continuing moments of intersubjectivity than the continuous use of visual–tactile strategies" (p. 372). Thus, the initial visual–tactile scaffolding produced by caregivers appears to provide the needed supports for the development of self-regulation of visual attention (e.g., attention switching, monitoring visual access, and following caregiver head position and eye gaze direction). Moreover, observational studies of native-signing deaf preschoolers engaged in multiparty discourse within visually complex environments show that they can smoothly gaze-follow their teacher and correctly use discourse cues in ASL to anticipate visual turn taking (Crume & Singleton, 2008). Yet, in this classroom context, deaf teachers continued to use explicit visual and tactile strategies to support the development of visual attention self-regulation, presumably for deaf preschoolers with hearing parents who would not have experienced such caregiver strategies at home.


The skills involved in following a caregiver's direction of eye gaze and pointing gestures, and child-initiated coordinated attention such as pointing to or showing objects to the caregiver, are argued to involve aspects of the brain's EF such as attention regulation, inhibitory control, and self-monitoring (Carlson & Meltzoff, 2008; Mundy & Acra, 2006). Thus, in the context of deaf children's family and school settings, we would argue that deaf caregivers' and teachers' cultural and linguistic practices that structure visual attention may even enhance a deaf child's EF, or at the very least build in resilience, or protection, from the possible disadvantages of sound deprivation. This view of the cultural structuring of developmental processes is consistent with work conducted in cross-cultural investigations of caregiver–child interaction. For example, Rogoff and her colleagues (Chavajay & Rogoff, 1999; Rogoff, Mistry, Göncü, & Mosier, 1993) discuss cultural variation in the ways that caregivers attend to their children and how they encourage their infants and children to attend to objects and events. Chavajay and Rogoff (1999), for instance, observed that middle-class European American children had a tendency to engage in a kind of fixed-capacity focus on single objects or events, and then alternated their attention to new objects or events. Their parents expressed pride when the child could sustain attention on a singular point of focus. In contrast, Guatemalan Mayan toddlers and their caregivers seemed to have a cultural preference for simultaneously attending to multiple sources of focus. In discussing these cultural differences, Chavajay and Rogoff noted that the social worlds of the Mayan children included large families often in each other's presence, maintaining proximity often by sitting in a semicircle, which could thereby increase the child's opportunity for exposure to competing attentional events and multiparty conversations. An important question to address, then, is whether particular neural systems (e.g., prefrontal systems) could be susceptible to modification as a result of such cultural practices and social interactions. In the simplest sense, these cultural contexts can be seen as contributing a framework for how adults structure the everyday experiences of a child, defining opportunities for a child to engage in complex, culturally meaningful interactions. Along similar lines, Zukow-Goldring and Arbib (2007) discuss how caregivers "bracket ongoing actions with gestures that direct the infant's attention to perceptual information embodied in action sequences" (p. 2181).

Their position is that this kind of caregiver–infant interaction provides the basis for developing a shared understanding of events. For example, in their longitudinal study of 11 hearing caregivers (5 English-speaking Euro-American middle class and 6 Spanish-speaking Latino working class) and their hearing infants aged 6–26 months, Zukow-Goldring and Arbib found that when these caregivers accompanied their participation or imitation invitations to their infants with attention-directing gestures, the infants' ability to engage or reach new understandings increased. Therefore, given the importance of hearing infants' development of eye gaze following for their cognitive and linguistic development, and the apparent privileged role of caregivers' attention-directing gestures in this process, we would expect that the highly demanding and gesturally complex context of visual language interaction (i.e., the experience of deaf children raised by deaf parents or having daily interaction with ASL-fluent early childhood teachers) could promote early and advanced development of eye gaze following and attention shifting. Future studies are needed to determine whether this prediction would be supported. The fact that the development of visual attention can be differentially shaped across cultures suggests the possibility of an intimate relation between cultural practices and modifications of the neural systems associated with social attention and EF.

Higher Visual Processing: Facial Expressions and Human Actions

The perception of facial expressions and the discernment of human actions serve a critical role in achieving successful social interaction. Research on these high-level visual processing domains has begun to shed new light on neural systems that lie at the interface between visual perception and social function. Studies of higher level visual processing skills have also reported differences between deaf and hearing individuals. These include studies of mental rotation, image generation, memory for spatial location, and face processing (Bettger, Emmorey, McCullough, & Bellugi, 1997; Emmorey, Kosslyn, & Bellugi, 1993; McKee, 1987), as well as action recognition (Corina et al., 2007). In contrast to the effects on visual attention noted above, enhancements in these domains are linked to experience with signing rather than auditory deprivation per se.


Understanding Faces

In the context of social cognition, studies of facial expression recognition are of particular interest. The capacity to extract socially relevant information from faces is fundamental to normal reciprocal social interactions and interpersonal communication (Pelphrey, Singerman, Allison, & McCarthy, 2003). For deaf signers, facial expression serves dual roles; in addition to the universal emblems of emotionality (Ekman, 1992), a separate class of facial expressions serves to indicate linguistic contrasts in signed language, including language-specific syntactic and morphological distinctions (Baker-Shenk & Cokely, 1980). Deaf and hearing people alike use the face to express emotion. However, ASL signers also use facial expression and changes in head and body position to convey linguistic contrasts (Liddell, 1980). The production of linguistic and emotional facial expressions differs in timing and scope. Facial expressions that serve linguistic functions have a clear onset and offset, and follow the clausal and lexical properties of ASL grammar. In contrast, emotional expressions are more variable in execution, may differ in intensity and scope, and are not tied to the presence of specific linguistic forms. Researchers have reported that categorical processing of facial displays can be demonstrated for sign but may be grounded in universally perceived distinctions between communicative face actions (Campbell, Woll, Benson, & Wallace, 1999). In language acquisition studies, Reilly and her colleagues (Reilly & Bellugi, 1996; Reilly, McIntire, & Bellugi, 1990a, 1990b) have shown different timing between deaf children's development of facial expressions for emotion and for linguistic contrasts. As with hearing children, emotion expressions are produced by deaf children consistently by 1 year of age. However, linguistic facial expressions show a protracted development, appearing only after the acquisition of manual linguistic signs. A striking example is the development of the headshake as a negation marker in ASL grammar. The physical form of this linguistic behavior is identical to the common headshake that hearing and deaf children in the United States use to mean no (this often develops before the age of 2 years). In the adult form of ASL, negative manual signs such as no, not, or can't must be accompanied by the linguistic negation marker: a headshake. Interestingly, Reilly et al.'s developmental studies report that deaf children first begin to produce isolated headshakes in response to questions or to reject requests.

But when they first acquire the manual signs for negation, these signs appear without a headshake (which is ungrammatical in ASL). The appearance of the correct accompanying headshake lags behind the appearance of the manual sign by 18 months. These findings indicate a separation between the acquisition of a social-cultural communicative behavior (the headshake) and the acquisition of a grammatical marker (the obligatory linguistic headshake) to signal negation. In behavioral studies, judgments of linguistic and emotional facial expressions produced by fluent adult signers indicate differential right- and left-sided expressivity, which has been argued to reflect underlying neural control (Campbell, 1978; Corina, Bellugi, & Reilly, 1999). This finding is also corroborated by neuropsychological studies of facial expression production by left- and right-hemisphere-damaged signers. Corina et al. (1999) reported a double dissociation in which a right-hemisphere-damaged signer exhibited greatly reduced facial emotionality with well-preserved production of linguistic expressions; the second case study, of a left-hemisphere-damaged signer, showed the opposite: exuberant affective expressions with severely limited linguistic facial expression. Recent data from functional MRI (fMRI) have provided new evidence of differential specialization for facial expression recognition in deaf signers (McCullough, Emmorey, & Sereno, 2005) in response to static faces. Studies of deaf signers and hearing nonsigners have reported bilateral superior temporal sulcus (STS) activation for emotional facial expression in deaf subjects and right hemisphere dominance for hearing subjects. In addition, activation in the fusiform face area (FFA) was left lateralized for deaf signers for both emotional and linguistic forms of expression, whereas activation was bilateral for both expression types in the hearing nonsigning subjects. fMRI studies of facial information processing in nonsigners have sought to differentiate these functions within the posterior STS and FFA. Numerous studies report a role of bilateral STS in the detection of eye gaze (Allison, Puce, & McCarthy, 2000; Puce, Allison, Bentin, Gore, & McCarthy, 1998). Recent work indicates that STS activity is modulated by the context within which eye gaze shifts occur, suggesting that this region is involved in social perception via its role in the analysis of the intentions of observed actions (Materna, Dicke, & Thier, 2008; Mosconi, Mack, McCarthy, & Pelphrey, 2005).


It is interesting to note that group differences in STS activation between deaf and hearing subjects in the McCullough et al. (2005) study emerged only when the linguistic expressions were presented within the context of a linguistic manual sign. This is consistent with the growing evidence that posterior STS regions involved in facial processing may serve interpretative functions, and it may be an indication that this neural region (i.e., STS) in the deaf subjects is modulated by linguistic communicative intent. In contrast, the activation in the FFA, though hemispherically distinct from that of hearing subjects, was not modulated by the contextual manual sign cues. This is consistent with reports indicating that the FFA may serve more foundational roles in the structural encoding of face-specific information (Kanwisher, McDermott, & Chun, 1997; Kanwisher & Yovel, 2006; McCarthy, 1999). McCullough et al. suggest that the hemispheric differences observed in these studies may reflect perceptual differences that invoke greater local featural processing (as opposed to global-configural processing) in the deaf subjects. It is unknown whether there is a relation between the staggered production of affective and linguistic facial expressions in deaf signing children and the appearance of the differential hemispheric asymmetries for these classes of expression. What is noteworthy is that in these cases, the demands of language processing in the visual-gestural modality may be driving the differential specialization and reorganization within temporal lobe regions.

Understanding Human Actions

Since the discovery of mirror neurons in macaque monkeys (Gallese, Fadiga, Fogassi, & Rizzolatti, 1996; Rizzolatti, Fadiga, Gallese, & Fogassi, 1996), there has been renewed interest in the neural basis of human action understanding. A human mirror neuron system (hMNS) has been invoked as a neural basis for a wide range of social behaviors including imitation, empathy, and language, and by some accounts has even provided a foundation for human culture (Gallese, 2007). Dysfunction within the human mirror neuron system has been argued to underlie developmental deficits such as autism (Iacoboni & Mazziotta, 2007; Oberman & Ramachandran, 2007; Williams, Whiten, Suddendorf, & Perrett, 2001). Data from functional neuroimaging have been used as evidence for a human homolog of the mirror neuron system, characterized as a bilateral fronto-parietal network capable of representations of action–perception pairings (Miall, 2003).

Anatomical regions that factor prominently in this network include the ventral premotor cortex, the inferior parietal lobule, the STS, and the intraparietal sulcus. Thus, studies of deaf signers may provide an important way to help constrain investigation of the broad neurofunctional scope of the hMNS. For example, to the extent that an hMNS underlies linguistic capacity, we might predict that deaf signers, whose language is expressed largely by manual actions, would strongly engage the hMNS. In a recent review of the neuroimaging findings in signers, Corina and Knapp (2006) evaluated whether the frontal and parietal lobes bilaterally showed activation during the perception and production of signing, as is predicted by an hMNS account of language. The findings showed only partial support for this hypothesis and indicated that only left hemisphere regions satisfied the necessary condition of conjoint activation. In addition, left frontal activations were most commonly reported in Broca's area (BA 45) rather than the frontal-ventral regions (BA 44/6) that are often associated with hMNS activations. These data thus suggest that a full hMNS account of language must acknowledge the left hemisphere specialization for human languages and the possibility of regional shifts in anatomical representations. Studies of nonlinguistic human action processing provide further evidence that current hMNS accounts of action understanding are likely too simplistic. In a recent PET study, Corina et al. (2007) examined neural activation in response to the perception of object-oriented human actions (e.g., drinking from a glass) and intransitive actions (e.g., rubbing one's eyes) in deaf and hearing subjects. While hearing subjects evidenced a pattern of bilateral inferior-frontal and parietal activations commonly associated with the hMNS, deaf subjects exhibited qualitatively different patterns of activation, with prominent bilateral ventral occipito-temporal activations. These human action processing data are important in the context of social cognition as they suggest that the hMNS may not be an inevitable processing system for human action understanding. Indeed, data from language and gesture studies in deaf subjects suggest that the hMNS may be mutable and subject to reorganization. The engagement of the temporal-ventral region may reflect active monitoring of the featural properties of human actions that are necessary for the differentiation of linguistic and nonlinguistic human action.


Thus, as with facial expression processing (as discussed earlier), there is growing evidence for functional specialization of temporal-ventral systems that may underlie the careful analysis of human forms and actions. What is striking is that in each of these cases, face and human action recognition, deaf individuals are confronted with a constrained signaling medium (i.e., visually perceptible human manual and facial actions) that must take on additional functionality (to convey not only emotion and environmental interaction but language as well). Each of these cases reveals further specialization within temporal lobe and temporal-ventral regions, perhaps in response to this added visual signaling complexity. Hearing children bring both auditory and visual capacities to bear in assimilating their environment, which, as previously noted, affords an efficient division of labor that is not available to the deaf child. Therefore, a hearing child's task does not require resolving competing linguistic and nonlinguistic input within the visual modality to the same degree as deaf, sign-exposed children must do. One might expect that in addition to developing a temporal-ventral processing system that might attune the deaf child to differentiate subtle visual signals, the deaf child must further develop frontal-executive control mechanisms to modulate responses to these signals. Taken together, one may interpret these competing modality interactions of language and nonlanguage input as an example of complex human–environment interaction that demands a particular kind of attentional and higher order processing, one that supports the unlayering or disentangling of perceptually similar, but functionally different, input. The result of this specialized processing may lead to enhanced parietal-attentional and occipito-temporal functioning. Whether there are specialized frontal-executive functionalities that arise in concert with or in response to these processing demands is not yet known. A question of interest is whether deaf caregivers may be using communicative strategies to uniquely demarcate these complex layers, which would have the effect of supporting the child's process of resolving this task. Whether there are strict timetables or critical periods for such reorganization is unknown (but see Newman, Bavelier, Corina, Jezzard, & Neville, 2002); however, a recent developmental study provides some insights. Krentz and Corina (2008) used a looking time paradigm to explore nonsigning hearing infants' sensitivities for distinguishing sign language from complex pantomime.

The motivation for these studies derives from long-standing findings that hearing infants show attention preferences for speech over other forms of complex acoustic stimuli (Colombo & Bundy, 1981; Glenn, Cunningham, & Joyce, 1981; Vouloumanos & Werker, 2004). Previous speech-based studies have been taken as evidence for an innate bias toward human language. However, it is unclear whether this looking preference reflects a bias for speech or for human language more generally. In this study, 6-month-old hearing infants looked longer at sign stimuli compared to complex pantomimes. Interestingly, the preference for sign language was not observed in 10-month-old hearing infants; in fact, there were no differences in looking times for pantomime or signs at this later age. This change in looking behavior between 6 and 10 months may reflect hearing infants' gradual loss of sensitivity to linguistic forms that are not encountered in their home environment. This result mirrors infants' loss of detection of linguistic contrasts in spoken language (Best, McRoberts, & Sithole, 1988; Eimas, 1975; Werker & Tees, 1984). This attunement may have functional origins that extend beyond language per se: It is known that infants are close monitors of the psychosocial content available from the eye, mouth, and head movements of adults (Meltzoff & Brooks, 2007). Thus, infants' discrimination of linguistic from nonlinguistic motion may be an even more sophisticated instantiation of their on-line attempts to learn whether the actions of the viewed adult have emotional-communicative relevance (Knapp, Cho, & Corina, 2008). The speculation is that this timeframe may be a period in which infants are beginning to show a specialization for human actions that are linguistic in nature versus those that are not. Preliminary evidence suggests that deaf sign-exposed infants maintain their interest in signs over pantomime during this time (Klarman, Krentz, Brinkley, Corina, & Kuhl, 2008). The appreciation of facial expression and the interpretation of human actions are of course critical to all social interactions, and there is no evidence to indicate that deaf individuals differ in responsivity to human emotions or actions. In the present case, however, acquisition of a visual-gestural language appears to place requirements on the development of specialized mechanisms for processing this information, which often is highly similar in form to nonlinguistic, but socially meaningful, counterparts. We maintain that the brain is capable of accommodating these new functions and does so by engendering increased bilateral hemispheric participation and specialization within the inferior temporal lobes.



The Role of Language in Social Cognition: Deaf Children and Theory of Mind Studies

Theory of mind (ToM) refers to the awareness of how mental states such as memories, beliefs, desires, and intentions govern the behaviors of self and others (Baron-Cohen, 2000). ToM is considered a cornerstone of social intelligence and satisfying social interaction, and it develops rapidly during the preschool years (Peterson, Wellman, & Liu, 2005). There is ongoing controversy as to the causal principles that underlie belief attribution and ToM (for a recent review of pertinent issues and evidence, see Saxe, Carey, & Kanwisher, 2004). The relation of language to the development of ToM is one of many prominent issues raised in these debates. Studies of deaf children have contributed significantly to our understanding of the relation between language and ToM. As noted above, by some estimates, 96% (Mitchell & Karchmer, 2004) of deaf children are raised in households with parents who are hearing. For a child with auditory challenges, this often results in a lack of full access to a consistent adult language model and can lead to delays in language development at critical periods. The selective deficits in language ability observed in some cases of deafness provide an opportunity to tease apart the effects of language from those of cognitive maturation and engagement in social interaction, at least to the extent that the latter do not themselves depend on language acquisition (Schick, de Villiers, de Villiers, & Hoffmeister, 2007). Studies of deaf children with hearing parents report significant delays in the mastery of verbal and nonverbal ToM tasks (Marschark, 1993; Peterson & Siegal, 1995; J. de Villiers, 2005; de Villiers & de Villiers, 2000). The case for linguistic influences as a factor comes from comparisons of deaf children of hearing parents (DoH) with deaf children who are raised in households with deaf signing parents (DoD). DoD children acquire sign language naturally as a primary language and exhibit linguistic competencies on par with hearing children raised in spoken language environments. The DoD children do not evidence delays on ToM tasks (Schick et al., 2007). Additional studies have compared the performance of DoH children to that of autistic children, who are also known to display deficits on ToM tasks (Peterson et al., 2005).

Deaf children with linguistic deficits nevertheless exhibit social responsiveness, in contrast to autistic children. Furthermore, while their development of ToM is delayed, DoH children nevertheless advance through the eventual progression in their attainment of ToM. In contrast, autistic children demonstrate a different sequence of understanding in the later stages of the progression. Thus, it is argued that individuals with autism may have a distinctive, autism-specific difficulty with the sort of mental state understanding that is needed for false-belief tasks, above and beyond other sorts of mental state understanding (Peterson et al., 2005). Recent work by Meristo et al. (2007) has extended earlier investigations of the development of ToM in deaf children by comparing native signing children in Italy and Estonia, in their respective signed languages, as a function of school environment. In this study of deaf children's communicative experiences in the classroom, native signing (DoD) deaf children attending bilingual (sign and spoken language) programs outperformed native signers attending oral programs that emphasized speech only. This pattern was observed in both the Italian and Estonian samples. In addition, these studies replicated the well-known finding of overall poorer performance on ToM tasks by deaf children from hearing homes in Italy, Estonia, and Sweden (Meristo et al., 2007). These findings suggest that the expression of ToM in native signing deaf children may depend not only on early language experience but also on children's continuing exposure to opportunities for monitoring conversational input about mental states. Opportunities for exposure to contexts in which discourse about mental causes is present are known to be correlated with the development of false belief (Cutting & Dunn, 1999). Data from deaf signing children have been used to argue that specific formal properties of language, for example, the understanding of syntactic complements (J. de Villiers, 2005; P. de Villiers, 2005), underlie the appearance of ToM abilities. Yet whether these specific constructions are causal is a matter of debate (see Saxe, 2006; Saxe et al., 2004). Cross-linguistic studies can provide a powerful test of the generality of such claims. As reported by Schick et al. (2007), deaf children's understanding of syntactic complements in either ASL or English was shown to be an independent predictor of success on verbal and low-verbal ToM tasks. The data from Meristo et al. (2007) reported above are surprising in this regard, as they report differences in ToM for native deaf signing Italian and Estonian children.


Unless we assume that the acquisition of these critical syntactic complement constructions is gradient in nature, it is difficult to reconcile these claims.

Conclusion

In this review, we have examined how the human condition of deafness can provide avenues for understanding the development of neural systems that impact social cognitive processing. Our integrated approach echoes Posner and Rothbart (2000), who emphasize that developmental pathways for sociocognitive processes are influenced by complex interaction effects of early temperament predispositions, socialization processes, relationships, and culture (p. 438). We have reviewed evidence that auditory deprivation may engender changes within dorsal-visual pathways that affect spatial-visual attention in a highly specific fashion. We explored claims that these attentional differences may impact aspects of social interaction, especially with respect to reports of distractibility and impulsivity of deaf individuals in some clinical and educational literatures. We contrasted functional-anatomical explanations of these data, which emphasized differential roles of parietal-attentional and frontal-executive systems in the manifestation of these behaviors. We noted how exposure to a visually based culture and language may provide a deaf child with certain cognitive affordances that promote resiliency and optimization in the development of visual engagement, other EFs, and sociocognitive skills such as ToM. These studies raise important questions concerning the mechanisms by which culture and socialization may impact social cognitive development, and they provide clues to the neural systems mediating these changes. We discussed how deafness and the acquisition of a visually based language may promote neural adaptations that permit the more nuanced perception required for distinguishing classes of social behaviors (whether linguistic or emotional). These data can inform debates regarding the role of the hMNS in the perception of human actions and language. These studies can provide insight into how these specialized adaptations are realized, in the present case, by engendering differential hemispheric specialization and engagement of STS and temporal ventral regions. While we have often discussed the various factors and domains implicated in the developmental social cognition of deafness in isolation, we readily acknowledge the intricate interdependencies that are present.

We hope to have shown how the study of deafness and the cultural structuring of attention and visual engagement highlights the potential contributions of biological, environmental, and cultural factors, especially during childhood. This case reveals unique cultural, cognitive, and biological affordances that may enable deaf children to develop resiliency and protect them from the potential negative consequences of sensory deprivation.

References
Adolphs, R., Tranel, D., & Damasio, A. R. (1998). The human amygdala in social judgment. Nature, 393(6684), 470–474.
Allison, T., Puce, A., & McCarthy, G. (2000). Social perception from visual cues: Role of the STS region. Trends in Cognitive Science, 4(7), 267–278.
Amaral, D. G., Schumann, C. M., & Nordahl, C. W. (2008). Neuroanatomy of autism. Trends in Neurosciences, 3, 137–145.
Anderson, V. (1998). Assessing executive functions in children: Biological, psychological, and developmental considerations. Neuropsychological Rehabilitation, 8(3), 319–349.
Armstrong, B. A., Neville, H. J., Hillyard, S. A., & Mitchell, T. V. (2002). Auditory deprivation affects processing of motion, but not color. Brain Research. Cognitive Brain Research, 14(3), 422–434.
Bachevalier, J., & Loveland, K. A. (2006). The orbitofrontal-amygdala circuit and self-regulation of social-emotional behavior in autism. Neuroscience and Biobehavioral Reviews, 30(1), 97–117.
Baker-Shenk, C., & Cokely, D. (1980). American Sign Language: A teacher's resource text on grammar and culture. Washington, DC: Gallaudet University Press.
Barkley, R. A. (2001). The executive functions and self-regulation: An evolutionary neuropsychological perspective. Neuropsychology Review, 11, 1–29.
Baron-Cohen, S. (2000). Theory of mind and autism: A fifteen year review. In S. Baron-Cohen, D. Cohen, & H. Tager-Flusberg (Eds.), Understanding other minds: Perspectives from developmental cognitive neuroscience (pp. 3–20). Oxford, UK: Oxford University Press.
Baron-Cohen, S., & Belmonte, M. K. (2005). Autism: A window onto the development of the social and the analytic brain. Annual Review of Neuroscience, 28, 109–126.
Bavelier, D., Brozinsky, C., Tomann, A., Mitchell, T., Neville, H., & Liu, G. (2001). Impact of early deafness and early exposure to sign language on the cerebral organization for motion processing. Journal of Neuroscience, 21(22), 8931–8942.
Bavelier, D., Dye, M. W., & Hauser, P. C. (2006). Do deaf individuals see better? Trends in Cognitive Science, 10(11), 512–518.

Bavelier, D., & Neville, H. J. (2002). Cross-modal plasticity: Where and how? Nature Reviews Neuroscience, 3(6), 443–452.
Bavelier, D., Tomann, A., Hutton, C., Mitchell, T., Corina, D., Liu, G., et al. (2000). Visual attention to the periphery is enhanced in congenitally deaf individuals. Journal of Neuroscience, 20(17), RC93, 1–6.
Beer, J. S., & Ochsner, K. N. (2006). Social cognition: A multi level analysis. Brain Research, 1079(1), 98–105.
Bellugi, U., Lichtenberger, L., Jones, W., Lai, Z., & St. George, M. (2000). I. The neurocognitive profile of Williams syndrome: A complex pattern of strengths and weaknesses. Journal of Cognitive Neuroscience, 12(Suppl. 1), 7–29.
Best, C. T., McRoberts, G. W., & Sithole, N. M. (1988). Examination of the perceptual reorganization for speech contrasts: Zulu click discrimination by English-speaking adults and infants. Journal of Experimental Psychology: Human Perception and Performance, 14(3), 345–360.
Bettger, J., Emmorey, K., McCullough, S., & Bellugi, U. (1997). Enhanced facial discrimination: Effects of experience with American Sign Language. Journal of Deaf Studies and Deaf Education, 2(4), 223–233.
Bosworth, R. G., & Dobkins, K. R. (1999). Left-hemisphere dominance for motion processing in deaf signers. Psychological Science, 10(3), 256–262.
Bosworth, R. G., & Dobkins, K. R. (2002). The effects of spatial attention on motion processing in deaf signers, hearing signers, and hearing nonsigners. Brain and Cognition, 49(1), 152–169.
Brooks, R., & Meltzoff, A. N. (2002). The importance of eyes: How infants interpret adult looking behavior. Developmental Psychology, 38(6), 958–966.
Brooks, R., & Meltzoff, A. N. (2008). Infant gaze following and pointing predict accelerated vocabulary growth through two years of age: A longitudinal, growth curve modeling study. Journal of Child Language, 35, 207–220.
Bross, M. (1979). Residual sensory capacities of the deaf: A signal detection analysis of a visual discrimination task. Perceptual and Motor Skills, 48(1), 187–194.
Bross, M., & Sauerwein, H. (1980). Signal detection analysis of visual flicker in deaf and hearing individuals. Perceptual and Motor Skills, 51(3), 839–843.
Campbell, R. (1978). Asymmetries in interpreting and expressing a posed facial expression. Cortex, 14(3), 327–342.
Campbell, R., Woll, B., Benson, P. J., & Wallace, S. B. (1999). Categorical perception of face actions: Their role in sign language and in communicative facial displays. Quarterly Journal of Experimental Psychology, 52A(1), 67–95.
Carlson, S. M., & Meltzoff, A. N. (2008). Bilingual experience and executive functioning in young children. Developmental Science, 11(2), 282–298.
Chavajay, P., & Rogoff, B. (1999). Cultural variation in management of attention by children and their caregivers. Developmental Psychology, 35(4), 1079–1090.


Colombo, J., & Bundy, R. S. (1981). A method for the measurement of infant auditory selectivity. Infant Behavior and Development, 4, 219–223.
Corina, D. P., Bellugi, U., & Reilly, J. (1999). Neuropsychological studies of linguistic and affective facial expressions in deaf signers. Language and Speech, 42(2–3), 307–331.
Corina, D., Chiu, Y. S., Knapp, H., Greenwald, R., San Jose-Robertson, L., & Braun, A. (2007). Neural correlates of human action observation in hearing and deaf subjects. Brain Research, 1152, 111–129.
Corina, D. P., & Knapp, H. (2006). Sign language processing and the mirror neuron system. Cortex, 42(4), 529–539.
Crume, P., & Singleton, J. L. (2008). Teacher practices for promoting visual engagement of deaf children in a bilingual preschool. Paper presented at the Association of College Educators of the Deaf and Hard of Hearing, Monterey, CA.
Curtiss, S. (1977). Genie: A psycholinguistic study of a modern day wild child. New York: Academic Press.
Cutting, A., & Dunn, J. (1999). Theory of mind, emotion understanding, language and family background: Individual differences and inter-relations. Child Development, 70, 853–865.
Damasio, A. R. (1995). On some functions of the human prefrontal cortex. Annals of the New York Academy of Sciences, 769, 241–251.
Dye, M. W. G., Baril, D. E., & Bavelier, D. (2007). Which aspects of visual attention are changed by deafness? The case of the attentional network task. Neuropsychologia, 45(8), 1801–1811.
Dye, M. W. G., Hauser, P. C., & Bavelier, D. (2008). Visual attention in deaf children and adults: Implications for learning environments. In M. Marschark & P. C. Hauser (Eds.), Deaf cognition: Foundations and outcomes (pp. 250–263). New York: Oxford University Press.
Eimas, P. D. (1975). Auditory and phonetic coding of the cues for speech: Discrimination of the [r-l] distinction by young infants. Perception and Psychophysics, 18, 341–347.
Ekman, P. (1992). Facial expressions of emotion: An old controversy and new findings. Philosophical Transactions of the Royal Society of London: Series B. Biological Sciences, 335(1273), 63–69.
Emmorey, K. (2002). Language, cognition, and the brain: Insights from sign language research. Mahwah, NJ: Erlbaum.
Emmorey, K., Kosslyn, S. M., & Bellugi, U. (1993). Visual imagery and visual-spatial language: Enhanced imagery abilities in deaf and hearing ASL signers. Cognition, 46(2), 139–181.
Erting, C. J., Prezioso, C., & Hynes, M. (1990). The interactional context of deaf mother–infant communication. In V. Volterra & C. Erting (Eds.), From gesture to language in hearing and deaf children (pp. 97–106). New York: Springer-Verlag.
Eslinger, P. J., & Grattan, L. M. (1991). Perspectives on the developmental consequences of early frontal lobe damage: Introduction. Developmental Neuropsychology, 7, 257–260.


Everhart, V. S., & Marschark, M. (1997). Models, modules, and modality. In M. Marschark, P. Siple, D. Lillo-Martin, R. Campbell, & V. S. Everhart (Eds.), Relations of language and thought: The view from sign language and deaf children (pp. 172–184). New York: Oxford University Press.
Finney, E. M., & Dobkins, K. R. (2001). Visual contrast sensitivity in deaf versus hearing populations: Exploring the perceptual consequences of auditory deprivation and experience with a visual language. Cognitive Brain Research, 11(1), 171–183.
Frith, U. (2001). Mind blindness and the brain in autism. Neuron, 32(6), 969–979.
Gallese, V. (2007). Before and below theory of mind: Embodied simulation and the neural correlates of social cognition. Philosophical Transactions of the Royal Society of London: Series B. Biological Sciences, 362(1480), 659–669.
Gallese, V., Fadiga, L., Fogassi, L., & Rizzolatti, G. (1996). Action recognition in the premotor cortex. Brain, 119(Pt. 2), 593–609.
Geers, A., Moog, J., & Schick, B. (1984). Acquisition of spoken and signed English by profoundly deaf children. Journal of Speech and Hearing Disorders, 49(4), 378–388.
Gioia, G. A., Isquith, P. K., Guy, S. C., & Kenworthy, L. (2000). Behavior Rating Inventory of Executive Function (BRIEF) professional manual. Odessa, FL: Psychological Assessment Resources.
Glenn, S. M., Cunningham, C. C., & Joyce, P. F. (1981). A study of auditory preferences in nonhandicapped infants and infants with Down's syndrome. Child Development, 52(4), 1303–1307.
Grossman, T., & Johnson, M. H. (2007). The development of the social brain in human infancy. European Journal of Neuroscience, 25(4), 909–919.
Harris, R. I. (1978). The relationship of impulse control to parent hearing status, manual communication, and academic achievement in deaf children. American Annals of the Deaf, 123(1), 52–67.
Harris, M. (2000). Social interaction and early language development in deaf children. Deafness and Education International, 2(1), 1–11.
Harris, M., & Mohay, H. (1997). Learning to look in the right place: A comparison of attentional behavior in deaf children with deaf and hearing mothers. Journal of Deaf Studies and Deaf Education, 2(2), 95–103.
Hauser, P. C., Lukomski, J., & Hillman, T. (2008). Development of deaf and hard of hearing students' executive function. In M. Marschark & P. C. Hauser (Eds.), Deaf cognition: Foundations and outcomes (pp. 286–308). New York: Oxford University Press.
Hauser, P. C., Lukomski, J., & Isquith, P. (2009). Deaf college students' performance on the BRIEF-A and Connors. Manuscript in preparation.
Hauser, P. C., Wills, K., & Isquith, P. K. (2006). Hard of hearing, deafness, and being deaf. In J. E. Farmer, J. Donders, & S. Warschausky (Eds.), Treating neurodevelopmental disabilities: Clinical research and practice (pp. 119–131). New York: Guilford.
Iacoboni, M., & Mazziotta, J. C. (2007). Mirror neuron system: Basic findings and clinical applications. Annals of Neurology, 62(3), 213–218.
Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17(11), 4302–4311.
Kanwisher, N., & Yovel, G. (2006). The fusiform face area: A cortical region specialized for the perception of faces. Philosophical Transactions of the Royal Society of London: Series B. Biological Sciences, 361(1476), 2109–2128.
Karmiloff-Smith, A. (2007). Williams syndrome. Current Biology, 17(24), R1035–R1036.
Klarman, L., Krentz, U., Brinkley, J., Corina, D. P., & Kuhl, P. (2008). Deaf and hearing infants' preference for American Sign Language. Poster presented at the American Psychological Association Conference, Boston.
Knapp, H. P., Cho, H., & Corina, D. P. (2008). Perception of sign language and human actions. In M. R. de Quadros (Ed.), TISLR 9: Theoretical Issues in Sign Language Research 9. 9º Congresso Internacional de Aspectos Teóricos das Pesquisas nas Línguas de Sinais, December 6 to 9, 2006, Universidade Federal de Santa Catarina, Florianópolis, SC, Brasil. Florianópolis: Lagoa Editora.
Koenigs, M., Young, L., Adolphs, R., Tranel, D., Cushman, F., Hauser, M., et al. (2007). Damage to the prefrontal cortex increases utilitarian moral judgments. Nature, 446(7138), 908–911.
Koester, L. S. (1995). Face-to-face interactions between hearing mothers and their deaf or hearing infants. Infant Behavior and Development, 18(2), 145–153.
Koester, L. S., Brooks, L., & Traci, M. A. (2000). Tactile contact by deaf and hearing mothers during face-to-face interactions with their infants. Journal of Deaf Studies and Deaf Education, 5(2), 127–139.
Krentz, U. C., & Corina, D. P. (2008). Preference for language in early infancy: The human language bias is not speech specific. Developmental Science, 11(1), 1–9.
Ladd, P. (2003). Understanding deaf culture: In search of deafhood. Toronto, ON: Multilingual Matters.
Lane, H. (1984). When the mind hears: A history of the deaf. New York: Random House.
Liddell, S. K. (1980). American Sign Language syntax. The Hague: Mouton.
Lieberman, M. D. (2006). Social cognitive neuroscience: A review of core processes. Annual Review of Psychology, 58, 259–289.
Lillo-Martin, D. (1999). Modality effects and modularity in language acquisition: The acquisition of American Sign Language. In T. K. Bhatia & W. C. Ritchie (Eds.), Handbook of language acquisition (pp. 531–567). San Diego: Academic Press.
Loots, G., & Devise, I. (2003). The use of visual-tactile communication strategies by deaf and hearing fathers and mothers of deaf infants. Journal of Deaf Studies and Deaf Education, 8(1), 31–42.

Loots, G., Devise, I., & Jacquet, W. (2005). The impact of visual communication on the intersubjective development of early parent–child interaction with 18- to 24-month-old deaf toddlers. Journal of Deaf Studies and Deaf Education, 10(4), 357–375.
MacPherson, S. E., Phillips, L. H., & Della Sala, S. (2002). Age, executive function and social decision making: A dorsolateral prefrontal theory of cognitive aging. Psychology and Aging, 17(4), 598–609.
Marschark, M. (1993). Psychological development of deaf children. New York: Oxford University Press.
Martini, A., Mazzoli, M., & Kimberling, W. (1997). An introduction to the genetics of normal and defective hearing. Annals of the New York Academy of Sciences, 830, 361–374.
Materna, S., Dicke, P. W., & Thier, P. (2008). Dissociable roles of the superior temporal sulcus and the intraparietal sulcus in joint attention: A functional magnetic resonance imaging study. Journal of Cognitive Neuroscience, 20(1), 108–119.
McCarthy, G. (1999). Physiological studies of face processing in humans. In M. S. Gazzaniga & E. Bizzi (Eds.), The new cognitive neurosciences (pp. 393–410). Cambridge, MA: MIT Press.
McCullough, S., Emmorey, K., & Sereno, M. (2005). Neural organization for recognition of grammatical and emotional facial expressions in deaf ASL signers and hearing nonsigners. Brain Research. Cognitive Brain Research, 22(2), 193–203.
McKee, D. E. (1987). An analysis of specialized cognitive functions in deaf and hearing signers. Unpublished doctoral dissertation, University of Pittsburgh, Pittsburgh.
Meadow, K. P. (1976). Behavioral problems of deaf children. In H. S. Schlesinger & K. P. Meadow (Eds.), Studies of family interaction, language acquisition, and deafness (pp. 257–293). San Francisco: Final Report, Office of Maternal and Child Health, Bureau of Community Health Services.
Meltzoff, A. N., & Brooks, R. (2007). Intersubjectivity before language: Three windows on preverbal sharing. In S. Bråten (Ed.), On being moved: From mirror neurons to empathy (pp. 149–174). Philadelphia: John Benjamins.
Meltzoff, A. N., & Brooks, R. (2009). Social cognition and language: The role of gaze following in early word learning. In J. Colombo, P. McCardle, & L. Freund (Eds.), Infant pathways to language: Methods, models, and research directions (pp. 169–194). New York: Psychology Press.
Meristo, M., Falkman, K. W., Hjelmquist, E., Tedoldi, M., Surian, L., & Siegal, M. (2007). Language access and theory of mind reasoning: Evidence from deaf children in bilingual and oralist environments. Developmental Psychology, 43(5), 1156–1169.
Meyer-Lindenberg, A., Hariri, A. R., Munoz, K. E., Mervis, C. B., Mattay, V. S., Morris, C. A., et al. (2005). Neural correlates of genetically abnormal social cognition in Williams syndrome. Nature Neuroscience, 8(8), 991–993.


Miall, R. C. (2003). Connecting mirror neurons and forward models. Neuroreport, 14(17), 2135–2137.
Mills, C. B. (1985). Perception of visual temporal patterns by deaf and hearing adults. Bulletin of the Psychonomic Society, 23, 483–486.
Mitchell, T. V. (1996). How audition shapes visual attention. Unpublished doctoral dissertation, Indiana University, Bloomington.
Mitchell, R. E., & Karchmer, M. A. (2004). When parents are deaf versus hard of hearing: Patterns of sign use and school placement of deaf and hard-of-hearing children. Journal of Deaf Studies and Deaf Education, 9(2), 133–152.
Mitchell, T. V., & Quittner, A. L. (1996). Multimethod study of attention and behavior problems in hearing-impaired children. Journal of Clinical Child Psychology, 25(1), 83–96.
Mosconi, M. W., Mack, P. B., McCarthy, G., & Pelphrey, K. A. (2005). Taking an intentional stance on eye-gaze shifts: A functional neuroimaging study of social perception in children. Neuroimage, 27(1), 247–252.
Mundy, P. C., & Acra, C. F. (2006). Joint attention, social engagement, and the development of social competence. In P. J. Marshall & N. A. Fox (Eds.), The development of social engagement: Neurobiological perspectives (pp. 81–117). New York: Oxford University Press.
Neville, H., & Bavelier, D. (2002). Human brain plasticity: Evidence from sensory deprivation and altered language experience. Progress in Brain Research, 138, 177–188.
Neville, H. J., & Lawson, D. (1987a). Attention to central and peripheral visual space in a movement detection task: An event-related potential and behavioral study. I. Normal hearing adults. Brain Research, 405, 253–267.
Neville, H. J., & Lawson, D. (1987b). Attention to central and peripheral visual space in a movement detection task: An event-related potential and behavioral study. II. Congenitally deaf adults. Brain Research, 405, 268–283.
Neville, H. J., & Lawson, D. (1987c). Attention to central and peripheral visual space in a movement detection task. III. Separate effects of auditory deprivation and acquisition of a visual language. Brain Research, 405(2), 284–294.
Neville, H. J., Schmidt, A., & Kutas, M. (1983). Altered visual-evoked potentials in congenitally deaf adults. Brain Research, 266(1), 127–132.
Newman, A. J., Bavelier, D., Corina, D., Jezzard, P., & Neville, H. J. (2002). A critical period for right hemisphere recruitment in American Sign Language processing. Nature Neuroscience, 5(1), 76–80.
Newport, E. L. (1990). Maturational constraints on language learning. Cognitive Science, 14(1), 11–28.
Newport, E. L., & Meier, R. P. (1985). The acquisition of American Sign Language. In D. I. Slobin (Ed.), The cross-linguistic study of language acquisition (pp. 881–938). Hillsdale, NJ: Erlbaum.
Niemeyer, W., & Starlinger, I. (1981). Do the blind hear better? Investigations on auditory processing in congenital or early acquired blindness. II. Central functions. Audiology, 20(6), 510–515.


Oberg, E. (2007). Assessing executive functioning in children with a hearing loss. Unpublished master's thesis, Rochester Institute of Technology, Rochester, NY.
Oberman, L. M., & Ramachandran, V. S. (2007). The simulating social mind: The role of the mirror neuron system and simulation in the social and communicative deficits of autism spectrum disorders. Psychological Bulletin, 133(2), 310–327.
Padden, C. A., & Humphries, T. L. (1988). Deaf in America: Voices from a culture. Cambridge, MA: Harvard University Press.
Padden, C., & Humphries, T. L. (2005). Inside deaf culture. Cambridge, MA: Harvard University Press.
Parasnis, I., & Samar, V. J. (1985). Parafoveal attention in congenitally deaf and hearing young adults. Brain and Cognition, 4(3), 313–327.
Parasnis, I., Samar, V. J., & Berent, G. P. (2003). Deaf adults without attention deficit hyperactivity disorder display reduced perceptual sensitivity and elevated impulsivity on the Test of Variables of Attention (T.O.V.A.). Journal of Speech, Language, and Hearing Research, 46(5), 1166–1183.
Pelphrey, K. A., Singerman, J. D., Allison, T., & McCarthy, G. (2003). Brain activation evoked by perception of gaze shifts: The influence of context. Neuropsychologia, 41(2), 156–170.
Peterson, C., & Siegal, M. (1995). Deafness, conversation and theory of mind. Journal of Child Psychology & Psychiatry, 36, 459–474.
Peterson, C. C., Wellman, H. M., & Liu, D. (2005). Steps in theory-of-mind development for children with deafness or autism. Child Development, 76(2), 502–517.
Petit, C. (1996). Genes responsible for human hereditary deafness: Symphony of a thousand. Nature Genetics, 14(4), 385–391.
Poizner, H., & Tallal, P. (1987). Temporal processing in deaf signers. Brain and Language, 30(1), 52–62.
Posner, M. I., & Rothbart, M. K. (2000). Developing mechanisms of self-regulation. Development and Psychopathology, 12(3), 427–441.
Posner, M. I., & Rothbart, M. K. (2007). Research on attention networks as a model for the integration of psychological science. Annual Review of Psychology, 58, 1–23.
Proksch, J., & Bavelier, D. (2002). Changes in the spatial distribution of visual attention after early deafness. Journal of Cognitive Neuroscience, 14(5), 687–701.
Puce, A., Allison, T., Bentin, S., Gore, J. C., & McCarthy, G. (1998). Temporal cortex activation in humans viewing eye and mouth movements. Journal of Neuroscience, 18(6), 2188–2199.
Quittner, A. L., Glueckauf, R. L., & Jackson, D. N. (1990). Chronic parenting stress: Moderating versus mediating effects of social support. Journal of Personality and Social Psychology, 59(6), 1266–1278.
Quittner, A. L., Smith, L. B., Osberger, M. J., Mitchell, T. V., & Katz, D. B. (1994). The impact of audition on the development of visual attention. Psychological Science, 5(6), 347–353.
Reilly, J. S., & Bellugi, U. (1996). Competition on the face: Affect and language in ASL motherese. Journal of Child Language, 23(1), 219–239.
Reilly, J. S., McIntire, M., & Bellugi, U. (1990a). The acquisition of conditionals in American Sign Language: Grammaticized facial expressions. Applied Psycholinguistics, 11(4), 369–392.
Reilly, J. S., McIntire, M., & Bellugi, U. (1990b). Faces: The relationship between language and affect. In V. Volterra & C. J. Erting (Eds.), From gesture to language in hearing and deaf children (pp. 128–141). New York: Springer-Verlag.
Reivich, R. S., & Rothrock, I. A. (1972). Behavior problems of deaf children and adolescents: A factor-analytic study. Journal of Speech, Language, and Hearing Research, 15(1), 93–104.
Ries, P. W. (1994). Prevalence and characteristics of persons with hearing trouble: United States, 1990–91. Vital and Health Statistics, 10(188), 1–75.
Rizzolatti, G., Fadiga, L., Gallese, V., & Fogassi, L. (1996). Premotor cortex and the recognition of motor actions. Brain Research. Cognitive Brain Research, 3(2), 131–141.
Rogoff, B., Mistry, J., Goncu, A., & Mosier, C. (1993). Guided participation in cultural activity by toddlers and caregivers. Monographs of the Society for Research in Child Development, 58(8, Serial No. 236).
Roth, R. M., Isquith, P. K., & Gioia, G. A. (2005). Behavior Rating Inventory of Executive Function-Adult Version. Lutz, FL: Psychological Assessment Resources.
Sandler, W., & Lillo-Martin, D. (2006). Sign language and linguistic universals. New York: Cambridge University Press.
Saxe, R. (2006). Uniquely human social cognition. Current Opinion in Neurobiology, 16(2), 235–239.
Saxe, R., Carey, S., & Kanwisher, N. (2004). Understanding other minds: Linking developmental psychology and functional neuroimaging. Annual Review of Psychology, 55, 87–124.
Schick, B., de Villiers, P., de Villiers, J., & Hoffmeister, R. (2007). Language and theory of mind: A study of deaf children. Child Development, 78(2), 376–396.
Schick, B., & Hoffmeister, R. (2001). ASL skills in deaf children of deaf parents and of hearing parents. Paper presented at the Society for Research in Child Development International Conference, Minneapolis, MN.
Schick, B., & Moeller, M. P. (1992). What is learnable in manually coded English sign systems? Applied Psycholinguistics, 13(3), 313–340.
Smith, L. B., Quittner, A. L., Osberger, M. J., & Miyamoto, R. (1998). Audition and visual attention: The developmental trajectory in deaf and hearing populations. Developmental Psychology, 34(5), 840–850.
Stokoe, W. (2005). Sign language structure: An outline of the visual communication systems of the American Deaf. Journal of Deaf Studies and Deaf Education, 10, 3–37. (Original work published 1960)

Strong, M., & Prinz, P. M. (1997). A study of the relationship between American Sign Language and English literacy. Journal of Deaf Studies and Deaf Education, 2, 37–46.
Vaughan van Hecke, A., Mundy, P., Acra, C. F., Block, J., Delgado, C., Parlade, M., et al. (2007). Infant joint attention, temperament, and social competence in preschool children. Child Development, 78(1), 53–69.
de Villiers, J. (2005). Can language acquisition give children a point of view? In J. Astington & J. Baird (Eds.), Why language matters for theory of mind (pp. 186–219). Oxford, UK: Oxford University Press.
de Villiers, J., & de Villiers, P. (2000). Linguistic determinism and the understanding of false beliefs. In P. Mitchell & K. Riggs (Eds.), Children's reasoning and the mind (pp. 189–226). Hove, UK: Psychology Press.
de Villiers, P. (2003). Language of the deaf: Acquisition of English. In R. D. Kent (Ed.), The MIT encyclopedia of communicative disorders (pp. 336–338). Cambridge, MA: MIT Press.
de Villiers, P. (2005). The role of language in theory of mind development: What deaf children tell us. In J. Astington & J. Baird (Eds.), Why language matters for theory of mind (pp. 266–297). Oxford, UK: Oxford University Press.


Vouloumanos, A., & Werker, J. F. (2004). Tuned to the signal: The privileged status of speech for young infants. Developmental Science, 7(3), 270–276.
Waxman, R. P., & Spencer, P. E. (1997). What mothers do to support infant visual attention: Sensitivities to age and hearing status. Journal of Deaf Studies and Deaf Education, 2(2), 104–114.
Weaver, K. E., & Stevens, A. A. (2006). Auditory gap detection in the early blind. Hearing Research, 211(1–2), 1–6.
Werker, J. F., & Tees, R. C. (1984). Cross-language speech perception: Evidence for perceptual reorganization during the first year of life. Infant Behavior and Development, 7, 49–63.
Williams, J. H., Whiten, A., Suddendorf, T., & Perrett, D. I. (2001). Imitation, mirror neurons and autism. Neuroscience and Biobehavioral Reviews, 25(4), 287–295.
Woodward, J. (1976). Historical bases of American Sign Language. In P. Siple (Ed.), Understanding language through sign language research (pp. 333–348). New York: Academic Press.
Zukow-Goldring, P., & Arbib, M. (2007). Affordances, effectiveness, and assisted imitation: Caregivers and the directing of attention. Neurocomputing, 70(13–15), 2181–2193.
