
Automatic Mimicry Reactions

as Related to Differences in Emotional Empathy


Marianne Sonnby-Borgström, Department of Psychology, Lund University, Sweden

ABSTRACT
The hypotheses of this investigation were based on conceiving of automatic mimicking as a component involved in emotional empathy. Differences between subjects high and low in emotional empathy were investigated. The parameters compared were facial mimicry reactions, as represented by electromyographic (EMG) activity when subjects were exposed to pictures of angry or happy faces, and the degree of correspondence between subjects' facial EMG reactions and their self-reported feelings. The comparisons were made at different stimulus exposure times in order to elicit reactions at different levels of information processing. The high-empathy subjects were found to have a higher degree of mimicking behavior than the low-empathy subjects, a difference that emerged at short exposure times (17-40 milliseconds) representing automatic reactions. Already at these short exposure times (17-40 ms), the low-empathy subjects tended to show inverse zygomaticus muscle reactions, smiling when exposed to an angry face. The high-empathy group was characterized by a significantly higher correspondence between facial expressions and self-reported feelings. No differences were found between the high- and low-empathy subjects in their verbally reported feelings when presented with a happy or an angry face. Thus, the differences between the groups in emotional empathy appeared to be related to differences in automatic somatic reactions to facial stimuli rather than to differences in their conscious interpretation of the emotional situation.

Key words: empathy, emotional contagion, facial expression, automatic reactions, microgenesis, unconscious processing.

INTRODUCTION
Facial mimicry and communication of emotion
The experimental and theoretical literatures have failed to agree on a single definition of empathy. Levenson, who represents an experimental approach to the concept, refers in review articles to at least three different qualities that have been ascribed to empathy: (a) knowing what another person is feeling (empathic accuracy), (b) feeling what another person is feeling, and (c) responding compassionately to another person's distress (Levenson, 1996; Levenson & Ruef, 1992). The present study focuses on the second aspect of empathy, termed here emotional empathy. Within a psychoanalytical framework, Basch (1983) conceives of emotional contagion as being an important component of empathy. The idea of somatic mimicry is central to Basch's notion of empathy. In line with the view of facial expressions proposed by Tomkins (1962, 1991), Basch assumes facial expressions to be the efferent part of a biologically anchored system of basic affects and to be a part of a preprogrammed or prewired form of communicative competence. Basch writes, "A given affective expression of one member of a particular species tends to recruit a similar response in other members of that species.... This is done through the promotion of an unconscious, automatic, and in adults not necessarily obvious, imitation of the sender's bodily state and facial expression by the receiver. This then generates in the receiver the autonomic response associated with that bodily state and facial expression, which is to say that the receiver experiences an affect identical with that of the sender" (Basch, 1983, p. 108). Although the emotional somatic reaction is supposed to be the starting point of the empathetic process, empathetic knowledge of the other person is supposed to involve components of cognitive interpretation as well (Basch, 1976; Hoffman, 1984; Holm, 1985; Eisenberg & Fabes, 1990). In accordance with Basch's notion of the process leading to emotional empathy, experimentally oriented psychologists assert the hypothesis of emotional contagion. This term is defined as "the tendency to mimic the verbal, physiological and/or behavioral aspects of another person's emotional experience, and thus to express/experience the same emotions oneself" (Hsee, Hatfield, Carlsson, & Chemtob, 1990, p. 328). Evidence of facial mimicry has been reported in a number of studies (Dimberg, 1982; Dimberg, 1989; Dimberg & Karlsson, 1997; Kappas, Hess, & Banse, 1992; Vaughan & Lanzetta, 1980; Zajonc, Adelmann, Murphy, & Niedenthal, 1987).

The view that this somatic mimicry is a crucial part of emotional contagion is shared by various investigators studying facial expressions and emotions experimentally (Bavelas, Black, Lemery, & Mullett, 1986; Chartrand & Bargh, 1999; Hatfield, Cacioppo, & Rapson, 1994; Hoffman, 1984; Hsee, Hatfield, & Chemtob, 1992; Laird et al., 1994; Lundqvist, 1995). Several studies confirm a relation between facial expressions, autonomic activity and the experience of emotion, but the mechanisms behind this correlation are still under debate (Porges, 1991). Some researchers (Burgoon, Buller, & Woodall, 1996; Ekman, Levenson, & Friesen, 1983; Hess, Kappas, McHugo, Lanzetta, & Kleck, 1992; Izard, 1971; Lanzetta & Kleck, 1976; Tomkins, 1984) suggest that facial muscle activity provides proprioceptive information (afferent facial feedback) and that the facial expression can influence the internal emotional experience. Combining facial mimicry with the afferent facial feedback hypothesis has resulted in the interpersonal facial feedback hypothesis (IFFH), which may possibly help explain the mechanisms behind emotional contagion (Capella, 1993). Despite the various studies cited above which support the idea of a connection between internal emotional states and facial expressions, problems connected with this simple hypothesis have been indicated. A major controversy concerns the respective degree of influence of internal affective states versus conscious cognitive and contextual factors on facial expressions (Hess, Philippot, & Blairy, 1998; Izard, 1990; Matsumoto, 1987; Hess, Banse, & Kappas, 1995). In studying facial displays it is thus important to also consider conscious cognitive factors and individual differences in emotional regulation (Ginsburg, 1997; Hess et al., 1995; Hess et al., 1998; McHugo & Smith, 1996; Tassinary & Cacioppo, 1992; Vrana & Rollock, 1998). One solution to this controversy could be to adopt a process-oriented perspective, studying facial expressions at different levels of information processing so as to compare reactions at different levels of conscious cognitive control.

Automatic and controlled levels of processing


In line with the idea of there being qualitatively different stages of information processing, Leventhal (1984) formulated the perceptual motor model of emotion, which implies the existence of three different hierarchically organized levels of emotional response. A similar proposal of different stages in the information processing of emotional stimuli has been formulated by Öhman (1993). Öhman's model, however, is primarily concerned with evolutionarily relevant stimuli that evoke fear and anxiety. The first and most basic level of the affect program is assumed to be inherited and to be biologically prepared. The response at this level is considered to be either physiological or automatic motor in character and evoked automatically by specific stimuli, without previous learning. The second level is conceived of as involving a separate memory system (first memory system), one which is evoked automatically. It constitutes a schematic, prototypical level of emotional processing that is regarded as representing a conditioned emotional response. The third stage in Leventhal's model, finally, at a secondary memory level, is a system that makes a conscious, reflective evaluation of the emotional situation. It involves controlled or regulated reactions rather than spontaneous emotional reactions (Leventhal, 1984). The existence of a preconscious or automatic level in perceptual/cognitive processes is supported both by psychological experimental research and by recent neurological work demonstrating that affective reactions may be evoked before the conscious identification of the stimulus (Brown, 1988; Dimberg & Öhman, 1996; Dixon, 1981; LeDoux, 1996; Pally, 1998; Tassinary, Scott, Wolford, Napps, & Lanzetta, 1984; Zajonc, 1980; Öhman & Dimberg, 1978). At later levels of processing, subcortical emotional activation is assumed to be modulated by neocortical structures. The controversy concerning the relative degree of influence which internal spontaneous affects versus cognitive and contextual factors have on facial expressions is of interest here. In previous research on facial expressions, little attention has been directed at the time dimension and at different levels of processing. Thus, a process-oriented design was considered to be fruitful in this context.

BASIC ASSUMPTIONS AND AIMS


The design selected, which was inspired by percept-genetic research and methodology (Kragh & Smith, 1970), aimed at distinguishing facial reactions at different levels of information processing. The basic assumption of the theory on which the method was based is that, in the course of a percept-genesis, the more objective and conscious world around us develops as growing out of a subjective and subconscious personal core (Smith, 1991). Percept-genetic methodology, in turn, is based on the theory of microgenesis. In terms of this theory, the perceptual act is a process that evolves through a series of qualitatively different stages, which unfold over time, from microseconds to seconds (Brown, 1985; Brown, 1988). In the present study, different levels of consciousness in information processing were induced by successively prolonged exposure times of facial stimuli, starting with very short exposure times (17 ms) assumed to elicit automatic reactions at a preattentive level and continuing on to longer exposure times representing conscious information processing and more controlled reactions. The major aim of the present study was to examine how facial mimicry behavior in face-to-face interaction situations is related to individual differences in emotional empathy at different levels of information processing. Automatic mimicry was expected to start already at very short exposure times involving automatic or preattentive processing (Dimberg, Thunberg, & Elmehed, 2000). Such automatic reactions at short exposure times are assumed here to be linked to emotional empathy, a higher level of emotional empathy being linked with stronger automatic mimicry reactions. A secondary aim of the study was to investigate the correspondence between facial muscle reactions and verbally reported feelings and to relate the degree of correspondence to individual differences in emotional empathy. Subjects high in emotional empathy were expected to show a higher degree of correspondence between muscle activity and reported feelings than subjects low in emotional empathy. A third aim was to investigate differences between high- and low-empathy subjects in self-reported feelings when exposed to angry as well as to happy faces.

METHOD
Participants
Twenty-two women and twenty-one men, students from different departments at Lund University, participated in the experiment on a volunteer basis. The median age was 23 years (range 19-37).

Materials
Pictures of facial expressions taken from Ekman and Friesen's Unmasking the Face (Ekman & Friesen, 1975) were used as stimuli representing the sender's side in a face-to-face interaction situation. Digitized and saved as grey-scale picture files, the pictures were exposed on a computer monitor. Four faces, two of males and two of females, showing either an angry or a happy expression, were selected. Pictures of the same person were used both for the happy and the angry expressions. A picture of a vase served as the neutral stimulus. A nonfigurative grey-scale masking picture was presented for a duration of 50 ms immediately after presentation of a target picture to ensure that preattentive processing took place (Esteves & Öhman, 1993). It was shown to the subject prior to the start of the experiment so as to facilitate its being processed in a controlled way during the experiment.

Procedure
All subjects were exposed to one angry face, one neutral stimulus and one happy face. The neutral picture was always in the second position. In order to compensate for position effects, the exposure sequence was balanced so that half the subjects looked at the angry face first (50% at a male and 50% at a female picture) and half at the happy face first (50% at a male and 50% at a female picture). Thus, the design was balanced with regard both to the facial expression and to the gender of the stimulus face. The pictures displaying a facial expression and the picture of neutral content were each shown to the subject at 14 different exposure times, prolonged successively from 17 milliseconds to 6 seconds (17 ms, 25 ms, 30 ms, 35 ms, 40 ms, 45 ms, 50 ms, 75 ms, 100 ms, 150 ms, 200 ms, 500 ms, 1000 ms, and 6000 ms). For technical reasons, 17 ms was the shortest presentation time possible on the computer monitor. Together, the shortest exposure time and the masking picture were expected to ensure preattentive processing. Processing during an exposure of 6000 ms is certainly of a controlled type. Each stimulus was exposed 6 times at each exposure time (called a set of 6 stimulus exposures) so as to increase the accuracy of the measurements. The stimulus interval between these six exposures was 500 ms. In earlier experiments a delay of about 300 ms from onset of the stimulus to the facial muscle reaction has been observed (Dimberg, 1997a).
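For readers who wish to reproduce the presentation logic, the following is a minimal sketch (in Python) of the exposure schedule described above. The parameter values (14 exposure times, 6 repetitions per set, 500 ms inter-stimulus interval, 50 ms backward mask) are taken from the text; the function and variable names are illustrative and are not taken from the experiment software, and millisecond-accurate timing would of course require dedicated stimulus-presentation tools.

    # Sketch of the exposure schedule described above (illustrative only).
    EXPOSURE_TIMES_MS = [17, 25, 30, 35, 40, 45, 50, 75, 100, 150, 200, 500, 1000, 6000]
    REPEATS_PER_SET = 6        # each stimulus shown 6 times per exposure time (one "set")
    INTER_STIMULUS_MS = 500    # interval between the six exposures within a set
    MASK_DURATION_MS = 50      # backward mask shown immediately after each target

    def build_schedule(stimulus: str) -> list:
        """Return the ordered list of presentation events for one stimulus picture."""
        events = []
        for exposure in EXPOSURE_TIMES_MS:
            for repeat in range(REPEATS_PER_SET):
                events.append({
                    "stimulus": stimulus,
                    "exposure_ms": exposure,
                    "mask_ms": MASK_DURATION_MS,
                    "isi_ms": INTER_STIMULUS_MS,
                    "repeat": repeat + 1,
                })
        return events

    # Example: 14 exposure times x 6 repetitions = 84 presentations per picture.
    print(len(build_schedule("angry_face")))   # -> 84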

Measures and instruments


EMG reactions

Electromyography (EMG) was used to register facial reactions. Informing the subjects that sweat gland activity in the face was being measured masked the main purpose of the experiment, that of facial EMG registration. The reason for using EMG recordings rather than, for example, the Facial Action Coding System (Ekman & Friesen, 1978) as the dependent measure of facial muscle reactions was that the reactions were expected to be weak and hardly discernible by observation. Reactions of this type can only be registered by use of EMG (Tassinary & Cacioppo, 1992). Positive emotions (smiling reactions) were indicated by registrations of electric activity in zygomaticus major, and negative emotions (frowning reactions) were indicated by electric activity in corrugator supercilii (Hjortsjö, 1970; Dimberg, 1982; Tassinary & Cacioppo, 2000). Bipolar electrodes attached to the left side of the face, with an inter-electrode distance of about 1.5 cm, were employed. These were placed in accordance with the instructions in Guidelines for Human Electromyographic Research (Fridlund & Cacioppo, 1986). The sampling rate selected was 100 Hz, which led to a sub-sampling of the signal. The pre-sampling filter was set to the frequency range of 100 to 4000 Hz. Shielded Ag/AgCl miniature surface electrodes (Biopac, EL 208 S) filled with Biopac electrode gel were used to measure EMG activity. The subjects' skin was cleaned with alcohol before the electrodes were applied. The electrodes were connected to Biopac (EMG 100A) amplifiers; the EMG signals were digitized with a Biopac MP 100 A system and stored using special software for bioelectric data handling (AcqKnowledge).

Self-reported feelings and identification of facial expression

Subjects were instructed to write a short description of what they had seen after each set of 6 stimulus exposures. This made it possible to distinguish a preattentive level of processing, defined as those exposures preceding the exposure time at which the subject was able to recognize the facial expression. The subject was also instructed to estimate his/her feeling (self-reported feelings) after each set of 6 stimulus exposures, using a scale containing six alternative descriptions of the feelings: negative, slightly negative, no feelings, both positive and negative feelings, slightly positive, and positive.

Questionnaires

Following the experiment involving exposure of facial expressions, the subjects were given questionnaires to complete: the Questionnaire Measure of Emotional Empathy (QMEE) and Spielberger's State-Trait Anxiety Inventory (STAI). This testing was carried out after the completion of the experiment in order to minimize the risk of behavior in the experiment being influenced by the filling out of these questionnaires. Normative data on the QMEE test are described by Choplan et al. (Choplan, McCain, Carbonell, & Hagen, 1985). The QMEE scale provides a measure of emotional empathy and is not designed to measure cognitive aspects of empathy. Normative data on the STAI are presented in Spielberger's manual (Spielberger, 1983). The STAI was used to control for effects of individual differences in anxiety. It had been shown earlier that the individual level of anxiety or induced fear can affect facial expressions (Dimberg, 1997b; McHugo & Smith, 1996).

Methods of data reduction and statistical analysis


Mimicking at different levels of processing

The strength of the muscle reactions to a given stimulus at a particular exposure time was calculated as the mean amplitude of the signal from the onset of the first exposure to the end of the sixth exposure (a set of six stimulus exposures, see Procedure above). The AcqKnowledge program was used to calculate the standard deviation of the signal during the time period selected. This parameter corresponds to the power of the signal (root-mean-square voltage, rms) and is a measure of the strength of the EMG activity (Fridlund & Cacioppo, 1986). As a result of these calculations, each subject was assigned 14 mean values for corrugator activity and 14 mean values for zygomaticus activity, both for exposure to the happy face and for exposure to the angry face. Thus, for each of the 14 different exposure times, 4 EMG-activity means (2 Muscles x 2 Faces) were obtained for each participant. Data for the neutral stimulus were not used in these calculations (see Methodological limitations below). To simplify and focus the calculations and the interpretation of the results, the 14 exposure times were grouped into four categories, or information-processing levels, termed the preattentive (subjective threshold), the automatic (17-30/40 ms), the medium (35/45-75 ms), and the controlled (100-1000 ms) level. Data were analyzed in repeated-measures ANOVAs which included all the subjects (Faces x Muscles x Emotional empathy), with Faces (happy and angry) and Muscles (zygomaticus and corrugator) serving as within-group factors and Emotional empathy (high and low) as the between-group factor. These analyses were performed at each level of processing. Thus, the reaction for each individual when exposed to the happy face at the preattentive level was compared with the reaction when exposed to the angry face presented at the preattentive level, and so on. Two-way interactions and simple effects were also analyzed in repeated-measures ANOVAs. A significant interaction found in an ANOVA (Faces x Muscles), including either all participants or the empathy groups separately, if in the expected direction, was interpreted as support for a mimicking reaction. The expected directions were an increase in activity of the corrugator muscle (frowning) upon exposure to the angry face as compared to the happy face, and an increase in activity of the zygomaticus muscle (smiling) upon exposure to the happy face as compared with exposure to the angry face.
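The data reduction described above can be summarized in code. The sketch below (Python with NumPy/SciPy) is not the original AcqKnowledge/ANOVA pipeline; it only illustrates the logic: the EMG strength for a set of six exposures is taken as the standard deviation (rms) of the signal, the exposure times are collapsed into processing levels, and the per-subject Faces x Muscles interaction contrast (a "mimicry score") is then compared between the empathy groups with an ordinary t-test as a stand-in for the repeated-measures ANOVA used in the paper. The level boundaries follow the text, although the slight difference in cutoffs between the happy and angry faces (30 vs 40 ms, 35 vs 45 ms) is ignored here for simplicity, and all numerical values shown are hypothetical.

    import numpy as np
    from scipy import stats

    def emg_strength(signal: np.ndarray) -> float:
        """Strength of EMG activity over one set of six exposures: the standard
        deviation of the signal, corresponding to its rms voltage."""
        return float(np.std(signal))

    # Grouping of the exposure times (ms) into processing levels, following the text
    # (happy/angry cutoff differences ignored in this simplified sketch).
    LEVELS = {
        "automatic":  [17, 25, 30],
        "medium":     [35, 40, 45, 50, 75],
        "controlled": [100, 150, 200, 500, 1000],
    }

    def level_mean(means_by_exposure: dict, level: str) -> float:
        """Average the per-exposure-time EMG means belonging to one processing level."""
        return float(np.mean([means_by_exposure[t] for t in LEVELS[level]]))

    def mimicry_score(zyg_happy, zyg_angry, corr_happy, corr_angry):
        """Per-subject Faces x Muscles interaction contrast at one level: positive
        values = zygomaticus higher for happy and corrugator higher for angry,
        i.e. the expected mimicry pattern."""
        return (zyg_happy - zyg_angry) + (corr_angry - corr_happy)

    # Group comparison (illustrative stand-in for the between-group part of the ANOVA).
    scores_high = np.array([1.2, 0.8, 0.5, 1.5])    # hypothetical mimicry scores
    scores_low = np.array([-0.4, 0.1, -0.9, 0.2])   # hypothetical mimicry scores
    t, p = stats.ttest_ind(scores_high, scores_low)
    print(f"t = {t:.2f}, p = {p:.3f}")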

Correspondence between self-reported feelings and muscle activity

After each set of 6 exposures of a given stimulus (see Procedure above), subjects were instructed to report their feelings (see Self-reported feelings and identification of facial expression above). The reported feelings were coded into three categories ranging from 1 to 3. Negative and slightly negative feelings were coded as 1, neutral (no feelings) and both positive and negative feelings were coded as 2, and slightly positive and positive feelings were coded as 3. Thus, the ratings involved three values indicating different steps of experienced feelings. For each of these 3 categories, two mean values for the level of muscle activity (zygomaticus and corrugator) were calculated. This calculation was made independently of the stimulus and of the exposure time. Accordingly, each subject was assigned 3 different values for mean corrugator activity, one for each emotional level, and similarly 3 different values for mean zygomaticus activity. The interaction between Self-reported feelings, Muscles and Emotional empathy, as well as two-way interactions and simple effects, were analyzed in repeated-measures ANOVAs. The interaction between Self-reported feelings and Muscles was also analyzed separately in each empathy group.
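A minimal sketch of this coding and averaging step is given below (again in Python, with illustrative data structures and hypothetical values rather than the actual data files): the six scale alternatives are mapped onto the three analysis categories, and mean zygomaticus and corrugator activity are computed per category for one subject, pooled over stimuli and exposure times.

    from collections import defaultdict
    import numpy as np

    # Mapping of the six self-report alternatives onto the three analysis categories.
    FEELING_CODE = {
        "negative": 1, "slightly negative": 1,
        "no feelings": 2, "both positive and negative": 2,
        "slightly positive": 3, "positive": 3,
    }

    def mean_activity_per_feeling(reports):
        """reports: iterable of (feeling_label, zygomaticus_rms, corrugator_rms),
        one entry per set of six exposures, pooled over stimuli and exposure times.
        Returns {feeling_code: (mean zygomaticus, mean corrugator)}."""
        pooled = defaultdict(lambda: ([], []))
        for label, zyg, corr in reports:
            code = FEELING_CODE[label]
            pooled[code][0].append(zyg)
            pooled[code][1].append(corr)
        return {code: (float(np.mean(z)), float(np.mean(c)))
                for code, (z, c) in pooled.items()}

    # Hypothetical example for one subject:
    example = [("negative", 18.0, 24.0), ("slightly positive", 21.0, 17.0),
               ("no feelings", 19.0, 19.0), ("positive", 23.0, 16.0)]
    print(mean_activity_per_feeling(example))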

RESULTS
Questionnaires
Forty-two persons completed the questionnaires. One subject was unable to complete them due to difficulties in understanding Swedish. The mean QMEE value was 51.9 points (SD = 23). This is high compared with results obtained for North American norm groups (33 points) (Choplan et al., 1985). The norm group was a randomly selected sample, whereas the sample in this experiment was comprised of students, mainly students of the behavioral sciences. Subjects were divided into two groups: a low-empathy group and a high-empathy group that represented the remainder of the subjects. Fifteen subjects scoring 46 or below on the QMEE scale were included in the low-empathy group and 27 subjects scoring higher than 46 in the high-empathy group. The mean score for the low-empathy group was 29.3 points (SD = 17.5), and the mean score for the high-empathy group 64.5 (SD = 14.7). The cutoff point between the low- and high-empathy groups was placed at a lower level than the mean value for the sample because the group's mean was high compared with the norm group; about 1/3 of the participants were thus classified as low-empathy subjects.

The mean score for the STAI (S-Anxiety) was 31.2 (SD = 7.8) and that for the STAI (T-Anxiety) was 37.3 (SD = 8.4). The S-anxiety means for the North American college norm groups are 36.5 (SD = 10.0) for males and 38.8 (SD = 12.0) for females, the corresponding T-anxiety means being 38.3 (SD = 9.2) for males and 40.4 (SD = 10.2) for females (Spielberger, 1983).

Mimicking at different levels of processing


Preattentive level

The method of operationalising the preattentive level used in the present design was based on the subject's verbal report when exposed to the stimulus, the exposure time being coded as preattentive if the subject was unable to recognize the facial expression. This sort of threshold, termed the subjective threshold, is usually higher than the objective threshold, which is based on detection or discrimination guessing (Eysenck & Keane, 1995). Thirteen subjects identified the happy face already at the first exposure time (17 ms) and could thus not be included in the analysis of the preattentive level. The mean muscle activity for both muscles at the preattentive level was calculated for both the angry and the happy face for the 30 subjects included in the analysis. No significant interaction effects were found at this information-processing level in a repeated-measures ANOVA (Faces x Muscles x Emotional empathy) using Faces (happy and angry) and Muscles (zygomaticus and corrugator) as within-group factors and Emotional empathy (high and low) as a between-group factor. No significant mimicking reactions were found in a two-way repeated-measures ANOVA (Faces x Muscles) including all subjects or in ANOVAs including either the high- or the low-empathy group.

Automatic level

An automatic level of processing needs to be distinguished from processing at a subliminal or preattentive level. Automatic reactions are traditionally defined as processes that consume no attentional capacity, are under the control of stimuli rather than of intention, and occur outside awareness (Eysenck & Keane, 1995). Thus, the second processing and reaction level was called the automatic level and could be considered a first memory or classical conditioning level (Leventhal, 1984). For the happy face the first three exposure times (17-30 ms) were selected to represent the automatic level, and for the angry face the first five exposures (17-40 ms) were selected. The cutoff points were the exposure times at which 50% of the subjects had identified the facial expressions, that is, the median exposure time for identification of the stimulus. The difference between the two stimuli here was due to subjects generally identifying the happy face more rapidly than the angry one. Thus, this level, termed here automatic, includes both preattentive/subliminal automatic processing and automatic processing at a short but supraliminal level. A repeated-measures ANOVA (Faces x Muscles x Emotional empathy), all subjects included, with Faces (2 levels) and Muscles (2 levels) as within-group factors and Emotional empathy (two levels: high and low) as a between-group factor was performed. The result supported an interaction between Faces, Muscles and Emotional empathy, F(1, 40) = 4.88, p < 0.05 (cf. Fig. 1). The figure shows differences in mean muscle activity between exposures to happy and angry faces for high- and low-empathy subjects at the automatic level.

Fig. 1. Automatic level (17-30/40 ms): Differences in mean muscle activity between exposure to happy and angry faces for high-empathy subjects and for low-empathy subjects. Zygomaticus activity = positive emotions, corrugator activity = negative emotions. Positive values indicate the expected direction (mimicry reactions).

A repeated-measures ANCOVA (automatic exposure level) was carried out with covariates that might possibly be responsible for the interaction between Faces, Muscles and Emotional empathy. Sex as a covariate in an ANCOVA (Faces x Muscles x Emotional empathy x Sex) resulted in F(1, 39) = 5.99, p < 0.05. Using Trait anxiety as a covariate resulted in F(1, 39) = 4.98, p < 0.05, and State anxiety as a covariate resulted in F(1, 39) = 6.55, p < 0.05. The last covariate was the Order of Exposure, indicating whether the subject was exposed to the angry or the happy face first. This ANCOVA resulted in F(1, 39) = 4.16, p = 0.05. Thus, none of these covariates could explain the interaction between Emotional empathy, Muscles and Faces. An ANOVA (Faces x Emotional empathy) for the zygomaticus muscle reaction reached significance, F(1, 40) = 4.67, p < 0.05, while the corresponding ANOVA for the corrugator muscle reaction did not yield significance, F(1, 40) = 1.85, p < 0.18. The low-empathy subjects showed a tendency to smile when exposed to the angry face, whereas the high-empathy subjects, in line with expectations, diminished their zygomaticus activity (smiling reaction) when exposed to the angry face compared to the happy face. A repeated-measures ANOVA (Faces x Muscles), all subjects included, did not reach significance at this level (17-30/40 ms). An ANOVA (Faces x Muscles) which included only subjects in the high-empathy group reached borderline significance, F(1, 26) = 3.76, p = 0.06. The high-empathy subjects showed a tendency to smile at happy faces and to frown at angry ones, i.e., a tendency toward mimicking behavior. Results of a corresponding ANOVA (Faces x Muscles) for the low-empathy group did not support mimicking behavior for that group.

Medium level

The third response level was represented by mean values of EMG activity during exposure times of medium length (35/45-75 ms). This level of processing was termed the medium level. The cutoff point (75 ms) for this level was selected as the time at which all subjects except three had correctly identified the facial expression. Three subjects required extremely long exposure times for identifying the stimulus. An ANOVA (Faces x Muscles x Emotional empathy), all subjects included, supported an interaction between Muscles, Faces and Emotional empathy, F(1, 40) = 10.52, p < 0.01 (cf. Fig. 2). The figure shows differences in mean muscle activity between exposures to happy and angry faces for high- and low-empathy subjects at the medium level. Results of ANCOVAs analogous to those for the automatic level (Sex, Anxiety and Order of Exposure as covariates) provided no explanation for the interaction (Muscles x Faces x Emotional empathy). An ANOVA (Faces x Emotional empathy), in which only the zygomaticus muscle was included, reached significance, F(1, 40) = 6.48, p = 0.01, whereas the corresponding ANOVA (Faces x Emotional empathy) for the corrugator muscle reaction reached borderline significance, F(1, 40) = 3.49, p < 0.07. The low-empathy subjects showed a tendency to smile when exposed to the angry face, while the high-empathy subjects diminished their zygomaticus activity (smiling reaction) when exposed to the angry face. The high-empathy group showed increased corrugator activity (frowning) when exposed to the angry face, while the low-empathy group diminished their corrugator activity when exposed to the angry face.

Figure 2. Medium level (35/45-75 ms): Differences in mean muscle activity between exposure to happy and angry faces for high-empathy subjects and for low-empathy subjects. Zygomaticus activity = positive emotions, corrugator activity = negative emotions. Positive values indicate the expected direction (mimicry reactions).

The results of an ANOVA (Faces x Muscles), all subjects included, did not reach significance at this level. An ANOVA (Faces x Muscles) in which the high-empathy subjects were included reached significance, F(1, 26) = 5.70, p < 0.05. The high-empathy subjects smiled when exposed to happy faces and frowned when exposed to angry faces, thus showing a mimicking behavior.

Controlled level

The last level, termed the controlled level of processing, included exposure times from 100 ms to 1000 ms. It can be considered as involving subjects functioning at a second memory level (Leventhal, 1984). The exposure time of 6000 ms was not included in this last processing level since it was found that subjects often expressed boredom and lost concentration during such a long exposure. An ANOVA (Faces x Muscles x Emotional empathy) for the entire group at this exposure level (100-1000 ms) resulted in F(1, 40) = 0.34, p = 0.57, and did not support an interaction between the variables Faces, Muscles and Emotional empathy (cf. Fig. 3). The figure shows differences in mean muscle activity between exposures to happy and angry faces for high- and low-empathy subjects at the controlled level. Neither the entire group nor the high- or low-empathy groups showed any mimicking behavior at the controlled level of processing. As the results in Fig. 3 suggest, the high-empathy subjects displayed a tendency to smile when exposed to angry faces in the same way as subjects in the low-empathy group.


Fig. 3. Controlled level (100-1000 ms): Differences in mean muscle activity between exposure to happy and angry faces for high-empathy subjects and for low-empathy subjects. Zygomaticus activity = positive emotions, corrugator activity = negative emotions. Positive values indicate the expected direction (mimicry reactions).

Relationship between self-reported feelings and facial stimuli


The means of the self-reported feelings at different levels of responding when exposed to happy and to angry faces were calculated for each individual. A non-parametric Wilcoxon signed-ranks test, including all participants, was performed to compare the self-reported feelings when exposed to angry and happy faces. The result was highly significant (Z = 5.27, p < 0.001). Non-parametric tests (Mann-Whitney U-tests) were performed to test differences between the high- and low-empathy groups in self-reported feelings when exposed to happy faces and also when exposed to angry faces. No significant differences between the high- and low-empathy groups were found in either case.


Correspondence between self-reported feelings and muscle activity


A repeated-measures ANOVA (Self-reported feelings x Muscles x Emotional empathy), all subjects included, with Self-reported feelings (3 levels) and Muscles (2 levels) as within-group factors and Emotional empathy (2 levels) as a between-group factor, was performed. The results, F(2, 35) = 4.37, p < 0.05 (cf. Figs. 4a and 4b), indicated an interaction between Self-reported feelings, Muscles and Emotional empathy. Thirty-eight individuals of the sample of 42 were included in this analysis, since four subjects only used two levels (neutral and positive) in reporting their feelings. In further analyzing differences between the high- and low-empathy groups regarding the correspondence between Self-reported feelings and EMG activity, a repeated-measures ANOVA (Self-reported feelings x Emotional empathy), involving zygomaticus activity only, was found to be significant, F(2, 35) = 3.92, p < 0.05. Subjects in the low-empathy group smiled more when they reported more negative feelings, whereas subjects in the high-empathy group showed a tendency to decrease zygomaticus activity (smile less) when reporting more negative feelings. In a repeated-measures ANOVA (Self-reported feelings x Emotional empathy) in which only corrugator activity was included, no significant interaction effects were found between corrugator activity and reported feelings. Thus, the differences between the empathy groups in the linkage between reported feelings and EMG activity appear to be mainly a result of differences in zygomaticus activity (smiling reactions).


Figure 4 a. High-empathy group: Mean zygomaticus and mean corrugator activity at different self-reported feelings (3 categories), all stimuli and levels of exposures included.

Figure 4 b. Low-empathy group: Mean zygomaticus and mean corrugator activity at different self-reported feelings (3 categories), all stimuli and levels of exposures included.


The correspondence between Self-reported feelings and Muscle activity for the whole group (except 4 subjects) was analyzed in a repeated-measures ANOVA (Self-reported feelings x Muscles) in which Self-reported feelings (3 levels) and Muscles (2 levels) were used as within-group factors. The result of this ANOVA, including all subjects (except 4), reached borderline significance, F(2, 36) = 2.95, p = 0.07. A linkage between Self-reported feelings and Muscle reactions in the expected direction was found for the high-empathy subjects in an ANOVA (Self-reported feelings x Muscles) including only the high-empathy subjects, F(2, 24) = 7.84, p < 0.01 (cf. Fig. 4a). An ANOVA of the same type for the low-empathy subjects did not reach significance (p = 0.72) (cf. Fig. 4b).

DISCUSSION
Mimicking at different levels of processing
Preattentive level

At the preattentive level, no interaction between Faces, Muscles and Emotional empathy was found, and no mimicking reactions were discernible for the entire group or for the high-empathy subjects. Thus, no support for the hypothesis of mimicking starting already at the preattentive level was obtained. The limited number of subjects involved in the analysis at this level made it difficult to investigate mimicking adequately at the preattentive level. Empirical support for the existence of preattentive mimicking behavior was, in fact, recently obtained by Dimberg et al. (2000).

Automatic and medium levels

The automatic level (as well as the preattentive level) can be considered as corresponding to the second level in Leventhal's (1984) and Öhman's (1993) models of different stages in the processing of emotional information, and can be assumed to operate at a first memory level. At both the automatic and the medium level, significant interaction effects were found between Faces, Muscles and Emotional empathy. This interaction effect was due mainly to the differences in the zygomaticus reactions between high- and low-empathy subjects. The fact that the high-empathy group showed a borderline significant mimicking reaction (p = 0.06) at the automatic level (17-30/40 ms) and a significant mimicking reaction at the medium level (cf. Figs. 1 and 2) supported the hypothesis that automatic mimicry is an early, automatic element involved in emotional empathy.

It was demonstrated that in the low-empathy subjects inverted reaction tendencies of the zygomaticus muscle started already at short exposure times (cf. Figures 1 and 2), these subjects reacting by automatic smiling when exposed to an angry face. Could these unexpected reactions be regarded as automatic reactions based on implicit memory systems, which may serve the function of repressing negative affective reactions? As discussed above, several studies have indicated that afferent feedback from facial movements can modulate emotional experiences and may serve to enhance or inhibit emotions (Burgoon et al., 1996; Ekman et al., 1983; Hess et al., 1992; Lanzetta & Kleck, 1976; Levenson, 1996; McIntosh, 1996; Zajonc, 1985). A question that naturally follows is why such inverted zygomaticus reactions have developed despite human beings appearing to be biologically prepared for emotional communication via facial expressions (Buck, 1999; Ekman, Friesen, & Ellsworth, 1972; Field, Woodson, & Greenberg, 1982; Izard, 1994; Meltzoff & Moore, 1977; Sackett, 1966). One answer to this may be that social interactions modify expressions that are biologically prepared. Such modifications of facial expressions and emotional reactions are thought to be learned early in life, so that by adulthood the expressions modified in this way occur automatically, without conscious thought (Matsumoto & Lee, 1991). Emotions are, according to Stern (1985), the primary medium and the primary subject of communication in early infancy. Infants are supposed to access the meaning of an emotional signal directly by emotional contagion (Hatfield et al., 1994; Stern, 1985). In modern attachment theory it is assumed that a parent's frequent lack of empathetic affective reactions to certain affective states of the child tends to result in avoidance and conflict in the child with regard to the affective states involved (Ainsworth & Bowlby, 1991; Brothers, 1989; Stern, 1985). In these terms, the low-empathy subjects' counter-empathic smiling reactions may be explained as being unconscious/automatic emotional reactions of avoidance developed early in childhood and stored in implicit memory systems (Ainsworth & Bowlby, 1991; Brothers, 1989; Stern, 1985).

Controlled level

At the controlled level of processing, subjects were presumably able to interpret the stimulus and the situation in a controlled way. This is a level of processing that apparently corresponds to Leventhal's or Öhman's third level of information processing, one involving use of a second memory system (explicit memory), which allows individuals to make a conscious, reflective evaluation of the emotional situation (Leventhal, 1984; Öhman, 1993).


No interaction effect between Faces, Muscles and Emotional empathy was found at this level of processing. The high-empathy subjects showed the same tendency to smile when exposed to an angry face as the low-empathy subjects (cf. Fig. 3). Could the high-empathy subjects' increased zygomaticus activity when exposed to an angry face at this processing level be considered an intrapersonal coping reaction aimed at regulating the negative feelings automatically evoked earlier? Such affective regulation might take place both at an unconscious, automatic level, where it could represent defense mechanisms, and at a more conscious level, where it could represent affective regulation or emotional coping reactions. How the increased zygomaticus activity (smiling reactions) upon exposure to an angry face at different levels of information processing should be interpreted is still an open question. Does it indicate a smiling reaction or simply an increase in muscle tension? What intra-individual affect-regulating or inter-individual communicative functions could the inverted zygomaticus reactions found here accomplish? These questions cannot be answered on the basis of the present results. Further investigation is needed.

Mimicking for all subjects


No significant mimicking behavior was found at any information-processing level when all subjects were included in the analyses. Mimicking as a universal behavior tendency has been found in earlier experiments (Dimberg, 1982; Dimberg, 1989; Dimberg, 1990; Dimberg & Karlsson, 1997; Dimberg et al., 2000). One tentative explanation of the differences between previous research findings and the results of the present study could be that the experimental design used here differed from those used earlier. The method of successively prolonging the exposure of the same stimulus creates different prerequisites than earlier methods, in which different facial expressions were all exposed for the same, relatively longer, period of time.

Linkage between self-reported feelings and muscle activity


A significant interaction effect between Self-reported feelings, Muscles and Emotional empathy was found. The differences between the high- and the low-empathy group regarding the correspondence between muscle reactions and self-reported feelings were expressed by differences in the zygomaticus muscle reactions. The low-empathy subjects showed smiling reactions when they reported negative feelings (cf. Fig. 4b), whereas the high-empathy group was characterized by a linkage (p < 0.01) between reported feelings and muscle activity, both for the zygomaticus and the corrugator muscle (cf. Fig. 4a). These results support the hypothesis of a higher degree of correspondence between facial muscle reactions and subjective experience in the high-empathy subjects. In terms of Basch's theory of the process which leads to an emotional empathy reaction, subjects who are high in empathetic ability should show an automatic mimicking reaction to the sender's facial expression and a close correspondence between reported feelings and muscle reactions. Thus, the results provide support for automatic mimicry reactions being involved in emotional empathy. The possibility cannot be excluded, however, that a still earlier emotional reaction than the facial muscle reaction may be involved in the correspondence between facial expressions and reported feelings. Further investigation is needed to answer the question of the mechanisms behind the connection between facial muscle activity and emotional experience in high-empathy subjects.

Relationship between self-reported feelings and facial stimuli


No differences were found between high- and low-empathy subjects in self-reported feelings, either when looking at angry faces or when looking at happy faces. Subjects' verbal reports of feelings can be considered to be a conscious cognitive interpretation of the internal emotional state. Thus, at this conscious, verbally labeled level of subjective experience, no differences between high- and low-empathy subjects were found.

Methodological limitations
The fact that the neutral stimulus was not used to obtain measurements of an emotional base-level condition might be seen as a methodological limitation. A major reason for this decision was that the position of the stimulus was seen as possibly affecting the activity of the muscles, since the neutral stimulus was always in the second of the three positions. This made it impossible to use the neutral stimulus for measuring emotional base-level activity, the experimental design controlling only for positional effects concerning the happy and the angry faces. It is a definite limitation of the present methodology that the sampling rate selected (100 Hz) was too low to meet the recommendation of a sampling rate twice that of the most rapid EMG frequencies of interest (Fridlund & Cacioppo, 1986). This means that the signal was sub-sampled, and more reliable measurement data would probably have been obtained if the recommended sampling rate had been employed.
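As a point of reference, and as standard Nyquist reasoning rather than a figure reported in the paper: for a pre-sampling band-pass of 100-4000 Hz, the recommendation implies a sampling rate of at least f_s = 2 x f_max = 2 x 4000 Hz = 8000 Hz, so the 100 Hz rate used here undersamples the band-limited signal by a factor of roughly 80.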

SUMMARY AND CONCLUSIONS


The hypotheses investigated were based on the conception of a process which leads to emotional empathy, a fundamental assumption of which is that biologically prewired automatic mimicking tendencies and emotional contagion are involved in an early, automatic part of the process. High-empathy subjects showed mimicking reactions at short exposure times (automatic and medium levels, cf. Figs. 1 and 2) and reported feelings that were reflected in their muscle reactions (cf. Fig. 4a). Thus, the results supported the hypothesis that automatic mimicry is an early, automatic element involved in emotional empathy. In contrast, the low-empathy group reacted with increased zygomaticus activity (smiling) when exposed to the angry face (cf. Figs. 1, 2, and 3) and showed a higher level of zygomaticus activity when they reported negative feelings (cf. Fig. 4b). A tentative interpretation of the low-empathy subjects' inverted smiling reactions is that these reactions, through facial feedback, may serve defensive goals in inhibiting negative feelings. The high- and low-empathy groups were not characterized by contrasts in self-reported feelings (a controlled, conscious process) when exposed to angry and happy faces or by differences in muscle reactions at the controlled level. Accordingly, individual differences in emotional empathy appeared to reflect differences in spontaneous somatic reactions based on primary memory systems rather than differences in controlled reactions to the emotional situation based on secondary memory systems. This provides support for the idea that the process involved in emotional empathy is related to automatic spontaneous reactions rather than being a product of controlled cognitive interpretation of the emotional situation. Many researchers (Ginsburg, 1997; Tassinary & Cacioppo, 1992), who suggest a multicomponent approach to the study of facial expressions, have questioned the tight linkage between facial expressions and emotions. Both the time dimension and individual differences in emotional regulation appear to be important aspects in the study of facial expressions. The use of process-oriented methods could be one way of distinguishing more spontaneous emotional reactions, in which a tight linkage between affects and facial expressions can be expected, from more controlled processes, in which conscious cognitive and contextual information can be assumed to be more strongly involved in facial expressions.


Acknowledgements
I wish to thank Prof. Ingegerd Carlsson, Prof. Jarl Risberg and Prof. Olof Rydén for their constructive criticism and comments concerning the experimental design and during the preparation of this article, Prof. Ulf Dimberg, Dr. Owe Svensson and M.Sc. Jean-Christophe Rohner for technical advice and assistance, and Dr. Martin Bäckström and Dr. Robert Goldsmith for statistical advice.


References
Ainsworth, M. D., & Bowlby, J. (1991). An ethological approach to personality development. American Psychologist, 46(4), 333-341.
Basch, M. F. (1976). Psychoanalysis and communication science. The Annals of Psychoanalysis, 31, 101-126.
Basch, M. F. (1983). Empathic understanding: A review of the concept and some theoretical considerations. Journal of the American Psychoanalytic Association, 31, 101-126.
Bavelas, J. B., Black, A., Lemery, C. R., & Mullett, J. (1986). "I show how you feel": Motor mimicry as a communicative act. Journal of Personality and Social Psychology, 50, 322-329.
Brothers, L. (1989). A biological perspective on empathy. American Journal of Psychiatry, 146, 1019.
Brown, J. W. (1985). Clinical evidence for the concept of levels of action and perception. Journal of Neurolinguistics, 1, 89-141.
Brown, J. W. (1988). Life of the mind: Selected papers. Englewood Cliffs, NJ: Prentice-Hall.
Burgoon, J. K., Buller, D. B., & Woodall, W. G. (1996). Nonverbal communication: The unspoken dialogue. New York: McGraw-Hill.
Capella, J. N. (1993). The facial feedback hypothesis in human interaction: Review and speculation. Journal of Language and Social Psychology, 12, 13-29.
Chartrand, T. L., & Bargh, J. A. (1999). The chameleon effect: The perception-behavior link and social interaction. Journal of Personality and Social Psychology, 76(6), 893-910.
Choplan, B. E., McCain, M. L., Carbonell, J. L., & Hagen, R. L. (1985). Empathy: Review of available measures. Journal of Personality and Social Psychology, 48(3), 635-653.
Darwin, C. (1872/1965). The expression of emotion in man and in animals. Chicago: University of Chicago Press.
Dimberg, U. (1982). Facial reactions to facial expressions. Psychophysiology, 19, 643-647.
Dimberg, U. (1989). Facial expressions and emotional reactions: A psychobiological analysis of human social behaviour. In H. L. Wagner (Ed.), Social psychophysiology and emotion: Theory and clinical applications (Vol. 36, pp. 132-149). London: John Wiley & Sons.
Dimberg, U. (1997a). Psychophysiological reactions to facial expressions. In U. Segerstråle & P. Molnar (Eds.), Nonverbal communication: Where nature meets culture (pp. 47-60). Mahwah, NJ: Lawrence Erlbaum Associates.
Dimberg, U. (1997b). Social fear and expressive reactions to social stimuli. Scandinavian Journal of Psychology, 38, 171-174.
Dimberg, U., & Karlsson, B. (1997). Facial reactions to different emotionally relevant stimuli. Scandinavian Journal of Psychology, 38, 297-303.
Dimberg, U., Thunberg, M., & Elmehed, K. (2000). Unconscious facial reactions to emotional facial expressions. Psychological Science, 11(1), 86-89.
Dixon, N. F. (1981). Preconscious processing. New York: John Wiley & Sons.
Ekman, P., & Friesen, W. (1978). Facial action coding system: A technique for the measurement of facial movement. Palo Alto, CA: Consulting Psychologists Press.
Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice-Hall.
Ekman, P., Levenson, R., & Friesen, W. V. (1983). Autonomic nervous system activity distinguishes among emotions. Science, 221, 1208-1210.
Esteves, F., & Öhman, A. (1993). Masking the face: Recognition of emotional facial expressions as a function of the parameters of backward masking. Scandinavian Journal of Psychology, 34, 1-18.
Eysenck, M. W., & Keane, M. T. (1995). Cognitive psychology. Hillsdale, NJ: Lawrence Erlbaum Associates.
Fridlund, A. J., & Cacioppo, J. T. (1986). Guidelines for human electromyographic research. Psychophysiology, 23(5), 567-589.
Ginsburg, G. P. (1997). Faces: An epilogue and reconceptualization. In J. A. Russell & J. M. Fernández-Dols (Eds.), The psychology of facial expression (pp. 349-382). New York: Cambridge University Press.
Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1994). Emotional contagion. New York: Cambridge University Press.
Hess, U., Banse, R., & Kappas, A. (1995). The intensity of facial expression is determined by underlying affect and the social situation. Journal of Personality and Social Psychology, 69, 280-288.
Hess, U., Kappas, A., McHugo, G. J., Lanzetta, J. T., & Kleck, R. E. (1992). The facilitative effects of facial expressions on the self-generation of emotion. International Journal of Psychophysiology, 12, 251-265.
Hess, U., Philippot, P., & Blairy, S. (1998). Facial reactions to emotional facial expressions: Affect or cognition? Cognition and Emotion, 12(4), 509-531.
Hjortsjö, C. H. (1970). Man's face and mimic language. Malmö: Nordens Boktryckeri.
Hoffman, M. L. (1984). Interaction of affect and cognition on empathy. In C. E. Izard, J. Kagan, & R. B. Zajonc (Eds.), Emotions, cognition and behavior (pp. 101-131). New York: Cambridge University Press.
Holm, U. (1985). Empati i läkar-patientrelationen, en empirisk och teoretisk analys [Empathy in the physician-patient relationship: An empirical and theoretical analysis]. Stockholm: Almqvist & Wiksell.
Hsee, C. K., Hatfield, E., Carlsson, J., & Chemtob, C. (1990). The effect of power on susceptibility to emotional contagion. Cognition and Emotion, 4, 327-340.
Hsee, C. K., Hatfield, E., & Chemtob, C. (1992). Assessments of the emotional states of others: Conscious judgements versus emotional contagion. Journal of Social and Clinical Psychology, 11, 119-128.
Izard, C. E. (1971). The face of emotion. New York: Appleton-Century-Crofts.
Izard, C. E. (1990). Facial expressions and the regulation of emotions. Journal of Personality and Social Psychology, 58, 487-498.
Kappas, A., Hess, U., & Banse, R. (1992). Skin conductance reactions to dynamic facial expressions revisited: Empathic responding or information processing? Psychophysiology, 29, 42.
Kragh, U., & Smith, G. J. W. (1970). Percept-genetic analysis. Lund, Sweden: Gleerup.
Laird, J. D. (1974). Self-attribution of emotion: The effects of expressive behavior on the quality of emotional experience. Journal of Personality and Social Psychology, 37, 475-486.
Laird, J. D., Alibozak, T., Davainis, D., Deignan, K., Fontanella, K., Hong, J., Brett, L., & Pacheco, C. (1994). Individual differences in the effects of spontaneous mimicry on emotional contagion. Motivation and Emotion, 18, 231-245.
Lanzetta, J. T., & Kleck, R. E. (1976). Effects of nonverbal dissimulation on emotional experience and autonomic arousal. Journal of Personality and Social Psychology, 33, 354-370.
LeDoux, J. (1996). The emotional brain: The mysterious underpinnings of emotional life. New York: Simon & Schuster.
Levenson, R. W. (1996). Biological substrates of empathy and facial modulation of emotion: Two facets of the scientific legacy of John Lanzetta. Motivation and Emotion, 20(3), 185-204.
Levenson, R. W., & Ruef, A. M. (1992). Empathy: A physiological substrate. Journal of Personality and Social Psychology, 63(2), 234-246.
Leventhal, H. (1984). A perceptual motor theory of emotion. In Advances in experimental social psychology (Vol. 17, pp. 117-182). Madison, WI: Academic Press.
Lundqvist, L.-O. (1995). Facial EMG reactions to facial expressions: A case of emotional contagion? Scandinavian Journal of Psychology, 36, 130-141.
Matsumoto, D. (1987). The role of the facial response in the experience of emotion: More methodological problems and a meta-analysis. Journal of Personality and Social Psychology, 52, 769-774.
Matsumoto, D., & Lee, M. (1991). Consciousness, volition and the neuropsychology of facial expression and emotion. Consciousness and Cognition, 2, 237-254.
McHugo, G. J., & Smith, C. A. (1996). The power of faces: A review of John T. Lanzetta's research on facial expression and emotion. Motivation and Emotion, 20(2), 85-119.
McIntosh, D. N. (1996). Facial feedback hypotheses: Evidence, implications, and directions. Motivation and Emotion, 20(2), 121-147.
Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psychoanalysis, 79, 349-362.
Porges, S. W. (1991). Vagal tone: An autonomic mediator of affect. In J. Garber & K. A. Dodge (Eds.), The development of emotion regulation and dysregulation (pp. 111-127). Cambridge: Cambridge University Press.
Smith, G. J. W. (1991). Percept-genesis: A frame of reference for neuropsychological research. In R. E. Hanlon (Ed.), Cognitive microgenesis: A neuropsychological perspective. New York: Springer-Verlag.
Spielberger, C. D. (1983). Manual for the State-Trait Anxiety Inventory: Self-evaluation questionnaire. Palo Alto, CA: Consulting Psychologists Press.
Stern, D. N. (1985). The interpersonal world of the infant: A view from psychoanalysis and developmental psychology. New York: Basic Books.
Tassinary, L. G., & Cacioppo, J. T. (1992). Unobservable facial actions and emotion. Psychological Science, 3(1), 28-33.
Tassinary, L. G., & Cacioppo, J. T. (2000). The skeletomotor system: Surface electromyography. In J. T. Cacioppo, L. G. Tassinary, & G. G. Berntson (Eds.), Handbook of psychophysiology (pp. 163-199). Cambridge: Cambridge University Press.
Tassinary, L. G., Scott, P. O., Wolford, G., Napps, S. E., & Lanzetta, J. T. (1984). The role of awareness in affective information processing: An exploration of the Zajonc hypothesis. Bulletin of the Psychonomic Society, 22(6), 489-492.
Tomkins, S. (1962). Affect, imagery and consciousness. Volume I: The positive affects. New York: Springer.
Tomkins, S. (1984). Affect theory. In K. Scherer & P. Ekman (Eds.), Approaches to emotion (pp. 353-400). Hillsdale, NJ: Erlbaum.
Tomkins, S. (1991). Affect, imagery and consciousness. New York: Springer Publishing Company.
Vaughan, K. B., & Lanzetta, J. T. (1980). Vicarious instigation and conditioning of facial expressive and autonomic responses to a model's expressive display of pain. Journal of Personality and Social Psychology, 38, 909-923.
Vrana, S. R., & Rollock, D. (1998). Physiological response to a minimal social encounter: Effects of gender, ethnicity and social context. Psychophysiology, 35, 462-469.
Zajonc, R. B. (1980). Feeling and thinking: Preferences need no inferences. American Psychologist, 35, 151-175.
Zajonc, R. B. (1985). Emotion and facial efference: A theory reclaimed. Science, 228, 15-21.
Zajonc, R. B., Adelmann, K. A., Murphy, S. T., & Niedenthal, P. M. (1987). Convergence in the physical appearance of spouses. Motivation and Emotion, 11, 335-346.
Öhman, A. (1993). Fear and anxiety as emotional phenomena: Clinical phenomenology, evolutionary perspectives and information processing mechanisms. In M. Lewis & J. M. Haviland (Eds.), Handbook of emotions (pp. 511-536). New York: The Guilford Press.
Öhman, A., & Dimberg, U. (1978). Facial expression as conditioned stimuli for electrodermal responses: A case of "preparedness"? Journal of Personality and Social Psychology, 36, 1251-1258.

