ERP: N170 and its Role in Visual Facial
Processing

Within the field of perception research, one of the most popular topics is the human face and how we perceive it. Given how vital face perception is to everyday life, it makes sense that it is so widely researched. We have all heard the expression "the eyes are the window to the soul." Perhaps this is true because a person's eyes can tell you something about what they are thinking or feeling, but even more information can be gathered from the rest of the face. Within a few seconds of gazing at another person's face, you can identify who they are, and often their sex, direction of attention, and even their mood (Dering, Martin, Moro, Pegna, & Thierry, 2011). There have been numerous studies of facial perception, but this paper will look more deeply at those dealing with the event-related potential (ERP) known as the N170, first described in the research of Shlomo Bentin in 1996. The N170 is a neural marker of facial recognition, measured using an electroencephalograph (EEG). It is a negative-going waveform that peaks around 170 ms after a facial stimulus is presented. On the EEG, the response is most prominent over the occipito-temporal region of the brain, more so in the right hemisphere than the left, showing some lateralization (Botzel, Schulze, & Stodieck, 1995). According to Cao, Li, Gaspar, and Jiang (2013), the N170 may also serve as an indicator of basic-level expertise in pattern perception, showing selectivity not only to faces but to other visual patterns as well. Other topics this paper will touch on include the N170 response to emotionally expressive stimuli, task difficulty, and faces versus non-face stimuli.
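To make the measurement described above concrete, the sketch below averages stimulus-locked EEG epochs from a single occipito-temporal channel and finds the most negative deflection in a window around 170 ms. This is a minimal illustration on synthetic data; the sampling rate, search window, and signal values are assumptions chosen for the example, not figures taken from the studies discussed here.

```python
import numpy as np

def n170_peak(epochs, sfreq, window=(0.13, 0.20)):
    """Average stimulus-locked EEG epochs and locate the most negative
    deflection in an assumed N170 window (~130-200 ms post-stimulus).

    epochs : array of shape (n_trials, n_samples) for one channel,
             with time zero at stimulus onset.
    sfreq  : sampling rate in Hz.
    Returns (latency_in_seconds, amplitude) of the negative peak.
    """
    erp = epochs.mean(axis=0)          # grand-average ERP across trials
    start = int(window[0] * sfreq)
    stop = int(window[1] * sfreq)
    seg = erp[start:stop]
    idx = np.argmin(seg)               # most negative sample in the window
    return (start + idx) / sfreq, seg[idx]

# Synthetic demo: 50 trials of noise plus a negative deflection at ~170 ms.
sfreq = 500                                         # Hz (assumed)
t = np.arange(0, 0.4, 1 / sfreq)                    # 0-400 ms epoch
rng = np.random.default_rng(0)
template = -5e-6 * np.exp(-((t - 0.170) ** 2) / (2 * 0.01 ** 2))
epochs = template + rng.normal(0, 2e-6, (50, t.size))
latency, amp = n170_peak(epochs, sfreq)
```

Averaging across trials is what makes the component visible at all: the stimulus-locked deflection survives the average while trial-to-trial noise cancels out, which is why ERP studies like those reviewed here record many trials per condition.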

In a study performed by Botzel, Schulze, and Stodieck in 1995, before the N170 had been formally described, subjects were shown black-and-white photographs of different stimuli, including human faces, flowers, and leaves (Botzel et al., 1995). Subjects were monitored by EEG, and what is now known as the N170 showed a negative peak at 175 ms for the human face stimuli but not for the other stimuli (Botzel et al., 1995). The study also found that the occipital and lateral temporal areas of the brain were active during the recorded ERPs (Botzel et al., 1995). This gave a baseline idea of where the N170 component is generated and suggested that specific brain structures are responsible for recognizing faces, but not objects in general.

There is a wealth of interesting research involving the N170 component; one topic is how the N170 responds to emotional facial expressions. Some research suggests that the N170 is sensitive not only to basic facial structure but also to the emotional expression on the face (Blau, Maurer, Tottenham, & McCandliss, 2007). In a study by Blau et al., subjects were monitored by EEG while their responses to both fearful and neutral faces were recorded. The study drew on the finding that the N170 is sensitive to images of eyes isolated from the rest of the face, suggesting that it responds to more than just the visual structure of a face (Blau et al., 2007). The results showed that the N170 response to facial stimuli was modulated by the emotional expression of the face, which may indicate the time course of emotional processing (Blau et al., 2007).

A study by Dering, Martin, Moro, Pegna, and Thierry in 2011 also supports the finding that the N170 responds to more than just facial structure (Dering et al., 2011). Their study used different visual stimuli to measure face processing. They performed three experiments, all of which presented varying levels of visual stimuli containing both faces and cars. The results of these three experiments indicate that the N170 component reflects perceptual processes beyond categorization, such as the familiarity, identity, ethnicity, and emotional expression of a particular face (Dering et al., 2011). These findings are important because they give weight to the idea that the N170 does more than process the basic visual structure of a face.

In response to the work of Dering et al. (2011), Eimer wrote a commentary dealing with the face-selectivity of the N170 component (Eimer, 2011). In this commentary, Eimer states that images cropped to remove internal or external features may produce larger N170 amplitudes, which he refers to as a "cropping effect" (Eimer, 2011). Notably, the same effect occurs for facial stimuli as for other stimuli, in this case cars and butterflies, and Eimer suggests that this may lead to misinterpreted evidence of the face-sensitivity of the N170. Eimer stresses that the elementary visual features of facial stimuli must be carefully controlled, since such control is essential in order to link the N170 to face-sensitive brain mechanisms (Eimer, 2011).

A 2013 study by Cao, Li, Gaspar, and Jiang found that both faces and words can evoke the N170 component (Cao et al., 2013). In their experiment, the link between faces and words was tested using cross-category adaptation of the N170. A significant asymmetry was found between the N170 adaptation produced by faces and that produced by words. This was the first instance of EEG results showing that neural discrimination of faces involves neural selectivity for words (Cao et al., 2013). The results suggest that the N170 response to faces serves as a neural indicator of varying representations of familiar visual stimuli (Cao et al., 2013).

According to the research, in terms of visual perception, the N170 is part of an expertise system that begins at the basic level of object recognition. It is not limited strictly to the surface features of the face, as shown by Cao et al. (2013) and Dering et al. (2011). The N170 shows differential activation and activation timing for emotional recognition, and there is evidence that it reflects more abstract aspects of facial recognition, including gender, ethnicity, and emotional expression. There are still many opportunities to expand on this topic. Further research might examine the overlap with cognitive science within an expertise-theory model. It would be interesting to see whether the activation of the N170 differs in infants, who most likely have not developed as much perceptual expertise as adults. Further, the differentiation between non-face objects in varying situations might be examined as well, to see what else the N170 component might be responsible for processing.