Brief CV: I studied Biology in Parma – Italy – and started my scientific career in Rizzolatti's lab with an experimental master thesis
entitled: "The role of the somatosensory cortices during the observation of the tactile stimulation of others"
which contributed to a publication in Neuron. I then moved to Groningen – The Netherlands – where I completed
my PhD (cum laude) with a thesis entitled "Action in the brain: Shared neural circuits for action observation and execution"
(Dutch summary | English summary | Italian summary | Complete Thesis) under the supervision of
Christian Keysers at the BCN-NeuroImaging Center.
Humans are social animals. While it is of cardinal importance for us to understand what other people do and feel,
we still lack an understanding of how our brain achieves this function. Research on social perception has so far
focused on cognitive processes. I investigate an alternative account: ‘shared circuits’. Shared circuits are brain areas
involved when we ourselves do an action, feel an emotion or sense a sensation AND when we observe or listen to someone
else perform the same actions, express the same emotions and experience the same sensations. Such shared circuits reflect
an automatic transformation of what other people do and feel into the neural representation of our own actions, emotions
and sensations. Using fMRI we investigate the role of brain regions involved in the execution of actions during the perception
of the actions of others; the role of the somatosensory cortices during the perception of other people being touched;
and the role of emotional structures (e.g. the amygdala and insula) during the observation of emotional stimuli. The emphasis
of the work is to investigate the idea that a single mechanism – shared circuits – could give valuable insights into all three domains.
Gazzola V, van der Worp H, Mulder T, Wicker B, Rizzolatti G, Keysers C (2007)
Aplasics born without hands mirror the goal of hand actions with their feet. Current Biology 17(14):1235–40
The premotor and parietal mirror neuron system (MNS) is thought to contribute to the understanding of observed
actions by mapping them onto "corresponding" motor programs of the observer, but how would the MNS respond
to the observation of hand actions if the observer never had hands?
Would it not show changes of blood-oxygen-level-dependent (BOLD) signal, because the observer lacks motor programs
that can resonate, or would it show significant changes because the observer has motor programs for the foot or mouth
with corresponding goals? We scanned two aplasic subjects, born without arms or hands, while they
watched hand actions and compared their brain activity with that of 16 control subjects. All subjects additionally
executed actions with different effectors (feet, mouth, and, for controls, hands). The BOLD signal of aplasic individuals
within the putative MNS was augmented when they watched hand actions, demonstrating the brain's capacity to mirror actions
that deviate from the embodiment of the observer by recruiting voxels involved in the execution of actions that achieve
corresponding goals by different effectors. This sheds light on the functional organization of the MNS and predominance
of goals in imitation.
Gazzola V, Rizzolatti G, Wicker B, Keysers C (2007) The anthropomorphic brain: the mirror neuron system responds to
human and robotic actions. Neuroimage 35(4):1674–84.
In humans and monkeys the mirror neuron system transforms seen actions into our inner representation of these actions.
Here we asked whether this system also responds when we see an industrial robot perform similar actions. We localised the motor
areas involved in the execution of hand actions, then presented the same subjects with blocks of movies of humans or robots performing a
variety of actions. The mirror system was activated strongly by the sight of both human and robotic actions, with no significant
differences between these two agents. Finally, we observed that seeing a robot perform a single action repeatedly within a block
failed to activate the mirror system. This latter finding suggests that previous studies may have failed to find mirror
activations to robotic actions because of the repetitiveness of the presented actions. Our findings suggest that the mirror
neuron system could contribute to the understanding of a wider range of actions than previously assumed, and that the goal of
an action might be more important for mirror activations than the way in which the action is performed.
Gazzola V, Aziz-Zadeh L, Keysers C (2006) Empathy and the somatotopic auditory mirror system in humans. Current Biology 16(18):1824–9
How do we understand the actions of other individuals if we can only hear them? Auditory mirror neurons respond both while monkeys perform hand or mouth actions
and while they listen to sounds of similar actions. This system might be critical for auditory action understanding and language evolution. Preliminary evidence suggests
that a similar system may exist in humans. Using fMRI, we searched for brain areas that respond both during motor execution and when individuals listened to the sound of
an action made by the same effector. We show that a left hemispheric temporo–parieto–premotor circuit is activated in both cases, providing evidence for a
human auditory mirror system. In the left premotor cortex, a somatotopic pattern of activation was also observed: A dorsal cluster was more involved during listening and
execution of hand actions, and a ventral cluster was more involved during listening and execution of mouth actions. Most of this system appears to be multimodal because
it also responds to the sight of similar actions. Finally, individuals who scored higher on an empathy scale activated this system more strongly, adding evidence for a possible
link between the motor mirror system and empathy.
Keysers C and Gazzola V (2006) Towards a unifying neural theory of social cognition. Prog Brain Res. 156:379–401
Humans can effortlessly understand a lot of what is going on in other people's minds. Understanding the neural basis of this capacity has proven quite difficult.
Since the discovery of mirror neurons, a number of successful experiments have approached the question of how we understand the actions of others from the
perspective of sharing their actions. Recently we have demonstrated that a similar logic may apply to understanding the emotions and sensations of others.
Here, we therefore review evidence that a single mechanism (shared circuits) applies to actions, sensations and emotions: witnessing the actions, sensations
and emotions of other individuals activates brain areas normally involved in performing the same actions and feeling the same sensations and emotions. We propose
that these circuits, shared between the first-person (I do, I feel) and third-person perspective (seeing her do, seeing her feel), translate the vision and sound of what other people
do and feel into the language of the observer's own actions and feelings. This translation could help us understand the actions and feelings of others by providing intuitive
insights into their inner life. We propose a mechanism for the development of shared circuits on the basis of Hebbian learning, and underline that shared circuits could
integrate with more cognitive functions during social cognition.
Keysers C., Wicker B., Gazzola V., Fogassi L., and Gallese V. (2004) A touching sight: SII/PV activation during the observation and experience of touch. Neuron 42:335–346.
Watching the movie scene in which a tarantula crawls on James Bond's chest can make us literally shiver – as if the spider crawled on our own chest. What neural mechanisms are
responsible for this "tactile empathy"? The observation of the actions of others activates the premotor cortex normally involved in the execution of the same actions. If a similar
mechanism applies to the sight of touch, movies depicting touch should automatically activate the somatosensory cortex of the observer. Here we found using fMRI that the secondary
but not the primary somatosensory cortex is activated both when the participants were touched and when they observed someone or something else getting touched by objects.
The neural mechanisms enabling our own sensation of touch may therefore also be a window onto our understanding of the touch of others.