The Neurobiology of Semantic Memory
The centrality of semantic memory in human behavior
Human brains acquire and use concepts with such apparent ease that the neurobiology of this complex process seems almost to have been taken for granted. Although philosophers have puzzled for centuries over the nature of concepts [1], semantic memory (see Glossary) became a topic of formal study in cognitive science only relatively recently [2]. This history is remarkable, given that semantic memory is one of our most defining human traits, encompassing all the declarative knowledge we acquire about the world. A short list of examples includes the names and physical attributes of all objects, the origin and history of objects, the names and attributes of actions, all abstract concepts and their names, knowledge of how people behave and why, opinions and beliefs, knowledge of historical events, knowledge of causes and effects, associations between concepts, categories and their bases, and on and on.
Also remarkable is the variety of everyday cognitive activities that depend on this extensive store of knowledge. A common example is the recognition and use of objects, which has been the focus of much theoretical and empirical work on semantic memory [3–7]. Recognition and use of objects, however, is a capacity shared by many non-human animals that interact with food sources, build simple structures, or use simple tools. More uniquely human is the ability to represent concepts in the form of language, which allows not only the spread of conceptual knowledge in an abstract symbolic form, but also a cognitive mechanism for the fluid and flexible manipulation, association, and combination of concepts [8, 9]. Thus humans use conceptual knowledge for much more than merely interacting with objects. All of human culture, including science, literature, social institutions, religion, and art, is constructed from conceptual knowledge. We do not reason, plan the future or remember the past without conceptual content – all of these activities depend on activation of concepts stored in semantic memory.
Scientific study of human semantic memory processes has been limited in the past both by a relatively restricted focus on object knowledge and by an experimental tradition emphasizing stimulus-driven brain activity. Human brains are occupied much of the day with reasoning, planning, and remembering. This highly conceptual activity need not be triggered by stimuli in the immediate environment – all of it can be done, and usually is, in the privacy of one's own mind. Together with recent insights gained from studies of patients with semantic memory loss, functional imaging data are rapidly converging on a new anatomical model of the brain systems involved in these processes. Given the centrality of semantic memory to human behavior and human culture, the significance of these discoveries can hardly be overstated.
In this article we propose a large-scale neural model of semantic processing that synthesizes multiple lines of empirical and theoretical work. Our core argument is that semantic memory consists of both modality-specific and supramodal representations, the latter supported by the gradual convergence of information throughout large regions of temporal and inferior parietal association cortex. These supramodal convergences support a variety of conceptual functions including object recognition, social cognition, language and the uniquely human capacity to construct mental simulations of the past and future.
Central issues in semantic processing
A major issue in the study of semantic memory concerns the nature of concept representations. Efforts in the last century to develop artificial intelligence focused on knowledge representation in the form of abstract symbols [10]. This approach led to powerful new techniques for information representation and manipulation (e.g., semantic nets, feature lists, ontologies, schemata). Recent advances in this area used machine learning techniques together with massive verbal inputs to create a highly flexible, probabilistic symbolic network that can respond to general questions in a natural language format [11]. Scientists interested in human brains, on the other hand, have long assumed that the brain represents concepts at least partly in the form of sensory and motor experiences. Nineteenth-century neurologists, for example, pictured a widely distributed `concept field' in the brain where visual, auditory, tactile, and motor `images' associated with a concept were activated in the process of word comprehension [12, 13]. A major advantage of such a theory over a purely symbolic representation is that it provides a parsimonious and biologically plausible mechanism for conceptual learning. Over the course of many similar experiences with entities from the same category, an idealized sensory or motor representation of the entity develops by generalization across unique exemplars, and reactivation or `simulation' of these modality-specific representations forms the basis of concept retrieval [14].
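As an illustration of the learning mechanism just described, in which an idealized representation forms by generalization across unique exemplars and retrieval proceeds by similarity to that representation, here is a minimal prototype-learning sketch in Python. The feature dimensions and values are invented purely for illustration and make no claim about actual neural coding.

```python
# Toy sketch of concept learning by generalization across exemplars:
# an idealized ("prototype") representation emerges by averaging the
# modality-specific features of unique exemplars, and retrieval selects
# the nearest prototype. All feature values are invented.
import numpy as np

def learn_prototype(exemplars):
    """Average feature vectors across exemplars to form an idealized representation."""
    return np.mean(exemplars, axis=0)

def retrieve(probe, prototypes):
    """Return the concept whose prototype is nearest the probe (Euclidean distance)."""
    return min(prototypes, key=lambda name: np.linalg.norm(probe - prototypes[name]))

# Hypothetical exemplars; columns = [roundness, sweetness, graspability]
dog_exemplars  = np.array([[0.3, 0.1, 0.6], [0.4, 0.2, 0.5], [0.2, 0.1, 0.7]])
pear_exemplars = np.array([[0.7, 0.8, 0.9], [0.8, 0.9, 0.8], [0.6, 0.7, 0.9]])

prototypes = {"dog": learn_prototype(dog_exemplars),
              "pear": learn_prototype(pear_exemplars)}

# A novel exemplar is categorized by similarity to the learned prototypes.
novel = np.array([0.75, 0.85, 0.85])
print(retrieve(novel, prototypes))  # pear
```

The averaging step stands in for the "generalization across unique exemplars" of the text; reactivating a prototype from a partial cue would correspond to the simulation account of concept retrieval.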
In addition to these issues concerning representation of information, questions arise about the mechanisms that control semantic information retrieval. Clearly not all knowledge associated with a concept is relevant in all contexts, thus mechanisms are needed for selecting or attending to task-relevant information [15, 16]. Some conceptual tasks also place strong demands on creativity, a term we use here to refer to flexible problem solving in the absence of strong constraining cues. Creative invention through technological innovation, art, and `brainstorming' are uniquely human endeavors that require fluent conceptual retrieval and flexible association of ideas. Even everyday conversation requires a logical but relatively unconstrained flow of ideas, in which one topic leads to another through a series of associated concepts. This type of flexible association and combining of concepts, though ubiquitous in everyday life, has largely been overlooked in functional imaging studies, which tend to focus on highly constrained retrieval tasks involving recognition of words and objects.
Evidence for modality-specific simulation in comprehension
The idea that sensory and motor experiences form the basis of conceptual knowledge has a long history in philosophy, psychology, and neuroscience [1, 3, 12, 13]. In recent years, this proposal has gained new momentum under the rubric of `embodied' or `situated' cognition, supported by numerous neuroimaging and behavioral studies. Some of the imaging studies showing modality-specific activations during language processing are summarized in Figure 1. A number of these address action concepts and show that processing action-related language activates brain regions involved in executing and planning actions. Motion, sound, color, olfactory, and gustatory concept processing have also been examined, and these studies likewise tend to show activation in or near the regions that process the corresponding perceptual modalities (see legend, Figure 1).
This figure displays sites of peak activation from 38 imaging studies that examined modality-specific knowledge processing during language comprehension tasks. Peaks were mapped to a common spatial coordinate system and then to a representative brain surface. Action knowledge peaks (red) cluster in primary and secondary sensorimotor regions in the posterior frontal and anterior parietal lobes. Motion peaks (green) cluster in posterior inferolateral temporal regions near the visual motion processing pathway. Note that motion concepts, especially when elicited by action verbs, are difficult to distinguish from action concepts. Peaks near motion processing area MT/MST in four of the studies of action language are interpreted here as reflecting motion knowledge. Auditory peaks (yellow) occur in superior temporal and temporoparietal regions adjacent to auditory association cortex. Color peaks (blue) cluster in the fusiform gyrus just anterior to color-selective regions of extrastriate visual cortex. Olfactory peaks (pink) observed in one study were in olfactory areas (prepiriform cortex and amygdala). Gustatory peaks (orange) were observed in one study in the anterior orbital frontal cortex. Emotion peaks (purple) involve primarily anterior temporal, medial and orbital prefrontal, and posterior cingulate regions. Details regarding study selection and a list of the included studies are provided in supplementary material online.
Challenges to the embodiment view have also arisen. One objection is that activations observed in imaging experiments could be epiphenomenal and not causally related to comprehension [17]. This hypothesis has been tested in patients with various forms of motor system damage. Initial results indicate a selective impairment in comprehending action verbs in patients with Parkinson's disease [18], progressive supranuclear palsy [19], apraxia [20], and motor neuron disease [21, 22]. Several studies employing transcranial magnetic stimulation to induce transient `virtual lesions' in the primary motor cortex or inferior parietal lobe provide converging results [23–28]. Thus, involvement of the motor system during action word processing contributes to comprehension and is not a mere by-product. A related argument is that the activations reflect post-comprehension imagery. In studies using imaging methods with high temporal resolution, however, activation of motor regions during action word processing appears rapidly, approximately 150–200 ms from word onset [29–32], suggesting that it is part of early semantic access rather than a result of post-comprehension processes. These converging results provide compelling evidence that sensory-motor cortices play an essential role in conceptual representation.
Although it is often overlooked in reviews of embodied cognition, emotion is as much a modality of experience as sensory and motor processing [33]. Words and concepts vary in the magnitude and specific type of emotional response they evoke, and these emotional responses are a large part of the meaning of many concepts. Purple dots in Figure 1 represent activation peaks from 14 imaging studies that examined activation as a function of the emotional content of words or phrases. There is a clear preponderance of activations in the temporal pole (13 studies) and ventromedial prefrontal cortex (10 studies), both of which play a central role in emotion [34, 35]. Involvement of the temporal pole in high-level representation of emotion may also explain activation in this region associated with social concepts [36, 37], which tend to have strong emotional valence.
Evidence for high level convergence zones
In addition to modality-specific simulations, we propose that the brain uses abstract, supramodal representations during conceptual tasks. One compelling argument for this view is that the human brain possesses large areas of cortex that are situated between modal sensory-motor systems and thus appear to function as information `convergence zones' [14]. These heteromodal areas include the inferior parietal cortex (angular and supramarginal gyri), large parts of the middle and inferior temporal gyri, and anterior portions of the fusiform gyrus [38]. These areas have expanded disproportionately in the human brain relative to the monkey brain, `taking over' much of the temporal lobe from the visual system [39]. Advocates of a strictly embodied theory of conceptual processing have largely ignored these brain regions, yet they occupy a substantial proportion of the posterior cortex in humans.
A second body of evidence comes from patients with damage in the inferior and lateral temporal lobe, particularly patients with semantic dementia, a syndrome characterized by progressive temporal lobe atrophy and multimodal loss of semantic memory [40, 41]. These patients are unable to retrieve names of objects, categorize objects or judge their relative similarity, identify the correct color or sound of objects, or retrieve knowledge about actions associated with objects [42–45]. Critically, the deficits do not appear to be category-specific [46] – further evidence that the semantic impairment does not involve strongly modal representations.
A third large body of evidence comes from functional imaging studies that target general semantic rather than modality-specific semantic processes. For example, many imaging experiments have contrasted words against pseudowords, related against unrelated word pairs, meaningful against nonsensical sentences, and sentences against random word strings. In another type of general semantic contrast, a semantic task (e.g., a semantic decision) is contrasted with a phonological control task (e.g., rhyme decision). What is important to understand about all of these `general' contrasts is that although they elicit differences in the degree of access to semantic information, they include no manipulation of modality-specific information. In the absence of systematic biases affecting stimulus selection, the activations resulting from these contrasts are unlikely to reflect modality-specific representations.
A quantitative meta-analysis of 120 of these studies was recently performed [47]. Studies were included only if they satisfied strict criteria for a semantic contrast. Studies were excluded if the stimuli in the contrasting conditions were not matched on orthographic or phonological properties, or if the activations could be explained by differences in attention or working memory demands. Reliability of the activation sites reported in the studies was analyzed using a volume-based technique called activation likelihood estimation [48].
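The logic of activation likelihood estimation can be conveyed with a toy one-dimensional sketch. The actual method [48] operates on three-dimensional brain volumes with empirically calibrated kernel widths and permutation-based significance testing; the coordinates, kernel amplitude, and grid below are invented for illustration only.

```python
# Illustrative 1-D sketch of activation likelihood estimation (ALE).
# Each reported peak is modeled as a Gaussian "probability" kernel;
# per-study maps are combined as a probabilistic union, and studies are
# combined the same way, so the ALE score at a location approximates the
# probability that at least one study activated it. Invented data.
import numpy as np

def kernel(x, peak, sigma=10.0, amp=0.2):
    """Gaussian kernel modeling the spatial uncertainty of a reported peak.
    The amplitude (< 1) is an arbitrary illustrative scaling standing in
    for the per-voxel probability mass of the empirical ALE kernel."""
    return amp * np.exp(-((x - peak) ** 2) / (2 * sigma ** 2))

def modeled_activation(x, peaks):
    """Per-study map: probability that at least one of the study's peaks lies at x."""
    p = np.zeros_like(x)
    for peak in peaks:
        p = 1 - (1 - p) * (1 - kernel(x, peak))
    return p

def ale(x, studies):
    """ALE score: probabilistic union of the per-study modeled activation maps."""
    score = np.zeros_like(x)
    for peaks in studies:
        score = 1 - (1 - score) * (1 - modeled_activation(x, peaks))
    return score

x = np.linspace(0, 100, 101)
# Three hypothetical studies, each a list of peak coordinates (mm, one axis):
studies = [[40, 70], [42], [41, 85]]
scores = ale(x, studies)
print(x[np.argmax(scores)])  # 41.0 (the location reported consistently across studies)
```

Isolated peaks (70 and 85 above) yield low scores, whereas the cluster near 41 mm, reported by all three studies, dominates; this is the sense in which ALE "assigns a significance value to the degree of spatial overlap" across studies.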
The results showed remarkable consistency across studies, with reliable activation throughout the left temporal and parietal heteromodal cortex (Figure 2). These locations are consistent with the location of pathological changes in semantic dementia, as well as with temporal and parietal vascular lesions causing semantic impairments [49–53]. Other consistent sites of activation included the dorsomedial prefrontal cortex (superior frontal gyrus), ventromedial prefrontal cortex, inferior frontal gyrus, and the posterior cingulate gyrus and precuneus. The results offer compelling evidence for high-level convergence zones in the inferior parietal, lateral temporal, and ventral temporal cortex. These regions are far from primary sensory and motor cortices and appear to be involved in processing general rather than modality-specific semantic information.
This figure displays brain regions reliably activated by general semantic processes, based on reported activation peaks from 120 independent functional imaging studies (p <.05 corrected for family-wise error). The analysis method assigns a significance value to the degree of spatial overlap between the reported activation coordinates in a standard volume space. The figure shows selected sagittal sections in the left hemisphere; right hemisphere activations occurred in similar locations but were less extensive. AG = angular gyrus, FG = fusiform gyrus, IFG = inferior frontal gyrus, MTG = middle temporal gyrus, PC = posterior cingulate gyrus, SFG = superior frontal gyrus, SMG = supramarginal gyrus, VMPFC = ventromedial prefrontal cortex. Green lines indicate the Y and Z axes in standard space. Adapted from [47].
Embodied abstraction in conceptual representation
Figure 3 illustrates several prominent theories that differ in the proposed level of separation between conceptual and perceptual representations. Models based on disembodied, symbolic conceptual representations [9, 10] are often criticized on the grounds that such symbols are ultimately devoid of content [54]. From an empirical standpoint, the extensive evidence for involvement of modality-specific sensory, action, and emotion systems during language comprehension is also inconsistent with such a model.
Theories of perception and cognition vary in the degree of separation they propose between these processes. Disembodied models propose a complete separation, in which conceptual processing is based entirely on amodal, symbolic representations [9, 10, 17]. Other theories propose that conceptual and perceptual representations are distinct but interact closely, so that amodal symbols can derive content from perceptual knowledge [7, 14]. In contrast to both of these theories, strong embodiment models posit that perceptual and conceptual processes are carried out by a single system [55, 56]. The neuroanatomical evidence for multiple modality-specific systems gradually converging on a common semantic network suggests, in contrast to all of these theories, a process of `embodied abstraction,' in which conceptual representation is embodied at multiple levels of abstraction from sensory, motor and affective input. The extent to which modality-specific perceptual representations are activated during semantic tasks varies with concept familiarity, demand for perceptual information and degree of contextual support (see Box 1).
At the other end of the spectrum are `strong embodiment' models in which perceptual and conceptual processes are carried out by the same (perceptual) system [55, 56]. These models are inconsistent with the evidence for modality-independent semantic networks reviewed above. Furthermore, conceptual deficits in patients with sensory-motor impairments, when present, tend to be subtle rather than catastrophic. In a recent study of aphasic patients [57], lesions in both sensory-motor and temporal regions were correlated with impairment in a picture-word matching task involving action words. This evidence is incompatible with a strong version of the embodiment account, in which sensory-motor regions are necessary and sufficient for conceptual representation.
Other theories propose that amodal representations derive their content from close interactions with modal perceptual systems [7, 14]. The purpose of amodal representations in these latter models is to bind and efficiently access information across modalities rather than to represent the information itself [58]. The need for distinct amodal representations in such a model has been sharply questioned, however, as multimodal perceptual representations could fulfill the same role [55, 56].
We suggest that the current evidence is most compatible with a view we term `embodied abstraction,' briefly sketched here (see [59, 60] for similar proposals). In this view, conceptual representation consists of multiple levels of abstraction from sensory, motor, and affective input. All levels are not automatically accessed or activated under all conditions. Rather, this access is subject to factors such as context, frequency, familiarity, and task demands. The top level contains schematic representations that are highly abstracted from detailed representations in the primary perceptual-motor systems. These representations are `fleshed out' to varying degrees by sensory-motor-affective contributions in accordance with task demands. In highly familiar contexts, the schematic representations are sufficient for adequate and rapid processing. In novel contexts or when the task requires deeper processing, sensory-motor-affective systems make a greater contribution in fleshing out the representations (Box 1).
A neuroanatomical model of semantic processing
Figure 4 outlines a neuroanatomical model of semantic memory consistent with a broad range of available data. Modality-specific representations (yellow areas in Figure 4), located near corresponding sensory, motor, and emotion networks, develop as a result of experience with entities and events in the external and internal environment. These representations code recurring spatial and temporal configurations of lower-level modal representations. Although depicted as somewhat modular, we view these systems as an interactive continuum of hierarchically ordered neural ensembles, supporting progressively more combinatorial and idealized representations. These systems correspond to Damasio's local convergence zones [14] and to Barsalou's unimodal perceptual symbol systems [55]. In addition to bottom-up input in their associated modality, they receive a range of top-down input from other modal systems and from attention. They are modal in the sense that the information they represent is an analog of (i.e., isomorphic with) their bottom-up input [55].
A model of semantic processing in the human brain is shown, based on a broad range of pathological and functional neuroimaging data. Modality-specific sensory, action, and emotion systems (yellow regions) provide experiential input to high-level temporal and inferior parietal convergence zones (red regions) that store increasingly abstract representations of entity and event knowledge. Dorsomedial and inferior prefrontal cortices (blue regions) control the goal-directed activation and selection of the information stored in temporoparietal cortices. The posterior cingulate gyrus and adjacent precuneus (green region) may function as an interface between the semantic network and the hippocampal memory system, helping to encode meaningful events into episodic memory. A similar, somewhat less extensive semantic network exists in the right hemisphere, although the functional and anatomical differences between left and right brain semantic systems are still unclear.
These modal convergence zones then converge with each other in higher-level cortices located in the inferior parietal lobe and much of the ventral and lateral temporal lobe (red areas in Figure 4). One function of these high-level convergences is to bind representations from two or more modalities, such as the sound and visual appearance of an animal, or the visual representation and action knowledge associated with a hand tool [7, 12, 14, 55]. Such supramodal representations capture similarity structures that define categories, such as the collection of attributes that place `pear' and `light bulb' in different categories despite a superficial similarity of appearance, and `pear' and `pineapple' in the same category despite very different appearances [58]. More generally, supramodal representations allow the efficient manipulation of abstract, schematic conceptual knowledge that characterizes natural language, social cognition, and other forms of highly creative thinking [59, 60].
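The pear, light bulb, and pineapple example above can be made concrete with a toy feature-vector sketch: similarity computed over visual features alone groups pear with light bulb, whereas similarity over a bound, cross-modal representation recovers the correct category structure. All feature values are invented for illustration.

```python
# Toy illustration of supramodal binding: concatenating features across
# modalities recovers category structure that any single modality misses.
# Feature values are invented.
import numpy as np

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Visual features: [round, tapered, shiny]
visual = {
    "pear":       np.array([0.6, 0.9, 0.2]),
    "light bulb": np.array([0.6, 0.9, 0.9]),
    "pineapple":  np.array([0.4, 0.1, 0.1]),
}
# Non-visual features: [edible, sweet, electrical]
nonvisual = {
    "pear":       np.array([1.0, 0.9, 0.0]),
    "light bulb": np.array([0.0, 0.0, 1.0]),
    "pineapple":  np.array([1.0, 0.8, 0.0]),
}

# Visual similarity alone groups pear with light bulb...
vis_bulb = similarity(visual["pear"], visual["light bulb"])
vis_pine = similarity(visual["pear"], visual["pineapple"])

# ...but the supramodal (concatenated) representation groups pear with pineapple.
supra = {k: np.concatenate([visual[k], nonvisual[k]]) for k in visual}
sup_bulb = similarity(supra["pear"], supra["light bulb"])
sup_pine = similarity(supra["pear"], supra["pineapple"])

print(vis_bulb > vis_pine, sup_pine > sup_bulb)  # True True
```

Concatenation is of course only a crude stand-in for neural convergence, but it captures the core claim: the similarity structure that defines a category is only available to a representation that binds information across modalities.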
These modal and supramodal convergence zones store the actual content of semantic knowledge, whereas the prefrontal regions colored blue in Figure 4 control top-down activation and selection of the content in posterior stores (Box 2). The posterior cingulate gyrus and adjacent precuneus (green area in Figure 4) consistently show semantic effects in imaging experiments and have also been implicated in a wide variety of other processes, as discussed below. Given the strong reciprocal connections this region has with the hippocampal formation, it likely plays a role in encoding semantically and emotionally meaningful events in episodic memory [61], though its precise function remains a topic for future research.
Our view of semantic processing in posterior cortical regions is similar to the `hub and spoke' model of Patterson, Rogers, Lambon Ralph, and colleagues [7, 46, 58] and to the convergence zone model of Damasio [14], but differs in two important respects. First, we do not believe the data support a central role for the temporal pole as the highest level in the convergence zone hierarchy (Box 3). As shown in Figures 2 and 4, multimodal convergence of information processing streams occurs throughout much of the lateral and ventral temporal cortex, as well as in the inferior parietal lobe, whereas the temporal pole receives strong affective input from the ventral frontal lobe and amygdala and is better characterized as a modal region for processing emotion and social concepts [34, 36, 37]. Second, proponents of the hub and spoke model explicitly deny a role for the inferior parietal lobe in representation of semantic information [62]. We believe that the anatomical and functional imaging evidence for semantic memory storage in the inferior parietal lobe is difficult to deny, even though the nature of the information represented in this region is still unclear (Box 4).
Social cognition, declarative memory retrieval, prospection, and the default mode
The network of brain regions we associate here with semantic processing has also been linked with more specific functions. Nearly all parts of the network have been implicated in aspects of social cognition, including theory-of-mind (processing of knowledge pertaining to mental states of other people), emotion processing, and knowledge of social concepts [36, 37, 63–67]. Much of the network has been implicated in retrieval of episodic and particularly autobiographical memories [68, 69], leading to the hypothesis that these regions function to retrieve event memories through a process of `scene construction' [70]. The same scene construction processes have been proposed as the basis for `prospection,' i.e., imagining future scenarios for the purpose of planning and goal attainment [71, 72]. Finally, the association of these regions with autobiographical, `self-projection,' and self-referential processes has led to suggestions that they are specifically involved in processing self knowledge [73, 74]. Several recent reviews and meta-analyses attest to the high degree of neuroanatomical overlap between the networks supporting these purportedly distinct processes [67, 75–77].
Given this overlap, it is logical to ask whether there is a process common to all of these cognitive functions. A model based on self-referential processing cannot easily explain activation of the same regions by theory-of-mind tasks, which by definition emphasize knowledge pertaining to others. The general process of mental scene construction is common to episodic memory retrieval, prospection, and many theory-of-mind tasks, but this model cannot explain the consistent activation of these regions by single word comprehension tasks, as shown above in Figure 2. Indeed, the contrasts analyzed in Figure 2 focused on general semantic knowledge (especially knowledge about object concepts) and did not emphasize episodic, autobiographical, social, emotional, self, or any other specific knowledge domain.
One process shared across semantic, social cognition, episodic memory, scene construction, and self-knowledge tasks is the retrieval of conceptual knowledge. The scene construction posited to underlie episodic memory retrieval and prospection refers to a partial, internal simulation of prior experience. But the construction of a scene requires content. The content of such a simulation is conceptual knowledge about particular entities, events, and relationships. The variety of this content is impressive, encompassing object, action, social, self, spatial, and other domains, yet these types of content all share a common basis in sensory-motor experience, learning through generalization across individual exemplars, and progressive abstraction from perceptual detail. We propose that the essential function of the high-level convergence zone network is to store and retrieve this conceptual content, which is employed over a variety of domain-specific tasks.
This network of high-level convergence zones also overlaps extensively with the `default mode network' of regions that show higher levels of activity during passive and `resting' states than during attentional tasks [47, 74–76, 78]. The similarity between all of these networks lends strong support to proposals that `resting' is a cognitively complex condition characterized by episodic and autobiographical memory retrieval, manipulation of semantic and social knowledge, creativity, problem solving, prospection, and planning [75, 78–81]. Several authors have emphasized the profound adaptive value of these processes, which not only enable the attainment of personal goals but are also responsible for all of human cultural and technological development [78, 80, 81].
Concluding remarks
This review proposes a large-scale model of semantic memory organization in the human brain, based on a synthesis of a large body of empirical imaging data with a modified embodiment theory of knowledge representation. In contrast to strong versions of embodiment theory, the data show that large areas of heteromodal cortex participate in semantic memory processes. The multimodal convergence of information in these areas supports progressive abstraction of conceptual knowledge from perceptual experience, enabling rapid and flexible manipulation of this knowledge for language and other highly creative tasks. In contrast to models that identify the temporal pole as the principal site of this convergence, the evidence suggests involvement of heteromodal regions throughout the temporal and inferior parietal lobes. We hope this anatomical-functional model provides a useful framework for several future lines of research (Box 5).
Acknowledgements
Supported by NIH grants R01 NS33576 and R01 DC010783. Thanks to Lisa Conant, Will Graves, Colin Humphries, Tim Rogers, and Mark Seidenberg for helpful discussions.
Abstract
Semantic memory includes all acquired knowledge about the world and is the basis for nearly all human activity, yet its neurobiological foundation is only now becoming clear. Recent neuroimaging studies demonstrate two striking results: the participation of modality-specific sensory, motor, and emotion systems in language comprehension, and the existence of large brain regions that participate in comprehension tasks but are not modality-specific. These latter regions, which include the inferior parietal lobe and much of the temporal lobe, lie at convergences of multiple perceptual processing streams. These convergences enable increasingly abstract, supramodal representations of perceptual experience that support a variety of conceptual functions including object recognition, social cognition, language, and the remarkable human capacity to remember the past and imagine the future.
Glossary
| Embodied cognition | in cognitive neuroscience, the general theory that perceptual and motor systems support conceptual knowledge, that is, that understanding or retrieving a concept involves some degree of sensory or motor simulation of the concept. A related term, situated cognition, refers to a more general perspective that emphasizes a central role of perception and action in cognition, rather than memory and memory retrieval. |
| Heteromodal cortex | cortex that receives highly processed, multimodal input not dominated by any single modality; also called supramodal, multimodal, or polymodal. |
| Modality-specific representations | information pertaining to a specific modality of experience and processed within the corresponding sensory, motor, or affective system. Modality-specific representations can include primary perceptual or motor information, as well as more complex or abstract representations that are nonetheless modal (e.g., extrastriate visual cortex, parabelt auditory cortex). Modal specificity refers to the representational format of the information. For example, knowledge about the sound a piano makes is modally auditory, whereas knowledge about the appearance of a piano is modally visual, and knowledge of the feeling of playing a piano is modally kinesthetic. Modal representations reflect relevant perceptual dimensions of the input, that is, they are analogs of the input. An auditory representation, for example, captures the spectrotemporal form and loudness of an input, whereas a visual representation codes visual dimensions such as visual form, size and color. |
| Semantic memory | an individual's store of knowledge about the world. The content of semantic memory is abstracted from actual experience and is therefore said to be conceptual, that is, generalized and without reference to any specific experience. Memory for specific experiences is called episodic memory, although the content of episodic memory depends heavily on retrieval of conceptual knowledge. Remembering, for example, that one had coffee and eggs for breakfast requires retrieval of the concepts of coffee, eggs and breakfast. Episodic memory might be more properly seen as a particular kind of knowledge manipulation that creates spatial-temporal configurations of object and event concepts. |
| Simulation | in cognitive neuroscience, the partial re-creation of a perceptual/motor/affective experience or concept through partial reactivation of the neural ensembles originally activated by the experience or concept. Explicit mental imagery may require relatively detailed simulation of a particular experience, whereas tasks such as word comprehension may require only schematic simulations. |
| Supramodal representations | information that does not pertain to a single modality of experience. Supramodal representations store information about cross-modal conjunctions, such as a particular combination of auditory and visual object attributes. Their existence is sometimes disputed, yet they provide a simple mechanism for a wide range of inferential capacities, such as knowing the visual appearance of a piano given only its sound and knowing about the conceptual similarity structures that define categories. Supramodal representations may also enable the rapid, schematic retrieval of semantic knowledge that characterizes natural language. |
References
- 1. Locke J. An essay concerning human understanding. Dover; 1690/1959.
- 2. Tulving E. Episodic and semantic memory. In: Tulving E, Donaldson W, editors. Organization of Memory. Academic Press; 1972. pp. 381–403.
- 3. Allport DA, Funnell E. Components of the mental lexicon. Philos. Trans. R. Soc. Lond. B. 1981;295:379–410.
- 4. Barsalou LW. Situated simulation in the human conceptual system. Lang. Cogn. Processes. 2003;18:513–562.
- 5. Martin A, Caramazza A. Neuropsychological and neuroimaging perspectives on conceptual knowledge: an introduction. Cogn. Neuropsychol. 2003;20:195–212.
- 6. Damasio H, et al. Neural systems behind word and concept retrieval. Cognition. 2004;92:179–229.
- 7. Patterson K, et al. Where do you know what you know? The representation of semantic knowledge in the human brain. Nat. Rev. Neurosci. 2007;8:976–987.
- 8. Vygotsky LS. Thought and language. Wiley; 1962.
- 9. Fodor J. The language of thought. Harvard University Press; 1975.
- 10. Pylyshyn ZW. Computation and cognition: toward a foundation for cognitive science. MIT Press; 1984.
- 11. Ferrucci D, et al. Building Watson: an overview of the DeepQA project. AI Magazine. 2010;31:59–79.
- 12. Wernicke C. Der aphasische Symptomenkomplex. Cohn & Weigert; 1874.
- 13. Freud S. On aphasia: a critical study. International Universities Press; 1891/1953.
- 14. Damasio AR. Time-locked multiregional retroactivation: a systems-level proposal for the neural substrates of recall and recognition. Cognition. 1989;33:25–62.
- 15. Thompson-Schill SL, et al. Role of left inferior prefrontal cortex in retrieval of semantic knowledge: a reevaluation. Proc. Natl. Acad. Sci. U.S.A. 1997;94:14792–14797.
- 16. Wagner AD, et al. Recovering meaning: left prefrontal cortex guides semantic retrieval. Neuron. 2001;31:329–338.
- 17. Mahon BZ, Caramazza A. A critical look at the embodied cognition hypothesis and a new proposal for grounding conceptual content. J. Physiol. (Paris) 2008;102:59–70.
- 18. Boulenger V, et al. Word processing in Parkinson's disease is impaired for action verbs but not for concrete nouns. Neuropsychologia. 2008;46:743–756.
- 19. Bak TH, et al. Clinical, imaging and pathological correlates of a hereditary deficit in verb and action processing. Brain. 2006;129:321–332.
- 20. Buxbaum LJ, Saffran EM. Knowledge of object manipulation and object function: dissociations in apraxic and nonapraxic subjects. Brain Lang. 2002;82:179–199.
- 21. Bak TH, Hodges JR. The effects of motor neurone disease on language: further evidence. Brain Lang. 2004;89:354–361.
- 22. Grossman M, et al. Impaired action knowledge in amyotrophic lateral sclerosis. Neurology. 2008;71:1396–1401.
- 23. Oliveri M, et al. All talk and no action: a transcranial magnetic stimulation study of motor cortex activation during action word production. J. Cogn. Neurosci. 2004;16:374–381.
- 24. Buccino G, et al. Listening to action-related sentences modulates the activity of the motor system: a combined TMS and behavioral study. Brain Res. Cogn. Brain Res. 2005;24:355–363.
- 25. Pulvermuller F, et al. Functional links between motor and language systems. Eur. J. Neurosci. 2005;21:793–797.
- 26. Glenberg AM, et al. Processing abstract language modulates motor system activity. Q. J. Exp. Psychol. 2008;61:905–919.
- 27. Pobric G, et al. Category-specific versus category-general semantic impairment induced by transcranial magnetic stimulation. Curr. Biol. 2010;20:964–968.
- 28. Ishibashi R, et al. Different roles of lateral anterior temporal lobe and inferior parietal lobule in coding function and manipulation tool knowledge: evidence from an rTMS study. Neuropsychologia. 2011;49:1128–1135.
- 29. Pulvermuller F, et al. Brain signatures of meaning access in action word recognition. J. Cogn. Neurosci. 2005;17:884–892.
- 30. Boulenger V, et al. Cross-talk between language processes and overt motor behavior in the first 200 msec of processing. J. Cogn. Neurosci. 2006;18:1607–1615.
- 31. Revill KP, et al. Neural correlates of partial lexical activation. Proc. Natl. Acad. Sci. U.S.A. 2008;105:13111–13115.
- 32. Hoenig K, et al. Conceptual flexibility in the human brain: dynamic recruitment of semantic maps from visual, motor, and motion-related areas. J. Cogn. Neurosci. 2008;20:1799–1814.
- 33. Vigliocco G, et al. Toward a theory of semantic representation. Lang. Cogn. 2009;1:219–248.
- 34. Olson IR, et al. The enigmatic temporal pole: a review of findings on social and emotional processing. Brain. 2007;130:1718–1731.
- 35. Etkin A, et al. Emotional processing in anterior cingulate and medial prefrontal cortex. Trends Cogn. Sci. 2011;15:85–93.
- 36. Ross LA, Olson IR. Social cognition and the anterior temporal lobes. Neuroimage. 2010;49:3452–3462.
- 37. Zahn R, et al. Social concepts are represented in the superior anterior temporal cortex. Proc. Natl. Acad. Sci. U.S.A. 2007;104:6430–6435.
- 38. Mesulam M. Patterns in behavioral neuroanatomy: association areas, the limbic system, and hemispheric specialization. In: Mesulam M, editor. Principles of Behavioral Neurology. F.A. Davis; 1985. pp. 1–70.
- 39. Orban GA, et al. Comparative mapping of higher visual areas in monkeys and humans. Trends Cogn. Sci. 2004;8:315–324.
- 40. Hodges JR, et al. Semantic dementia: progressive fluent aphasia with temporal lobe atrophy. Brain. 1992;115:1783–1806.
- 41. Mummery CJ, et al. A voxel-based morphometry study of semantic dementia: relationship between temporal lobe atrophy and semantic memory. Ann. Neurol. 2000;47:36–45.
- 42. Bozeat S, et al. Nonverbal semantic impairment in semantic dementia. Neuropsychologia. 2000;38:1207–1215.
- 43. Hodges JR, et al. The role of conceptual knowledge in object use: evidence from semantic dementia. Brain. 2000;123:1913–1925.
- 44. Bozeat S, et al. When objects lose their meaning: what happens to their use? Cogn. Affect. Behav. Neurosci. 2002;2:236–251.
- 45. Rogers TT, et al. Colour knowledge in semantic dementia: it is not all black and white. Neuropsychologia. 2007;45:3285–3298.
- 46. Lambon Ralph MA, et al. Neural basis of category-specific semantic deficits for living things: evidence from semantic dementia, HSVE and a neural network model. Brain. 2007;130:1127–1137.
- 47. Binder JR, et al. Where is the semantic system? A critical review and meta-analysis of 120 functional neuroimaging studies. Cereb. Cortex. 2009;19:2767–2796.
- 48. Turkeltaub PE, et al. Meta-analysis of the functional neuroanatomy of single-word reading: method and validation. Neuroimage. 2002;16:765–780.
- 49. Alexander MP, et al. Distributed anatomy of transcortical sensory aphasia. Arch. Neurol. 1989;46:885–892.
- 50. Damasio H. Neuroimaging contributions to the understanding of aphasia. In: Boller F, Grafman J, editors. Handbook of Neuropsychology. Elsevier; 1989. pp. 3–46.
- 51. Hart J, Gordon B. Delineation of single-word semantic comprehension deficits in aphasia, with anatomic correlation. Ann. Neurol. 1990;27:226–231.
- 52. Chertkow H, et al. On the status of object concepts in aphasia. Brain Lang. 1997;58:203–232.
- 53. Dronkers NF, et al. Lesion analysis of the brain areas involved in language comprehension. Cognition. 2004;92:145–177.
- 54. Harnad S. The symbol grounding problem. Physica D. 1990;42:335–346.
- 55. Barsalou LW. Perceptual symbol systems. Behav. Brain Sci. 1999;22:577–660.
- 56. Gallese V, Lakoff G. The brain's concepts: the role of the sensory-motor system in conceptual knowledge. Cogn. Neuropsychol. 2005;22:455–479.
- 57. Arévalo AL, et al. What do brain lesions tell us about theories of embodied semantics and the human mirror neuron system? Cortex. In press. doi:10.1016/j.cortex.2010.06.001.
- 58. Rogers TT, McClelland JL. Semantic cognition: a parallel distributed processing approach. MIT Press; 2004.
- 59. Dove G. On the need for embodied and dis-embodied cognition. Frontiers Psychol. 2011;1: Article 242.
- 60. Taylor LJ, Zwaan RA. Action in cognition: the case for language. Lang. Cogn. 2009;1:45–58.
- 61. Valenstein E, et al. Retrosplenial amnesia. Brain. 1987;110:1631–1646.
- 62. Jefferies E, Lambon Ralph MA. Semantic impairment in stroke aphasia versus semantic dementia: a case-series comparison. Brain. 2006;129:2132–2147.
- 63. Fletcher PC, et al. Other minds in the brain: a functional imaging study of 'theory of mind' in story comprehension. Cognition. 1995;57:109–128.
- 64. Saxe R, Kanwisher N. People thinking about thinking people: the role of the temporo-parietal junction in 'theory of mind'. Neuroimage. 2003;19:1835–1842.
- 65. Amodio DM, Frith CD. Meeting of minds: the medial frontal cortex and social cognition. Nat. Rev. Neurosci. 2006;7:268–277.
- 66. Aichhorn M, et al. Temporo-parietal junction activity in theory-of-mind tasks: falseness, beliefs, or attention. J. Cogn. Neurosci. 2009;21:1179–1192.
- 67. Van Overwalle F. Social cognition and the brain: a meta-analysis. Hum. Brain Mapp. 2009;30:829–858.
- 68. Svoboda E, et al. The functional neuroanatomy of autobiographical memory: a meta-analysis. Neuropsychologia. 2006;44:2189–2208.
- 69. Vilberg KL, Rugg MD. Memory retrieval and the parietal cortex: a review of evidence from a dual-process perspective. Neuropsychologia. 2008;46:1787–1799.
- 70. Hassabis D, Maguire EA. Deconstructing episodic memory with construction. Trends Cogn. Sci. 2007;11:299–306.
- 71. Addis DR, et al. Remembering the past and imagining the future: common and distinct neural substrates during event construction and elaboration. Neuropsychologia. 2007;45:1363–1377.
- 72. Gerlach KD, et al. Solving future problems: default network and executive activity associated with goal-directed mental simulations. Neuroimage. 2011;55:1816–1824.
- 73. Gusnard DA, et al. Medial prefrontal cortex and self-referential mental activity: relation to a default mode of brain function. Proc. Natl. Acad. Sci. U.S.A. 2001;98:4259–4264.
- 74. Whitfield-Gabrieli S, et al. Associations and dissociations between default and self-reference networks in the human brain. Neuroimage. 2011;55:225–232.
- 75. Buckner RL, et al. The brain's default network: anatomy, function, and relevance to disease. Ann. N. Y. Acad. Sci. 2008;1124:1–38.
- 76. Spreng RN, et al. The common neural basis of autobiographical memory, prospection, navigation, theory of mind, and the default mode: a quantitative meta-analysis. J. Cogn. Neurosci. 2009;21:489–510.
- 77. Spreng RN, Grady CL. Patterns of brain activity supporting autobiographical memory, prospection, and theory of mind, and their relationship to the default mode network. J. Cogn. Neurosci. 2009;22:1112–1123.
- 78. Binder JR, et al. Conceptual processing during the conscious resting state: a functional MRI study. J. Cogn. Neurosci. 1999;11:80–93.
- 79. Ingvar DH. Memory of the future: an essay on the temporal organization of conscious awareness. Hum. Neurobiol. 1985;4:127–136.
- 80. Andreasen NC, et al. Remembering the past: two facets of episodic memory explored with positron emission tomography. Am. J. Psychiatry. 1995;152:1576–1585.
- 81. Andrews-Hanna JR. The brain's default network and its adaptive role in internal mentation. Neuroscientist. 2011. doi:10.1177/1073858411403316.
- 82. Fairbanks G. Experimental phonetics: selected articles. University of Illinois Press; 1966.
- 83. Desai RH, et al. The neural career of sensorimotor metaphors. J. Cogn. Neurosci. 2011;23:2376–2386.
- 84. Felleman DJ, Van Essen DC. Distributed hierarchical processing in the primate cerebral cortex. Cereb. Cortex. 1991;1:1–47.
- 85. Jones EG, Powell TPS. An anatomical study of converging sensory pathways within the cerebral cortex of the monkey. Brain. 1970;93:793–820.
- 86. Van Hoesen GW. The parahippocampal gyrus: new observations regarding its cortical connections in the monkey. Trends Neurosci. 1982;5:345–350.
- 87. Squire LR. Memory and the hippocampus: a synthesis from findings with rats, monkeys, and humans. Psychol. Rev. 1992;99:195–231.
- 88. Gorno-Tempini ML, et al. Cognition and anatomy in three variants of primary progressive aphasia. Ann. Neurol. 2004;55:335–346.
- 89. Rohrer JD, et al. Patterns of cortical thinning in the language variants of frontotemporal lobar degeneration. Neurology. 2009;72:1562–1569.
- 90. Mion M, et al. What the left and right anterior fusiform gyri tell us about semantic memory. Brain. 2010;133:3256–3268.
- 91. Binney RJ, et al. The ventral and inferolateral aspects of the anterior temporal lobe are crucial in semantic memory: evidence from a novel direct comparison of distortion-corrected fMRI, rTMS, and semantic dementia. Cereb. Cortex. 2010;20:2728–2738.
- 92. Kondo H, et al. Differential connections of the temporal pole with the orbital and medial prefrontal networks in macaque monkeys. J. Comp. Neurol. 2003;465:499–523.
- 93. Badre D, et al. Dissociable controlled retrieval and generalized selection mechanisms in ventrolateral prefrontal cortex. Neuron. 2005;47:907–918.
- 94. Rodd JM, et al. The neural mechanisms of speech comprehension: fMRI studies of semantic ambiguity. Cereb. Cortex. 2005;15:1261–1269.
- 95. Fiez JA. Phonology, semantics and the role of the left inferior prefrontal cortex. Hum. Brain Mapp. 1997;5:79–83.
- 96. Bookheimer SY. Functional MRI of language: new approaches to understanding the cortical organization of semantic processing. Annu. Rev. Neurosci. 2002;25:151–188.
- 97. Zysset S, et al. Functional specialization within the anterior medial prefrontal cortex: a functional magnetic resonance imaging study with human subjects. Neurosci. Lett. 2003;335:183–186.
- 98. Heatherton TF, et al. Medial prefrontal activity differentiates self from close others. Soc. Cogn. Affect. Neurosci. 2006;1:18–25.
- 99. Mitchell JP, et al. Dissociable medial prefrontal contributions to judgments of similar and dissimilar others. Neuron. 2006;50:655–663.
- 100. Luria AR, Tsvetkova LS. The mechanism of 'dynamic aphasia'. Found. Lang. 1968;4:296–307.
- 101. Alexander MP, et al. Frontal lobes and language. Brain Lang. 1989;37:656–691.
- 102. Robinson G, et al. Dynamic aphasia: an inability to select between competing verbal responses? Brain. 1998;121:77–89.
- 103. Graves WW, et al. Neural systems for reading aloud: a multiparametric approach. Cereb. Cortex. 2010;20:1799–1815.
- 104. Binder JR, et al. Distinct brain systems for processing concrete and abstract concepts. J. Cogn. Neurosci. 2005;17:905–917.
- 105. Humphries C, et al. Time course of semantic processes during sentence comprehension: an fMRI study. Neuroimage. 2007;36:924–932.
- 106. Rauschecker JP, Tian B. Mechanisms and streams for processing of 'what' and 'where' in auditory cortex. Proc. Natl. Acad. Sci. U.S.A. 2000;97:11800–11806.
- 107. Kravitz DJ, et al. A new neural framework for visuospatial processing. Nat. Rev. Neurosci. 2011;12:217–230.
- 108. Ferstl EC, et al. Emotional and temporal aspects of situation model processing during text comprehension: an event-related fMRI study. J. Cogn. Neurosci. 2005;17:724–739.
- 109. Ferstl EC, von Cramon DY. Time, space and emotion: fMRI reveals content-specific activation during text comprehension. Neurosci. Lett. 2007;427:159–164.



