Tuesday, December 13, 2016

Functional diagram of language areas

Despite numerous studies on the functional mapping of language, the characterization of language-related areas is far from complete. Whether there is a map at all is a contentious issue in the first place. I don’t want to go into the philosophical debate, since I think it’s becoming clearer that both sides are wrong: (1) people who want you to believe that the brain is made of modules that can be labeled using terms defined in preexisting theories, and (2) people who think that the cerebrum is just a homogeneous sheet of processing units.

I do believe in the reality of linguistic units (phonemes, morphemes, phrases, etc.) and I am fond of learning various syntactic theories (government and binding, minimalism, HPSG, etc.), but I think it’s naive to assume that there are brain modules dedicated to particular levels of language representation and syntactic operations. I am very skeptical of Marr’s levels of design (computational / algorithmic / implementation) when it comes to the brain. Recent developments in deep neural networks provide a good counter-example: we can design a processing system in which we cannot label the function of each component clearly, at least in terms of preexisting theories.

Having said that, I do not think that the cerebrum is just a homogeneous sheet. The most important structure obviously emerges from inputs and outputs as well as long- and short-range connecting fibers. Genes can modulate numerous parameters related to the formation of the neocortex, although it does not look as heterogeneous as “old” organs (like the heart or the subcortical areas), since there has not been nearly as much evolutionary time to tweak these parameters.

So, in short, I’ll start with inputs and outputs (for the nature of the inputs and outputs, see the appendix) and try to carefully characterize the workings of local areas, paying full attention to the possibility that the traditional way of characterizing language areas may be totally wrong. My main sources of information are the review papers by Friederici (2011), which summarizes numerous studies of the linguistic brain, and Hickok and Poeppel (2007), written by two powerhouses of knowledge and critical thinking about language.

Language-related event-related brain potentials (ERPs) provide valuable insights, since ERP research has a rich history of critical work and also provides timing information. The main components are summarized in the list below (a small code sketch of the same summary follows the list).

·      N100, associated with acoustic and phonological processing, is localized around the auditory cortex.
·      ELAN (~150 ms), associated with syntactic category assignment, is localized around the anterior temporal cortex / inferior frontal gyrus (IFG).
·      LAN (~400 ms), associated with morphosyntactic features for argument structure, is localized around the anterior temporal gyrus / IFG.
·      N400, associated with lexical access load, is localized around the mid-posterior superior temporal gyrus / IFG.
·      P600, associated with syntactic reanalysis (but can be semantically motivated), is localized around the middle temporal gyrus / basal ganglia.
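To keep these components straight, here is a minimal sketch of the same summary as a Python lookup table. The field names and the approximate latencies I attach to N100, N400, and P600 are just my own shorthand for the list above, not a standard taxonomy.

```python
# Minimal sketch: the ERP components listed above, as a lookup table.
# Field names and approximate latencies are my own shorthand, not a standard taxonomy.
from collections import namedtuple

ERPComponent = namedtuple("ERPComponent", "latency_ms process localization")

ERP_COMPONENTS = {
    "N100": ERPComponent(100, "acoustic/phonological processing",
                         "auditory cortex"),
    "ELAN": ERPComponent(150, "syntactic category assignment",
                         "anterior temporal cortex / IFG"),
    "LAN":  ERPComponent(400, "morphosyntactic features for argument structure",
                         "anterior temporal gyrus / IFG"),
    "N400": ERPComponent(400, "lexical access load",
                         "mid-posterior superior temporal gyrus / IFG"),
    "P600": ERPComponent(600, "syntactic reanalysis (possibly semantically motivated)",
                         "middle temporal gyrus / basal ganglia"),
}

# Print the components in order of latency, sketching a rough processing timeline.
for name, c in sorted(ERP_COMPONENTS.items(), key=lambda kv: kv[1].latency_ms):
    print(f"{c.latency_ms:>4} ms  {name:<5} {c.process}  [{c.localization}]")
```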

Results of MRI studies are harder to summarize. For one thing, stimulus conditions vary from one study to another. Also, labels like “semantic”, “multimodal”, etc. mean different things in different studies; for instance, “semantics” in the sense of thematic role assignment and in the sense of word disambiguation are totally different things.

So I focus on inputs, outputs, long-range connections, and the axes in the frontal and temporal areas along which the quality of information changes most dramatically. The auditory input to the temporal lobe and the motor output to language-related muscle control (see appendix) in the frontal lobe are clear. The output from speech processing is less clear, seemingly going into higher areas, including area 45. The input to speech motor control is also unclear; it may consist of the knowledge to be expressed and the motivation to speak. The main axis of qualitative variation, to me, appears to be the time range. As the position in the language-related cerebral cortex approaches the posterior end of the prefrontal cortex, or gets closer to the primary auditory cortex in the temporal lobe, the information seems to be more short-term (e.g. a simple motor command or a short stretch of a certain speech spectrum). As the position approaches the anterior end of the prefrontal cortex, or the part of the temporal lobe farthest from the primary auditory cortex, the information becomes longer-term or even static (e.g. an intended message content, a sentence, or even a paragraph).
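The gradient I have in mind can be caricatured with a toy model: a bank of leaky integrators whose time constants grow as we move away from the primary areas, so that the same input stream is represented at progressively slower timescales. This is only an illustration of the “short-term vs. long-term information” idea, not a model of any actual cortical circuit; the signal and the time constants are arbitrary choices of mine.

```python
# Toy illustration of the proposed timescale gradient: one fast-changing input
# stream is smoothed by leaky integrators with increasingly long time constants,
# standing in for positions progressively farther from the primary areas.
# The signal and time constants are arbitrary; this is a caricature, not a cortical model.
import math
import random

def leaky_integrate(signal, tau):
    """Exponential smoothing with time constant tau (in samples)."""
    alpha = 1.0 / tau
    state, out = 0.0, []
    for x in signal:
        state += alpha * (x - state)
        out.append(state)
    return out

random.seed(0)
# A fast-changing "input" stream (think acoustic-scale fluctuations).
signal = [math.sin(0.3 * t) + 0.5 * random.random() for t in range(200)]

# Short tau ~ near the primary auditory cortex / posterior prefrontal cortex;
# long tau ~ anterior prefrontal cortex / far from the primary auditory cortex.
for tau in (2, 20, 100):
    smoothed = leaky_integrate(signal, tau)
    variability = max(smoothed[50:]) - min(smoothed[50:])
    print(f"tau={tau:>3} samples -> range of smoothed signal: {variability:.2f}")
```

The longer the time constant, the less the smoothed trace fluctuates, which is the sense in which the represented information becomes “longer-term, or even static.”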

I will probably keep revising the diagram, but the figure below is what I have for now.




Appendix: Subcortical language pathways

What kind of auditory signal does the primary auditory cortex (area 41) receive? The auditory pathway from the cochlea to the primary auditory cortex runs through the Ventral and Dorsal Cochlear Nuclei, the Superior Olivary Complex, the Lateral Lemniscus, the Inferior Colliculus, and the Medial Geniculate Body. So by the time the signal reaches area 41, it has already been processed for directional information as well as for some speech-related features such as complex spectra and onset times.
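For reference, the relay stations just listed can be written down as a simple ordered pipeline. The station names are as given above; the one-line “gloss” attached to each station is only my rough shorthand, not an exhaustive description of what that station computes.

```python
# The subcortical auditory relay stations mentioned above, as an ordered pipeline
# from the cochlea to the primary auditory cortex (area 41).
# The glosses are rough shorthand, not an exhaustive account of each station.
AUDITORY_PATHWAY = [
    ("Cochlea",                           "frequency decomposition"),
    ("Ventral/Dorsal Cochlear Nuclei",    "spectral and temporal feature extraction"),
    ("Superior Olivary Complex",          "interaural cues (sound direction)"),
    ("Lateral Lemniscus",                 "timing / onset information"),
    ("Inferior Colliculus",               "integration of spectral and spatial cues"),
    ("Medial Geniculate Body",            "thalamic relay to cortex"),
    ("Primary auditory cortex (area 41)", "cortical analysis of speech-relevant features"),
]

for i, (station, gloss) in enumerate(AUDITORY_PATHWAY):
    prefix = "    " if i == 0 else " -> "
    print(f"{prefix}{station}: {gloss}")
```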

Speech motor control requires exquisite coordination of many muscles, so the output from the speech-related primary motor cortex (the inferior part of area 4, which in turn receives input from area 6) goes out to many nerve tracts, including cranial nerves (V: trigeminal, VII: facial, IX/X: glossopharyngeal/vagus, XII: hypoglossal) and the laryngeal nerves. Besides these tracts from the primary motor cortex, called the pyramidal tracts, there are extrapyramidal speech tracts that run from the cerebellum and premotor cortex via the basal ganglia and output to the thalamus; these nevertheless do not directly innervate the lower motor neurons that control speech-related muscles. Unlike the auditory pathway, coordination of these multiple outputs does not involve many lower-level ganglia.
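The same division can be summarized in a small data structure. This simply mirrors the paragraph above (pyramidal outputs to cranial and laryngeal nerves versus the indirect extrapyramidal loop); the key names are my own.

```python
# The speech motor outputs described above, grouped into the pyramidal outputs
# (primary motor cortex -> cranial/laryngeal nerves) and the extrapyramidal loop
# (which modulates speech but does not directly innervate the speech muscles).
# Key names are my own shorthand for the paragraph above.
SPEECH_MOTOR_OUTPUTS = {
    "pyramidal": {
        "source": "inferior part of area 4 (fed by area 6)",
        "nerves": [
            "V (trigeminal)",
            "VII (facial)",
            "IX/X (glossopharyngeal/vagus)",
            "XII (hypoglossal)",
            "laryngeal nerves",
        ],
        "direct_to_lower_motor_neurons": True,
    },
    "extrapyramidal": {
        "source": "cerebellum and premotor cortex via basal ganglia, output to thalamus",
        "nerves": [],
        "direct_to_lower_motor_neurons": False,
    },
}

for tract, info in SPEECH_MOTOR_OUTPUTS.items():
    kind = "direct" if info["direct_to_lower_motor_neurons"] else "indirect"
    print(f"{tract}: {info['source']} ({kind} control of speech muscles)")
```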




Caplan D. The neurobiological basis of language. Brain. 2007 May 1;130(5):1442-6.

Damasio AR, Geschwind N. The neural basis of language. Annual Review of Neuroscience. 1984 Mar;7(1):127-47.

Friederici AD. The brain basis of language processing: from structure to function. Physiological Reviews. 2011 Oct 1;91(4):1357-92.

Hickok G, Poeppel D. The cortical organization of speech processing. Nature Reviews Neuroscience. 2007 May 1;8(5):393-402.

Stowe LA, Haverkort M, Zwarts F. Rethinking the neurological basis of language. Lingua. 2005 Jul 31;115(7):997-1042.


Tallal P, Miller S, Fitch RH. Neurobiological basis of speech: a case for the preeminence of temporal processing. Annals of the New York Academy of Sciences. 1993 Jun 1;682(1):27-47.

Monday, December 5, 2016

Structural connectivity among language areas

Leaving the cortical microstructure behind (at least temporarily), this week’s focus will be on the macro level, i.e. language-related brain areas and the interconnections among them. I’d like to start with the hardware (the cytoarchitectural-area and structural-connectivity level) and build up. I will look at language-related brain areas (this is itself a dubious labeling, since the areas mentioned may subserve non-linguistic functions, but I move on), focusing on the temporal and prefrontal cortices.

The temporal cortex here includes areas 41, 42, 22, 21, and 20. The frontal cortex includes 45 and 44 (prefrontal) and 6 (premotor). There are structural connections between anterior area 22 and area 45 through the extreme capsule fiber system, and between anterior 22 and the opercular part of the inferior frontal cortex (the frontal operculum) through the uncinate fasciculus (Friederici 2011). Posteriorly, there are structural connections between posterior 22 and 44, and between posterior 22 and 6, both served by the arcuate fasciculus and the superior longitudinal fasciculus III (Friederici 2011). Friederici (2011) focuses on the peri-Sylvian cortex, but extending the scope a little to include the so-called lexical areas (21) and a sensory/motor hub (40) (Hickok and Poeppel 2007), I get my Figure 1.
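A minimal way to encode the connections just listed (essentially the skeleton of my Figure 1) is as an edge list keyed by fiber tract. The area labels and the pairing of tracts to connections simply follow the paragraph above; the grouping into anterior and posterior routes is from Friederici (2011).

```python
# Structural connections described above (roughly the skeleton of Figure 1),
# encoded as (temporal region, frontal region, fiber tract) edges.
# Labels follow the text; anterior/posterior grouping follows Friederici (2011).
from collections import defaultdict

LANGUAGE_CONNECTIONS = [
    # Anterior routes
    ("anterior 22",  "45",                "extreme capsule fiber system"),
    ("anterior 22",  "frontal operculum", "uncinate fasciculus"),
    # Posterior routes
    ("posterior 22", "44",                "arcuate fasciculus / superior longitudinal fasciculus III"),
    ("posterior 22", "6",                 "arcuate fasciculus / superior longitudinal fasciculus III"),
]

# Adjacency view: which frontal targets each temporal region reaches, and via what.
adjacency = defaultdict(list)
for temporal, frontal, tract in LANGUAGE_CONNECTIONS:
    adjacency[temporal].append((frontal, tract))

for temporal, targets in adjacency.items():
    for frontal, tract in targets:
        print(f"{temporal} <-> {frontal}  via {tract}")
```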

The auditory input arrives at areas 41 and 42, and from there flows both anteriorly and posteriorly through areas 22 and 21. The motor intention / control flow runs from 45 to 44 and on to 6, and 6 connects to the motor output area. Fiber tracking in the primate brain suggests that the ventral pathway runs from the temporal to the prefrontal lobe, whereas the dorsal pathway is considered bidirectional (Rauschecker 2011). From here I’d like to characterize the processing taking place at each area, but this has gone on long enough, so to be continued …



Friederici AD. The brain basis of language processing: from structure to function. Physiological Reviews. 2011 Oct 1;91(4):1357-92.

Hickok G, Poeppel D. The cortical organization of speech processing. Nature Reviews Neuroscience. 2007 May 1;8(5):393-402.

Kelly C, Uddin LQ, Shehzad Z, Margulies DS, Castellanos FX, Milham MP, Petrides M. Broca’s region: linking human brain functional connectivity data and non-human primate tracing anatomy studies. European Journal of Neuroscience. 2010 Aug 1;32(3):383-98.


Rauschecker JP. An expanded role for the dorsal auditory pathway in sensorimotor control and integration. Hearing Research. 2011 Jan 31;271(1):16-25.