
Executive Attention

Abstract

We live in a drastically more secure environment than the one we evolved for. This security justifies an expanded planning range, which requires a correspondingly broader scope of intellectual interests. Both can be delegated to “professionals”, if one chooses to trust them. Most of this attention-span expansion is yet to happen (even in the most advanced cultures), on either the individual or the societal level. I’ll speculate on the function & mechanics of attention, as well as on the factors that can improve its span & integrity. Not being a neuroscientist, I will theorize even where the evidence (or my knowledge thereof) is tentative.

Function of centralized attention

Attention gives a central focus to cognitive search. Intrinsically, cognition doesn’t need such a focus, – search for patterns should be more efficient in a “free market” where the currency is predictive power. But the brain evolved primarily to guide a single body through its environment, hence the artifact of central consciousness. Cognition works on all levels of generalization, but it’s obviously easier to focus on recent experience, – the here & now decided life or death for our ancestors. So, more general / remote concepts are relegated to low-priority search in the back-of-the-mind default mode network (DMN). And the further in the background these concepts are, the easier it is for direct experience (primary cortices) to inhibit & displace them with more urgent concerns. Still, long-term prediction & planning requires a focus on subjects selected by more general values & represented in correspondingly higher association cortices. This is known as executive function / executive attention. Most value-loaded concepts are conditioned, as well as incrementally generalized from sensory stimuli. The relatively detached nature of such values makes it difficult to trace their origin & their impact on attention, so my interpretation is necessarily speculative. Attention span as discussed here is not simply the duration of focus on a given subject. Rather, it is the scope of generalized experience (past searches) that determines cognitive priorities by selecting subjects for focused attention.

Values that direct attention can be innate (important only in early development) or acquired / learned. Acquisition proceeds through purely cognitive generalization by comparison, as well as through conditioning by temporal & spatial coincidence with prior value-loaded concepts. The mechanism of conditioning is an evolutionary artifact, I believe implemented in older brain areas: the amygdala & hippocampus (known to represent spatio-temporal maps). Conceptually, conditioning should not be necessary for cognition, but with the kludge of a brain that we have, it definitely is. The above-mentioned areas are critical for long-term memory formation, probably because spatio-temporal association with value-charged stimuli was the only way for early animals to tell which memories are important enough to preserve.

Thalamo-Cortical system

Mental focus “spotlights” the hierarchical context of working memory (current consciousness). This is likely mediated by the thalamus, which tunes / binds areas related to working memory via gamma waves. There is a simplified overview in “The Missing Moment” by Robert Pollack, mostly pp. 46-56; for a more involved & ambiguous treatment see “Rhythms of the Brain” by Gyorgy Buzsaki.

My personal opinion is that a main function of the thalamus is mediation of competitive inhibition both within & between brain regions (the former would be mediated by the thalamic nuclei of the corresponding regions, & the latter probably by the TRN: the thalamic reticular nucleus). From a networking perspective, it’s a lot “cheaper” to mediate such competition for attention through a central body of relays than to have each region / column directly inhibit all the others. In fact, Sherman & Guillery suggest in “Exploring the Thalamus” that the thalamus could be viewed as a consolidated “7th layer” of the neocortex. The inputs from “inhibited” areas are relayed by the thalamus in a lower-resolution “burst” mode, vs. the focused “tonic” mode.
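To make the “cheaper” claim concrete, here is a minimal back-of-the-envelope sketch (Python, purely illustrative; the hub abstraction & the counts are my assumptions, not anything about actual thalamic wiring): pairwise inhibition among N regions needs on the order of N² links, while routing the same competition through one central relay needs only on the order of N.

```python
# Toy wiring-cost comparison: direct mutual inhibition vs. a central relay.
# "Regions" loosely stands in for cortical regions / columns; the numbers only
# illustrate quadratic vs. linear scaling, nothing anatomical.

def all_to_all_links(n_regions: int) -> int:
    """Every region directly inhibits every other one: n * (n - 1) directed links."""
    return n_regions * (n_regions - 1)

def hub_links(n_regions: int) -> int:
    """Every region has one link to & one link from a central relay: 2 * n links."""
    return 2 * n_regions

for n in (10, 100, 1000):
    print(f"{n:>5} regions: all-to-all = {all_to_all_links(n):>9,}   "
          f"hub-mediated = {hub_links(n):>5,}")
# ->   10 regions: all-to-all =        90   hub-mediated =    20
# ->  100 regions: all-to-all =     9,900   hub-mediated =   200
# -> 1000 regions: all-to-all =   999,000   hub-mediated = 2,000
```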

Primary sensory & motor cortices are over-represented in the thalamus, – the pulvinar nuclei alone comprise some 40% of it. This is introspectively obvious, – working memory is what we currently visualize, vocalize, or actualize. I think we enhance our focus on general concepts in the same fashion, – we generate “fake experience” by subvocalizing, subvisualizing, & subactualizing the feedback of such concepts into primary cortices, which are underutilized during our sensory “vacations” anyway. Also, the better thalamic connectivity of primary cortices should speed up the search for relevant associations in other brain areas. So, it seems plausible to me that primary cortices would be frequently “hijacked” by higher areas to simulate (interactively project) their generalized concepts. Many mathematicians & scientists (such as Einstein) think visually rather than verbally, which suggests that their exceptional ability to focus on imaginary constructs depends on intense primarization (my term).

However, primarization proceeds in cortical areas that are unnaturally “low” for the subject matter. Because the “elevation” is wrong, the projections often become false memories & confabulations. I guess that in extreme cases hijacking of primary cortices produces hallucinations, – the substitution of imagination (feedback) for actual experience (feedforward). This may be a factor in developing schizophrenia, in which “imagination” seems to get out of control. Accordingly, the DMN, & specifically the left posterior cingulate cortex, was found to be unusually active in schizophrenics. Such confusion should be more likely in habitually hijacked primary areas, as they can become less attached to their respective senses.

Primarization could be mediated by short-cuts to lower levels of the cortical hierarchy, such as the far-reaching spindle neurons. These cells are conveniently located in the anterior cingulate cortex & the fronto-insular cortex, closely connected to the corresponding higher association cortices of the default mode network: the former to the dorsolateral & medial prefrontal cortices, & the latter to the inferior parietal & posterior cingulate cortices. Feedforward in this network could run through the arcuate fasciculus originating in the temporo-parietal junction, – also a short-cut vs. feedforward mediated by the thalamus.

Another reason for the preference for more direct experience may be that general concepts are located in areas that are topologically distant from primary cortices, thus less capable of hijacking them. Such preference also depends on the current intensity of value-loaded stimuli, as modulated by one’s subjective sensitivity to them. Sensitivity is increased by deprivation (vs. addiction) for positive stimuli, & by security (vs. vulnerability) for negative ones. Particularly during the formative years, executive attention span can be increased by broad intellectual exposure, if combined with low specific pressures & temptations.

A more stable / innate factor is the decay rate of stimuli propagating from primary into association areas of the neocortex. This rate is probably determined by the speed of dopamine reuptake at the synapses, axonal straightness & myelination, structural trade-offs within cortical minicolumns & the thalamus, & so on. Obviously, faster decay should correspond to a shorter attention span. For more speculation on such macro-factors see my “Generalist vs. Specialist bias” knol.
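As a hedged illustration of the last point, here is a toy model (Python; the exponential form & every number in it are assumptions, not measurements): if activation decays exponentially with each step up the cortical hierarchy, the highest level it can still drive above a threshold, a crude stand-in for the attention span discussed here, shrinks in inverse proportion to the decay rate.

```python
import math

# Toy model: activation decays exponentially as it propagates up the hierarchy,
#   a(level) = a0 * exp(-decay_rate * level),
# & the "reach" is the highest level still driven above a threshold.
# All parameter values are made up for illustration.

def reach(a0: float, decay_rate: float, threshold: float) -> float:
    """Highest level where a0 * exp(-k * level) >= threshold: ln(a0 / threshold) / k."""
    return math.log(a0 / threshold) / decay_rate

for k in (0.2, 0.5, 1.0, 2.0):
    print(f"decay rate {k:.1f} -> reach ~ level {reach(1.0, k, 0.05):.1f}")
# -> decay rate 0.2 -> reach ~ level 15.0
# -> decay rate 0.5 -> reach ~ level 6.0
# -> decay rate 1.0 -> reach ~ level 3.0
# -> decay rate 2.0 -> reach ~ level 1.5
```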

 

Practical implications

The most obvious way to develop focus is practice, which strengthens relevant representations & generates redundant copies to explore alternative scenarios. This also fills up cortical space, which suppresses internal distractions by starving them of resources. Our memories are not passive records; they keep competing for attention until well forgotten. But we need to *begin* practicing (over & over), & the only way to facilitate that is by controlling one’s environment. The most basic working “environment” is a notepad or a computer screen, so we fill them with well-designed write-ups of the subject matter. Obviously, we have plenty of memory for a few pages of text; the scarce resource here is attention. Writing down thoughts turns them into sensory inputs, which attract attention & make it much easier to maintain focus on the subject.

Of course, we’re social animals, & our most important “environment” is the people we deal with. Hence the urge to bounce our ideas & decisions off others: it forces us to focus on the implications. Your listener’s attention (if credible) stimulates yours, even if he doesn’t really contribute anything. A similar effect can be achieved by keeping diaries, or by writing blogs & knols that no one reads. Just as important is avoiding irrelevant distractions, AKA “life”. This is obvious, but a dressed-up ape doesn’t care, – there are bananas to be picked. ADD is a universal affliction at this stage of evolution. One solution is a socially-imposed institutional environment, as in a good university or company. But that requires “consumer competence”, which is sorely lacking in generalized fields. For the “less social” animals, the worst attention hog now is the web, & my solution is rationing: unless there’s something urgent or work-related (unlikely), I only connect once a day, for ~2 hours.

But even more insidious, at least for a generalist like me, are “internal” distractions: wandering thoughts. To focus consistently I have to condition myself to associate a specific desk, computer, & time of day with work & nothing else. Locking myself in for a fixed time works best. Such cognitive behavioral therapy, incidentally, is also the best way to deal with insomnia, as well as with other behavioral problems. Another way to improve focus is to stand up, – we evolved to think best on our feet. This might be one reason for the current popularity of mobile devices, but I have found that working at a standing desk is even better. A stable (boring) circadian cycle is also very important, as are equally boring exercise, a low-glycemic diet, & mild stimulants (tea).

Ideally, we should be able to selectively stimulate the cortical areas that represent the subject we want to focus on. This may become possible in the relatively near future by identifying subject-associated areas in individuals via transcranial imaging (my current bet is on infrared spectroscopy). Visualization of subject-specific cortical activity could then help us control it via neurofeedback. Identified areas might also be stimulated via transcranial direct current, magnetic fields, ultrasound, infrared lasers, or even implants. A recent study showed that transcranial DC stimulation of the right anterior temporal lobe improves “creative” solving of novel problems. I’d guess that applying the reversed polarity over the long term should improve general understanding of a subject of study, but this would be much more difficult to test, & would take too long for most to care. Even more effective should be stimulation of the left dorsolateral prefrontal cortex, which seems to contain the most general knowledge.

A longer attention span promotes the development of generalized motives, which also depends on value-specific conditioning. BCI (brain-computer interface) mediated control over the focus of one’s attention will be the most profound revolution yet, – it will change what we want out of life. Yet waiting for the technology will leave you hopelessly behind those who develop their focus the old-fashioned way.
