One of the most pronounced mental attributes of talented people is the ability to concentrate. The spiritual geniuses of the past emphasised the value of cultivating this ability, which we now know exercises various parts of the brain, increasing neuron activity and building new connections, which is how learning occurs. Rick Hanson, in his book The Buddha’s Brain, points out that the roots of mystical experience must also lie within the brain. The appearance of books such as this illustrates an encouraging trend: unusual states of mind are coming to be seen as having a biological basis, and are thus open to scientific investigation.
Concentration on divinity can take the everyday form of icons, which the religious associate with a divine or higher form of intelligence, or it can take the form of intense meditation, a capacity which must be built up gradually. The significance of this is profound: it indicates that an area of the brain can, in individuals predisposed to it in some way, become prepared for an experience of other worlds.
There is no doubt that a genetic or developmental component of some sort is required. There is, for example, no known way to raise an individual with a mental impairment to the level of a normal human being. During a crucial period of brain-building in the unborn child, neurons assemble and distribute themselves at a rate of tens of thousands per second. If the mother is intoxicated during this period, the result is fetal alcohol syndrome: instead of the brain carefully assembling itself, a shower of neurons rises, hits the skull and falls haphazardly into place. The result is a disordered and often violent personality which remains in that state, more or less, for life: a victim of their own mother.
Likewise, there is no way to raise a normal human being to the level of a genius. In fact, the distinct lack of genius in recent times has been pointed out by Scientific American (January 2011: “Where Have All the Geniuses Gone?”). It therefore seems unwise to assume an ordinary brain can make the leap all at once to a higher state of consciousness, any more than it could rise to the state of a Shakespeare or Einstein overnight, unless higher consciousness is deemed to be simply a happy state of mind. But the other-worldliness of the mystic’s experience, the insights marking a new phase in spiritual development, and the rarity of this class of individual over the past few thousand years, show that theirs was not merely a state of euphoria but a display of brain activity which the normal brain, even that of renowned writers and scholars, has been entirely unable to match. To claim, as writers such as Richard Dawkins often do, that such vast appeal can arise only from an easily swayed mass mind explains nothing of the difference, since modern personalities must also seek, and depend on, the same instinctive recognition.
The plasticity of the brain is now becoming recognised even by science. London taxi drivers, for example, must take a two-year course in which they drive the streets of the capital on a small motorcycle until they have memorised virtually every major thoroughfare and its intersections – an astounding feat. There is no doubt that taxi drivers begin with an interest in this kind of ability, indicating perhaps a predisposition of the brain from the start. Nevertheless, studies at University College London have shown that the parts of their brains responsible for spatial location have increased in proportion to their training, and the hippocampus, a structure deep within the temporal lobes, remarkably changes its structure and tissue distribution “to accommodate their huge navigational experience.”
Anyone who has worked in an office will realise that emails, phone calls and unending demands actually prevent individuals from concentrating for long. The Darwinian or Dawkinian solution is dismayingly simple-minded: individuals with the greatest ability to manage in these conditions will procreate more often, and increase the presence of their genes which have, by a lucky fluke, randomly altered, thus giving rise to a society more proficient at this lifestyle.
As was widely reported at the time, natural selection has already been discredited, in research published in Scientific American, October 2010. A study of certain variations in the genome showed that natural selection acts only when a single environmental force remains constant for tens of thousands of years – such a rare occurrence in life’s moment-to-moment struggle for survival as to make it useless for producing urgent biological refinements. When I raised this on an atheist forum, asking what the replacement theory was, I was asked, “why should we just give up on a theory which explains so much?” The answer, of course, is because we see that it isn’t true. The theory of super-refinement via random errors is a theory of nothing, and it deserves to be relegated to the bin in which the Flat Earth and the Edsel were dumped.
But it has another flaw, relevant to this topic: Darwinism, or Dawkinism, assumes that all conditions can be met with random mutations which increase procreation. Yet the kind of multitasking demanded by modern society is actually damaging to the brain. Far from allowing randomly mutated humans to increase their productivity, the resulting damage points to our inability to push an organ beyond its pre-existing properties, no matter what the pressure. Since these properties are beyond our control, humans must adapt to Nature, and not the other way around.
In Scientific American Mind (2004), Klaus Manhart cited research which showed conclusively that the habit of rapidly switching from one task to the next – taking a phone call while working on a spreadsheet while keeping an eye on the incoming emails, for example – results in an unnatural use of the brain’s circuitry, which, as it happens, causes lasting damage to short-term memory.
This is borne out by my own experience and that of colleagues in the business and computing world, where those in the positions of highest stress have the worst memories. While working in London in the 1990s, a colleague cheerfully refused to commute in heavy traffic to avoid “brain damage”. I began to apply this phrase to difficult clients and their chaotic offices, in which I was pressured to work on complicated software while left vulnerable to loud and jocular staff who would suddenly appear at my shoulder to ask “just one quick question” or bang loudly on the door “to see if I was busy”. It amused them that, having been in a state of intense concentration, I could be so startled that I would actually jump out of my chair.
As a result of this variation on the Chinese water torture, the structures I had been building mentally and translating to the database in a series of on-the-spot revisions were scattered to the four winds and had to be slowly built up again in my head, though frustratingly never with the grace of the pre-interruption design. Since a portion of my attention had to be devoted to bracing myself for the next amusing interruption, I would sometimes lock myself in the stiflingly hot office, only to be alerted instead by heavy banging on the door. I pleaded with the manager to collate all requests and edit out the frivolous or downright stupid ones. She agreed to stop knocking at the door; her solution instead was to appear at my ear randomly, like a ghost out of nowhere, and say “knock knock”.
On one memorable shift I worked for 36 hours straight. I arrived home and immediately fell asleep; after one hour the phone rang and I was asked how printer output could be redirected in an emergency. I replied that they could wait until tomorrow, or I could talk them through it and take an extra day off to catch up on my sleep: which did they prefer?
I maintained this effort for some years despite beginning to suspect I really was incurring some kind of brain damage. One day I found myself standing in the kitchen holding an egg in one hand, with absolutely no recollection of how I came to possess it, or what I intended to do next, and I realised I might be better off looking after my brain rather than kicking it to the kerb every day for the sake of money. Thereafter I restricted myself to working on software only at home, or after hours in London when the sudden distractions ceased and I could concentrate all night if I wanted to. Of course I can’t claim my experience constitutes scientific research – there was no laboratory, no control group or peer review, and no dense, lengthy and unreadable dissertation with an obscure reference quoted every five words – but there was the egg, which was good enough for me. After some years my short-term memory improved, and I’m pleased to say it has recovered enough for me to lead a fairly normal – you know, a normal sort of – whatever you call it. As Henry Thoreau was fond of saying, “some circumstantial evidence is very convincing, as when one finds a trout in the milk”.
According to Manhart, the brain does not function in the linear manner of a computer, and requires focus on one job at a time. Forcing it to shift rapidly from one task to the next and back again – the process experienced while, say, trying to talk on the phone and write at the same time – takes much longer than doing each job in turn and reduces focus on each, meaning that both jobs are done to a lower level of proficiency than normal. But apart from the performance issue, there is the stress of shifting jobs in and out of the cognitively active areas of the brain, which appears to cause physical damage to short-term memory.
According to Manhart, brain researcher Ernst Pöppel (Institute for Medical Psychology at the Ludwig Maximilian University in Munich) claims that brain multitasking is only an illusion, because the brain deals with three-second blocks of sensory input at a time. A computer chip dedicating a certain portion of its extremely rapid cycles to a given task is also time-slicing rather than magically doing several things at once; the difference is that the granularity – the smallest amount of time spent on each task – is exceedingly fine, and being well beneath the granularity of human attention, is as good as true multitasking from our point of view. An additional difference is that the machine is tuned for exactly this sort of behaviour. The brain is not and, obligingly trying to co-operate with our willpower, struggles to keep up. As Manhart points out, it behaves much more like an individual switching TV channels: absorbing one set of input, passing it to short-term memory, then switching to a different set of input, and so on.
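The difference in granularity can be sketched in a few lines of code. The sketch below is my own illustration, not drawn from Manhart or Pöppel: a toy round-robin scheduler interleaves two tasks, and the only thing that changes between the “CPU view” and the “brain view” is the size of the time slice – a millisecond-scale machine slice versus Pöppel’s three-second block.

```python
# Toy round-robin scheduler: both the chip and the brain time-slice,
# but only the chip's slices are fine enough to look simultaneous.

def round_robin(tasks, slice_ms):
    """Interleave tasks, slice_ms of work at a time, and return the
    order in which the pieces of work were executed."""
    timeline = []
    remaining = dict(tasks)  # task name -> milliseconds of work left
    while any(ms > 0 for ms in remaining.values()):
        for name in tasks:
            if remaining[name] > 0:
                work = min(slice_ms, remaining[name])
                remaining[name] -= work
                timeline.append((name, work))
    return timeline

# A CPU-like slice (1 ms) interleaves the tasks finely; a brain-like
# "slice" (3000 ms, the three-second block) handles one chunk at a time.
cpu_view = round_robin({"phone": 3, "email": 3}, slice_ms=1)
brain_view = round_robin({"phone": 3000, "email": 3000}, slice_ms=3000)
print(cpu_view)    # many tiny alternating slices
print(brain_view)  # two large blocks, one whole task at a time
```

Seen at human timescales the CPU’s timeline blurs into simultaneity, while the brain’s coarse blocks remain plainly sequential – which is the substance of Pöppel’s “illusion” claim.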
This effect seems to be confirmed by results from research teams at the Center for Cognitive Brain Imaging at Carnegie Mellon University. The scientists used an MRI machine to measure brain activity as subjects listened to sentences being read to them while at the same time mentally rotating two three-dimensional figures. What was striking was how brain activity dropped while the subjects tried to perform the two tasks: it was less than two thirds of the total devoted to the tasks when each was processed independently. “The human brain cannot simply double its efforts when there are two problems to solve at the same time,” concludes Marcel Just, leader of the study.
Another experiment, by psychologist David E. Meyer of the University of Michigan at Ann Arbor quantified just how much time we can lose when we shuttle among tasks. The researchers asked test participants to write a report and check their e-mail at the same time. Those individuals who constantly jumped back and forth between the tasks took about one and a half times as long to finish as those who completed one job before turning to another. Each switchover from one task to the next meant rethinking and thus involved additional neuronal resources. In effect, the brain needs time to shut off the rules for one task and to turn on the rules for another. “Multitasking saves time only when it is a matter of relaxed, routine tasks,” Meyer says.
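The arithmetic behind a figure like Meyer’s one-and-a-half-times is easy to reproduce. The numbers below are purely illustrative, my own and not Meyer’s data: the sketch simply charges a fixed re-orientation cost for every switch, so the work itself is unchanged while frequent interleaving inflates the total.

```python
# Illustrative switch-cost arithmetic (hypothetical numbers, not Meyer's
# data): total time is the raw work plus a fixed cost per task switch.

def total_time(work_per_task_min, n_tasks, switches, switch_cost_min):
    """Minutes to finish n_tasks jobs given a per-switch overhead."""
    return work_per_task_min * n_tasks + switches * switch_cost_min

# Two 30-minute jobs done one after the other: a single switch.
sequential = total_time(30, 2, switches=1, switch_cost_min=1)
# The same jobs interleaved every couple of minutes: ~29 switches.
interleaved = total_time(30, 2, switches=29, switch_cost_min=1)
print(sequential, interleaved, interleaved / sequential)  # prints: 61 89 1.459...
```

With these made-up but plausible figures the interleaved worker takes roughly one and a half times as long, even though not a single extra minute of real work was done – all the excess is the brain shutting off one set of rules and loading another.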
It also takes the brain longer to change gears when switching back to an interrupted task rapidly, as many multitaskers do, rather than waiting longer before switching back. A fall 2002 study from the National Institute of Mental Health found that the brain has to overcome “inhibitions” it imposed on itself to stop doing the original task in the first place. (Scientific American Mind 2004)
All of this would be only of casual interest if it did not result in some kind of damage along the way. People are unlikely to continue walking or running in a way that causes pain in their ankles or joints and leads to lasting damage. But the brain cannot experience pain – it can actually be cut apart by surgeons while the patient is fully conscious – so it is helpless to notify the individual, even if the individual is cutting merrily away with a scalpel of their own making. The only evidence we have of such damage is the reduced faculties and altered behaviour the brain expresses. Even this is not enough to prevent individuals from pushing themselves beyond their limits, as I experienced myself, in an attempt to cope with the manic pressures of business.
The sobering conclusion is this: that large segments of the population are cutting at the roots of their own personalities and mental abilities not in an abstract way, but by physically damaging the one organ on which their entire experience of this world depends: their brain!
By its nature, multitasking is stressful, and the area in the brain most involved with multitasking is also most affected by the resulting stress. Located right behind the forehead, the prefrontal cortex, which neuroscientists call the “executive” part of the brain, helps us assess tasks, prioritize them and assign mental resources. It also “marks” the spot at which a task has been interrupted, so we can return to it later. This area is affected by prolonged stress. Such stress can also affect brain cells in another region, the hippocampus, which is important for forming new memories and accessing existing ones. That damage makes it difficult for a person to acquire new skills and facts.
Pöppel does not recommend mental channel surfing. During such disjointed thinking, connections are lost, and as a result no lasting neuronal representation is created from the information so processed. “In this way, the brain is very conservative and protects itself,” the scientist warns.
Psychiatrists Edward Hallowell and John Ratey of Harvard say that multitasking can cause “pseudo-ADD,” which is different from ADD, attention deficit disorder. Those affected by pseudo-ADD constantly seek new information and have difficulties in concentrating on its content. (Scientific American Mind 2004)
So, as regards the question of Darwinism: far from being an organ ready and able to adapt to new demands, the brain has built-in tendencies which, when pushed by outside stresses in a contrary direction, cause under-performance and even damage to its sensitive neuronal arrangements.
The practice of serene reflection recommended by religion is concentration on a divine concept; such concentration is a natural activity for the brain and has the effect of strengthening both the behaviour and the underlying neuronal structures. Work in which jobs are switched rapidly back and forth means, as Pöppel points out, that permanent connections are not made, and the chance for learning is lost. Some pain, but no gain.
Where a society insists on a different use for the brain, it is like running backwards or standing on your hands – sustainable only for a short period of time, and even then likely to result in damage to an organism designed for something completely different. Random mutations will not enable us to convert to always running backwards or upside-down on our hands, even if we were to wait ten thousand years. We use what we are given, and adapt to what we have.
“Truth is a strange substance. It grows stronger in adversity, and enlarges itself in the face of opposition.” The atheism of Richard Dawkins, and the Communism of the former USSR represent two attempts at removing religion from the social landscape. But that social landscape emanates from nowhere except the human brain: The Buddha’s Brain shows that despite the deadening weight of materialist thought, or the active derision of the Dawkinist camp with all their hatred of religion, interest in an age-old discipline of turning away from the perishable trinkets of the Earth, and towards an unbounded inner world, is actually growing stronger.