Sunday, 22 March 2009

'Over the hill' earlier than we thought?

Youth just got a lot shorter, as far as mental life is concerned.

That mental abilities tend to suffer as we get older, particularly after the age of 60, is a well-established fact. But according to a recent report in the journal Neurobiology of Aging, cognitive function appears to begin its decline at around 27, an age much earlier than traditionally accepted.

Dr. Timothy Salthouse from the University of Virginia combined various methods for assessing mental acuity, including tests of inductive reasoning, spatial visualization, episodic memory, and perceptual speed. Scores peaked at 22 and remained steady until 27, at which point the decline became statistically significant.

Salthouse claims his study accounts for potentially confounding factors, noting that the results remain significant after controlling for variables such as education level, self-rated health, number of medications taken per week, and self-reported measures of depression and anxiety.
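(For readers wondering what 'controlling for' a variable actually involves, here is a minimal, purely illustrative sketch – synthetic data and made-up coefficients, not Salthouse's dataset or analysis – of a multiple regression in which age is entered alongside covariates like education and self-rated health, so that the age slope reflects the association with test scores after those covariates are accounted for.)

```python
# Illustrative sketch only -- synthetic data, not Salthouse's dataset or model.
# Shows the idea of "controlling for" covariates: regress test score on age
# together with education and self-rated health, then read off the age slope.
import numpy as np

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(18, 60, n)
education = rng.uniform(10, 20, n)   # years of schooling (made up)
health = rng.uniform(1, 5, n)        # self-rated health, 1-5 scale (made up)

# Hypothetical data-generating process: score declines slightly with age
# and rises with education and health, plus noise.
score = 100 - 0.3 * age + 1.5 * education + 2.0 * health + rng.normal(0, 5, n)

# Design matrix with an intercept and all three predictors.
X = np.column_stack([np.ones(n), age, education, health])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)

print("age slope, controlling for education and health:", round(coef[1], 3))
```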

Most readers will probably find this research a bit unsettling (particularly those of us who are, apparently, already ‘past our prime’). However, a few important considerations may provide some comfort.

To begin with, the relationship between age and cognitive decline is correlational, meaning getting older doesn’t necessarily cause the loss of function. There could be many unaccounted-for variables responsible for the observed pattern, especially for a trait as complex and nebulous as human cognition.

So if age and the controlled-for factors aren’t the whole story, what plausible alternative could help explain the trend?

One unaddressed possibility is the effect of a Western, middle-class lifestyle. Cognitive demand and stimulation are not constant throughout our lifetimes. The first 18-25 years are spent learning, both academically and socially. But eventually our education stops (or slows significantly), and sometime during our mid-twenties we begin to settle into a career.

The highly specialized nature of Western society, coupled with the traditional model of working in the same field, industry, or company until retirement, means the average person doesn’t experience a whole lot of variety. For many, daily life ceases to offer substantial intellectual stimulation or novelty.

Now also consider the well-documented evidence that an enriched environment and high levels of mental stimulation can increase neural plasticity and improve cognitive function. Doctors have known for some time that social interaction and mental engagement can ameliorate the symptoms of dementia and improve cognitive test performance. Mental abilities, therefore, are not fixed; they can be enhanced or eroded by experience.

With this in mind, Salthouse’s research can be interpreted somewhat differently. The peak age for test scores was 22 – a time often associated with either the height of academic education or learning new skills in job training. The initial decline at 27 coincides with a point at which many people have begun their careers and stopped acquiring skills and knowledge at a rate comparable to the preceding years. The steady decline up to 60 and beyond may be associated with what sadly amounts to intellectual stagnation.

So loss of mental sharpness corresponds to a similar decrease in diversity of experience, which may be at least partially responsible for the reported pattern of test scores. Thus, instead of responding with panic or defeated acceptance in light of the connection between age and mental decay, we can take comfort in the fact that loss of cognitive abilities is not inevitable.

It may also be wise to contemplate whether the conventional way of life in our society adversely affects our intellectual lives. Humans thrive on variety. Unfortunately, the typical occupation is characterized more by monotony and tedium, providing little satisfaction for our multifarious psychological needs. Perhaps the vocational and professional structure of our culture could be adapted so that a person’s lifetime contribution to society is not to the detriment of their mental and emotional well-being.

Neanderthal rising

A team of scientists from the Max Planck Institute for Evolutionary Anthropology recently announced at the annual meeting of the American Association for the Advancement of Science that 63% of the Neanderthal genome has been successfully sequenced.

Neanderthals, humans’ closest relatives on the evolutionary tree of life, lived until around 30,000 years ago, when they disappeared abruptly from the fossil record. The evolutionary lines of humans and Neanderthals are believed to have split approximately 800,000 years ago.

This advance in genetics can potentially inform the study of human origins as well as the processes of evolution itself, as Svante Pääbo, the team’s head researcher, explained at the conference. By comparing Neanderthal DNA with that of modern humans, researchers can identify uniquely human portions of the genome that appear to have undergone positive selection, which could help us understand what genetic factors have contributed to the remarkable success of our species.
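(As a rough illustration of the kind of comparison involved – a toy sketch with invented sequences, not the team’s actual pipeline, which works on whole-genome alignments and applies statistical tests for selection – one simple first step is to flag aligned positions where the modern human base differs from both the Neanderthal and the chimpanzee base, marking them as candidate human-specific changes.)

```python
# Toy sketch with invented 20-base sequences -- not the consortium's method,
# which uses whole-genome alignments and population-genetic tests rather than
# a simple position-by-position scan.
human      = "ATGCCGTTACGATAGCGTAA"
neandertal = "ATGCCGTAACGTTAGCGTAA"
chimpanzee = "ATGCCGTAACGTTAGCGTAA"

def candidate_human_specific_sites(hum, nea, chp):
    """Positions where the human base differs from both comparison sequences."""
    return [i for i, (h, n, c) in enumerate(zip(hum, nea, chp))
            if h != n and h != c]

print(candidate_human_specific_sites(human, neandertal, chimpanzee))  # [7, 11]
```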

Thankfully, in their presentation the authors were careful to delimit precisely the range of appropriate inferences to be drawn from these findings and, for the most part, refrained from unwarranted speculation. Their measured comments, however, did not persuade the popular press to exercise the same restraint.

Most media outlets have focused on two topics: (1) the possibility of cloning a Neanderthal and (2) the fact that Neanderthals and humans share the same version of a particular gene, FOXP2.

Although Pääbo and colleagues gave the idea only a cursory and dismissive acknowledgement, a number of articles featured comments from a Harvard scientist not involved in the research claiming that a Neanderthal could be cloned for $30 million. Putting aside the obvious ethical and technological concerns, the notion that resurrecting a Neanderthal in the present could reveal what they were like in their own time rests on erroneous assumptions and a gross misunderstanding of biology and culture. The hallmark of human – and other higher primate – intelligence is the profound influence that experience and culture have on our potential capabilities. One can look to enculturated apes to see how radically cognitive abilities are enhanced simply by interacting with humans. A Neanderthal’s behaviour would presumably be affected far more by the powerful social, cognitive and physical resources made available by modern human culture. We could therefore draw few useful conclusions about the capacities and behaviours of the species as it existed in its own time and place. It would be like observing people in 21st-century industrialized nations and assuming ancient humans used iPods and microwaves.

FOXP2 is about as sexy as a gene gets, so it is no surprise the media chose to cover this darling of pop linguistics and psychology. Often inaccurately termed the ‘language gene’, FOXP2 is also linked to a suite of other cognitive and physiological traits. Nevertheless, because Neanderthals possess the human variant, it has been widely reported that we can now speculate about whether Neanderthals could speak. This line of reasoning is riddled with theoretical problems, but more important than its questionable validity is its sheer irrelevance. The critical issue is whether Neanderthals could communicate as humans do. Neanderthals might well have had a sophisticated communication system with little resemblance to much of human language, and that is a far more interesting possibility.

The widespread coverage of these topics is symptomatic of a tendency to distort scientific research for public consumption. More specifically, it also reflects the dominant genocentric view, which holds that genes have a special causal role in development and can be directly linked to complex behaviours, including language, mate choice, attitude, sexual preference and many others. This position is being challenged by multiple lines of evidence from many and varied disciplines. An interpretation from a non-genocentric perspective would be far more informative than fantastical stories about cloning. Despite this, I was unable to find a single article discussing alternative implications of reconstructing the Neanderthal genome.

The public needs and deserves to be accurately informed about scientific research – not merely entertained – because that research can ultimately affect their everyday lives. To be sure, the task of translating dense and specialized academic work into an accessible format is a difficult one. However, that process should not compromise the original meaning or result in an implicit endorsement of specific ideologies.