Sunday 22 March 2009

'Over the hill' earlier than we thought?

Youth just got a lot shorter, as far as mental life is concerned.

That mental abilities tend to suffer as we get older, particularly after the age of 60, is a well-established fact. But according to a recent report in the journal Neurobiology of Aging, cognitive function appears to begin its decline at around 27, an age much earlier than traditionally accepted.

Dr. Timothy Salthouse from the University of Virginia combined various methods for assessing mental acuity, including tests of inductive reasoning, spatial visualization, episodic memory, and perceptual speed. Scores peaked at 22 and remained steady until 27, at which point the decline became statistically significant.

Salthouse claims that his study accounts for potentially confounding factors, noting that the results remain significant even after controlling for variables such as education level, self-rated health, number of medications taken per week, and self-reported measures of depression and anxiety.
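To make the notion of ‘controlling for’ these variables concrete, here is a minimal sketch of the general technique – a regression of test score on age with the covariates included – using invented data and hypothetical variable names; it is an illustration of the method, not Salthouse’s actual analysis.

    # Illustrative only: hypothetical data and variable names, not Salthouse's dataset or model.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "age": rng.integers(18, 80, n),
        "education_years": rng.integers(10, 20, n),
        "self_rated_health": rng.integers(1, 6, n),
        "medications_per_week": rng.integers(0, 10, n),
        "depression_score": rng.normal(0, 1, n),
    })
    # Simulate a score that declines gently with age, plus noise.
    df["cognitive_score"] = 100 - 0.3 * df["age"] + rng.normal(0, 5, n)

    # If the coefficient on age remains significantly negative with the
    # covariates in the model, the decline is not explained away by them.
    model = smf.ols(
        "cognitive_score ~ age + education_years + self_rated_health"
        " + medications_per_week + depression_score",
        data=df,
    ).fit()
    print(model.summary())

(In this toy example the age effect is significant by construction; the point is simply how covariates enter such a model.)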

Most readers will probably find this research a bit unsettling (particularly those of us who are, apparently, already ‘past our prime’). However, a few important points to consider may provide some comfort.

To begin with, the relationship between age and cognitive decline is correlational, meaning getting older doesn’t necessarily cause loss of function. There could be many unaccounted-for variables responsible for the observed pattern, especially for a trait as complex and nebulous as human cognition.

So if age and the controlled-for factors aren’t the whole story, what likely alternative could help to explain the trend?

One unaddressed possibility is the effect of a western, middle-class lifestyle. Cognitive demand and stimulation are not constant throughout our lifetimes. The first 18-25 years are spent learning, both academically and socially. But eventually our education stops (or slows significantly), and sometime during our mid-twenties we begin to settle into a career.

The highly specialized nature of western society, coupled with the traditional model of working in the same field, industry or company until retirement, means the average person doesn’t experience a whole lot of variety. For many, daily life ceases to offer substantial intellectual stimulation or novelty.

Now also consider the well-documented evidence that an enriched environment and high levels of mental stimulation can increase neural plasticity and improve cognitive function. Doctors have known for some time that interaction and mental engagement can ameliorate or reverse dementia and improve cognitive test performance. Mental abilities, therefore, are not fixed; they can improve or deteriorate with experience.

With this in mind, Salthouse’s research can be interpreted somewhat differently. The peak age for test scores was 22 – a time often associated with either the height of academic education or learning new skills in job training. The initial decline at 27 coincides with a point at which many people have begun their careers and stopped acquiring skills and knowledge at a rate comparable to the preceding years. The steady decline up to 60 and beyond may be associated with what sadly amounts to intellectual stagnation.

So loss of mental sharpness corresponds to a similar decrease in diversity of experience, which may be at least partially responsible for the reported pattern of test scores. Thus, instead of responding with panic or defeated acceptance in light of the connection between age and mental decay, we can take comfort in the fact that loss of cognitive abilities is not inevitable.

It may also be wise to contemplate whether the conventional way of life in our society adversely affects our intellectual lives. Humans thrive on variety. Unfortunately, the typical occupation is characterized more by monotony and tedium, providing little satisfaction for our multifarious psychological needs. Perhaps the vocational and professional structure of our culture could be adapted so that a person’s lifetime contribution to society is not to the detriment of their mental and emotional well-being.

Neanderthal rising

A team of scientists from the Max Planck Institute for Evolutionary Anthropology recently announced at the annual meeting of the American Association for the Advancement of Science that 63% of the Neanderthal genome has been successfully sequenced.

Neanderthals, humans’ closest relative in the evolutionary tree of life, lived until around 30,000 years ago but disappeared abruptly from the fossil record. It is believed the evolutionary lines of humans and Neanderthals split approximately 800,000 years ago.

This advancement in genetics can potentially inform the study of human origins as well as the processes of evolution itself, as Svante Pääbo, head researcher for the team, explained at the conference. By comparing Neanderthal DNA to that of modern humans, unique portions of the human genome that appear to have undergone positive selection can be identified, which could help us to understand what genetic factors have contributed to the remarkable success of our species.
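As a toy illustration of the comparative logic (and not the Max Planck team’s actual methods), one can align homologous sequence from a modern human, a Neanderthal and an outgroup such as the chimpanzee and flag positions where the human base differs from both of the others, hinting at a change on the human lineage; the short sequences below are invented.

    # Toy example with invented sequences; real analyses use genome-scale
    # alignments and statistical tests for positive selection.
    human       = "ATGCCGTAACGT"
    neanderthal = "ATGACGTAACGT"
    chimp       = "ATGACGTAACGT"

    def human_specific_sites(h, n, c):
        """Positions where the human base differs from both other sequences."""
        return [i for i, (hb, nb, cb) in enumerate(zip(h, n, c))
                if hb != nb and hb != cb]

    print(human_specific_sites(human, neanderthal, chimp))  # -> [3]

Sites where humans and Neanderthals share a variant that the outgroup lacks are, conversely, candidates for changes that predate the split between the two lineages.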

Thankfully, in their presentation the authors were careful to delimit precisely the range of appropriate inferences to draw from these findings and, for the most part, refrained from unwarranted speculation. Their measured comments, however, did not persuade the popular press to exercise similar restraint.

Most media outlets have focused on two topics: (1) the possibility of cloning a Neanderthal and (2) the fact that Neanderthals and humans share the same version of a particular gene, FOXP2.

Although Pääbo and colleagues gave the idea only a cursory and dismissive acknowledgement, a number of articles featured comments by a Harvard scientist not involved in the research claiming that a Neanderthal could be cloned for $30 million. Putting aside the obvious ethical and technological concerns, the notion that resurrecting a Neanderthal in the present could tell us what Neanderthals were like in their own time rests on erroneous assumptions and a gross misunderstanding of biology and culture. The hallmark of human – and other higher primate – intelligence is the profound influence that experience and culture have on our potential capabilities. One can look to enculturated apes to see how radically cognitive abilities are enhanced simply by interacting with humans. A Neanderthal’s behaviour would presumably be affected far more profoundly by the powerful social, cognitive and physical resources made available by modern human culture. Thus, we could draw few useful conclusions about the capacities and behaviours of the species as it existed in its own time and place. It would be like observing people in 21st-century industrialized nations and assuming ancient humans used iPods and microwaves.

FOXP2 is about as sexy as a gene gets, so it is no surprise the media chose to cover this darling of pop linguistics and psychology. Often inaccurately termed the ‘language gene’, FOXP2 is in fact linked to a suite of other cognitive and physiological traits. Nevertheless, because Neanderthals possess the human variant, it has been widely reported that this licenses speculation as to whether Neanderthals could speak. This line of reasoning is riddled with theoretical problems, but more important than its questionable validity is its sheer irrelevance. The critical issue is whether Neanderthals could communicate as humans do. Neanderthals might well have had a sophisticated communication system with little resemblance to human language, and this is a far more interesting possibility.

The widespread coverage of these topics is symptomatic of a tendency to distort scientific research for public consumption. More specifically, it reflects the dominant genocentric view, which holds that genes have a special causal role in development and can be directly linked to complex behaviours, including language, mate choice, attitude, sexual preference and many others. This position is being challenged by multiple lines of evidence from many and varied disciplines. An interpretation from a non-genocentric perspective would be far more informative than fantastical stories about cloning. Despite this, I was unable to find a single article discussing alternative implications of reconstructing the Neanderthal genome.

The public needs and deserves to be accurately informed on scientific research – not merely entertained – because it can ultimately affect their everyday lives. To be sure, the task of translating dense and specialized academic work into an accessible format is a difficult one. However, that process should not compromise the original meaning or result in an implicit endorsement of specific ideologies.

Sunday 15 February 2009

Why Al Gore is not a nutcase

Not very long ago Al Gore challenged the United States ‘to commit to producing 100 percent of our electricity from renewable energy and truly clean carbon-free sources within 10 years.’ Gore’s attempts to convince the American people that change can and must occur have been aggressively mocked by political opponents, and his ambitious challenge is generally dismissed as inconceivable even by those who are sympathetic to the cause. But there is a lot more substance to his sometimes emotional or patronizing pleas for humanity to deal with the existential crisis of climate change than many prominent figures realize or would like to acknowledge publicly.

There is no denying that Gore’s expectations are high. The breadth and depth of institutional and individual change required to avert ecological disaster is staggering; he is suggesting that the notoriously distractible, apathetic and oftentimes uninformed (one could also argue misinformed) citizens of the US come together in some sort of unprecedented, concerted effort to ensure the livelihood of not only people they may never know in their lifetimes, but also members of completely separate species. Admittedly, he is asking a lot.

One reason to heed Gore’s (sometimes tedious) calls for action stems from the nature of the crisis itself. It would be wrong to judge the necessity for change based on the feasibility of its occurring. Downplaying the potential devastation of climate change – and our ability to counteract it – might make us all feel less guilty for failing to adjust our way of life, but it does not, unfortunately, change the fact that the straits are bona fide DIRE. More and more prominent scientists are now publicly expressing concern and stressing the need for immediate and dramatic action. James McCarthy, President of the American Association for the Advancement of Science, recently stated that the window for preventing irrevocable damage to the environment extends only over the next four years. At this point it seems safe to conclude there is no question as to the severity and urgency of the problem: unless radical measures are taken to combat current trends, there will be grave consequences for ourselves and the rest of the biological world.

This brings us to the issue of feasibility. Gore optimistically envisions a nation of socially, economically, geographically and politically disparate inhabitants working together towards a goal that will involve considerable lifestyle modification and potentially a fair amount of self-sacrifice. Even to those less inclined to cynicism this scenario would appear unlikely, if not outright preposterous. However, the intuitive and seemingly obvious negative evaluation stems from a common misunderstanding of human behavior and culture, along with a shortsighted view of history.

People have a tendency to think of human existence as fixed and chronologically homogeneous. This conception gives rise to statements like, ‘Things have always been this way, so there’s no changing them’, or the slightly more insightful, ‘Things are better than they’ve ever been, so we should just stick with what we’ve got’. For many organisms on planet earth, the first statement holds, for the most part (obviously, all organisms and environments change through evolution, but at a speed sufficiently slow to not be at issue here). The hallmark of humans, however, is an additional level of evolution, that of culture. Because humans can so adeptly master, manipulate and alter their own environments, what constitutes ‘human experience’ has been forever shifting. If we look at the last 10,000 years alone – a mere fraction of the history of our species – social organization, technology, scientific understanding, belief systems, moral and ethical codes, and virtually all other aspects of human culture have undergone immense change and elaboration.

The important lesson to take from the science of culture and cultural evolution is that our behaviors and capacities are not immutable features of our biology, but are instead flexible and open to innovation. Although cultural evolution has, for the most part, remained undirected, it does not follow that exercising some control over its course is impossible (rebutting the second statement above). Paul Ehrlich, an award-winning biologist, has for some time commented on the potential for harnessing the power of cultural change to address problems facing society, most recently with regard to climate change. The task of adapting our behavior and infrastructure to avert environmental disaster is daunting, to be sure. But the forces responsible for massive ecological destruction are the same ones that offer a solution. Through deliberate cultural adaptation, it is possible to implement the policies and practices essential to the survival of our civilizations, as well as of our fellows in the biological tree of life.

Al Gore’s vision of radical institutional change should not be derided as pure fantasy. An awareness of the mechanisms of cultural evolution provides an opportunity to consciously shape our world and how we interact with it. Once the potential for successful societal transformation is recognized, developing appropriate and effective solutions becomes much more realistic.

Tuesday 3 February 2009

What can a videogame-playing chimpanzee tell us about ourselves? Quite a lot.

The 20th century saw many attempts by scientists (and pseudoscientists) to teach human language to primates of various sorts. Although a number of intriguing examples warrant attention, I will focus on the most remarkable of them: the bonobo Kanzi. (Bonobos (Pan paniscus) are a distinct species of the genus Pan, differing morphologically and behaviorally from the common chimpanzee (Pan troglodytes).)

Kanzi is famous (infamous is not totally inappropriate) within the fields of linguistics, psychology, and primatology, among others, and his celebrity has even extended into the realm of pop culture (a keyboard duet with Peter Gabriel is not to be missed). Through a fluke of scientific experimentation, Kanzi learned to communicate with visual symbols (so-called ‘lexigrams’, arbitrary images representing words or concepts) and to understand spoken English. It isn’t necessary to go into the details, except to note that Kanzi was not taught language explicitly; instead, his communicative skills were acquired simply through interacting with humans in a social context. Caregivers talked to Kanzi and made visual symbols available to represent functional aspects of his environment, and in doing so induced in him the ability to comprehend linguistic material, as well as to produce signals – through lexigrams or gesture – to communicate meanings back to his caregivers.

The question of whether Kanzi has acquired ‘human language’ is hotly debated, and I will not attempt to address it here. The crucial point is that, even if we judge that Kanzi is not exhibiting a specifically linguistic capacity, his exceptional behaviors differ radically from those observed in wild bonobo populations. Kanzi is able to interpret speech, communicate about things apart from the here-and-now, and engage in activities as modern as playing Pac-Man. His behavioral repertoire is highly sophisticated, exhibiting many abilities that would otherwise be considered the domain of humans alone. It is quite evident that this unique developmental environment has significantly augmented his cognitive capacities.

Kanzi’s experience illustrates the hallmark of primate intelligence: the ability of highly social and inquisitive creatures not only to adjust to their environment, but also to exploit and internalize the structure and tools made available to them. Kanzi’s early and continual immersion in a rich, culturally constructed human setting enhanced his cognitive abilities beyond what anyone would have expected. We can contrast his experience with that of, say, a domestic cat. Cats are raised in the same sort of setting, with care and attention. Many owners even talk to their cats. But no cat has ever mastered even the simplest aspects of human communication, let alone seemed interested in doing so. Kanzi shows that there is something special about the primate lineage, and humans are an even more striking case than bonobos.

Human behavioral plasticity is one of our most defining characteristics. To understand the extent to which we can enhance our own abilities, we need only look to the historical record. Think about how much the functioning of society relies on written language – a cultural innovation that arose quite recently in human history. Orthography has allowed us not only to increase the cognitive capacities of individuals, but also to expand the collective store of cultural knowledge, from which we have continued to develop new tools that in turn have enhanced individual and societal achievement. The cycle is a self-sustaining and self-enhancing feedback loop. Use of orthography is not a trait bestowed on us by our genes or any other organism-internal mechanism. Yet all normal children, given exposure, manage to acquire these skills, often to the point that they become second nature.

A lesson we can take from all this is that human potential is crucially dependent on experience; what humans are capable of attaining is determined in large part by what is available in the developmental environment to be acquired, adjusted and co-opted. One implication that follows from these insights is the debunking of the so-called ‘American Dream’, which holds that anyone can ‘make it’ because equal opportunity is available to all. But opportunity is bound by potential, and as is evident, human potential is heavily influenced by experience – not determined by some kind of inherent, resilient ‘scrappiness’. Children raised in perceptually and intellectually rich, stimulating environments with attentive and informed caregivers are, from the beginning, poised to fulfill the range of possibilities available to them. Conversely, children in less fortunate circumstances are equally affected by external forces. Severe deprivation, abuse or neglect, as well as seemingly more innocuous factors like limited parental interaction, will irrevocably impact development and ultimately affect behavior in adulthood. Moreover, the relationship between developmental environment and adult behavior is not limited to potential intelligence. Humans are psychologically and emotionally complex beings, and these same factors will also influence beliefs, desires, values, judgments and other variables that determine how we operate in the world. The difference between a positive, enriched upbringing and a depressed or even tragic one often corresponds directly to differing socio-economic conditions. Thus, it is a grave error to assume that overcoming disadvantage is a trivial, or even likely, affair. Not only is escape from poverty impeded by multiple economic and social barriers, but the desires and motivations to escape are themselves potentially stymied, further decreasing the likelihood that a person will manage to improve his or her situation.

From a global perspective, these concerns become especially pressing. Unfavorable conditions in the United States pale in comparison to the unthinkable suffering caused by violence, war, famine, disease and political unrest experienced by millions of people around the world. It is sobering to consider the developmental consequences for children born into such situations, which will ultimately direct the course of these societies and perpetuate the destitution that created them.

Just as the cycle of cultural accumulation and elaboration can yield extraordinary advancements, its powerful self-enhancing properties can equally foster, sustain and amplify the more iniquitous aspects of the human condition. The machinery of this cycle is distributed across individuals, societies and historical timescales, which renders its operation abstruse and elusive, leaving it free to insidiously influence the course of human culture. When we pair this cycle with the embedded and interacting one of individual development, it becomes clear that events of the past will have nonlinear and often unexpected effects on current conditions. The legacies of persecution, oppression and marginalization are not a passive, historical backdrop, but instead are manifested anew each generation through the mechanisms of cultural transmission and learning. We cannot, therefore, assume that damages will vanish with those who suffered and perpetrated them directly, and we must devise methods to actively counteract the effects that persist to this day.

The sciences of culture and human behavior are urgently germane to solving the injustices that plague so much of humanity. We cannot fashion solutions until we fully understand the problems at hand. Recognizing that human development and cultural evolution are intricately intertwined and inter-dependent processes requires a change in how we view the causes of societal woes like inequality, poverty, crime and many others. Taking this knowledge into account can elucidate the true roots of social and individual disparity and guide us to effective approaches to combating their origination and spread.