This article is reposted from the old WordPress incarnation of Not Exactly Rocket Science.
Attention-deficit hyperactivity disorder is the most common developmental disorder in children, affecting between 3% and 5% of the world's school-going population. As the name suggests, kids with ADHD are hyperactive and easily distracted; they are also forgetful and find it difficult to control their own impulses.
While some evidence has suggested that ADHD brains develop in fundamentally different ways to typical ones, other results suggest that they simply lag behind the normal timetable for development.
Now, Philip Shaw, Judith Rapoport and others from the National Institute of Mental Health have found new evidence to support the second theory. Their results suggest that ADHD arises when some parts of the brain stick to their normal developmental timetable while others lag behind.
The idea isn't new; earlier studies have found that children with ADHD show brain activity similar to that of slightly younger children without the condition. Rapoport's own group had previously found that the brain's four lobes developed in much the same way, regardless of whether children had ADHD or not.
But looking at the size of entire lobes is a blunt measure that, at best, provides a rough overview. To get a sharper picture, they used magnetic resonance imaging to measure the brains of 447 children of different ages, often at more than one point in time.
At over 40,000 points across the brain, they measured the thickness of each child's cerebral cortex, the brain's outer layer, where its most complex functions like memory, language and consciousness are thought to lie. Half of the children had ADHD, and using these measurements, Shaw could work out how their cortices differed from those of typical children as they grew up.
We all know them – supremely confident, arrogant people with inflated views of themselves. They strut and swagger, seemingly impervious to critical opinions, threats of failure or the glare of self-awareness. You may be able to tell that I don't like such people very much, which is why new research by Sander Thomaes of Utrecht University makes me smirk.
Thomaes found that people with unrealistically inflated opinions of themselves, far from proving more resilient in the face of social rebuffs, actually suffer more from them. Some psychologists hold that "positive illusions" provide a mental shield that buffers their bearers against the threats of rejection or criticism. But according to Thomaes, realistic self-awareness is a much healthier state of mind.
He studied a group of 206 children aged 9-12, a point in life when popularity and acceptance among your peers seem all-important. Every child rated how much they liked each of their classmates on a scale from zero (not at all) to three (very much). They also predicted the rating that each classmate would give them. The two scores were only moderately related to one another (a correlation of 0.52), and the difference between them provided a measure of each child's self-awareness. Kids with inflated egos had positive differences, while those with negative scores thought worse of themselves than their peers actually did.
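For illustration only, here is a minimal sketch of how such a self-perception bias score could be computed from the two sets of ratings. This is not the study's actual analysis code; the function name, example ratings and class size below are hypothetical.

# Hypothetical sketch of the self-perception bias measure described above:
# a child's average predicted rating (how liked they expect to be, 0-3)
# minus the average rating their classmates actually gave them (0-3).
def self_perception_bias(predicted, received):
    # Positive = inflated self-view; negative = the child underrates
    # how much their peers like them; near zero = accurate self-awareness.
    return sum(predicted) / len(predicted) - sum(received) / len(received)

# Hypothetical example: a child expects high ratings but receives middling ones.
predicted_by_child = [3, 3, 2, 3, 3]   # ratings the child expects from five classmates
received_from_peers = [1, 2, 1, 2, 2]  # ratings those classmates actually gave
print(self_perception_bias(predicted_by_child, received_from_peers))  # roughly 1.2, an inflated self-view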
Two weeks later, Thomaes brought back all the children for an experiment. They were told that they would be taking part in the Survivor Game – an online popularity contest where groups of four players had to complete a personal profile, and a panel of peers would vote out the person they liked the least. The game was a front – in reality, half of the children were randomly told that they were least liked and voted out, while the other half were simply told that this dishonour had befallen someone else.
The autism spectrum disorders (ASDs), including autism and its milder cousin Asperger syndrome, affect about 1 in 150 American children. There's a lot of evidence that these conditions have a strong genetic basis. For example, identical twins, who share the same DNA, are much more likely to both develop similar autistic disorders than non-identical twins, who only share half their DNA.
But the hunt for mutations that predispose people to autism has been long and fraught. By looking at families with a history of ASDs, geneticists have catalogued hundreds of genetic variants that are linked to the conditions, each differing from the standard sequence by a single ‘letter’. But all of these are rare. Until now, no one has discovered a variant that affects the risk of autism and is common in the general population. And with autistic people being so different from one another, finding such mutations seemed increasingly unlikely. Some studies have come tantalisingly close, narrowing down the search to specific parts of certain chromosomes, but they’ve all stopped short of actually pinning down individual variants.
This week, American scientists from over a dozen institutes have overcome this final hurdle. By scanning the genomes of over 10,000 people, the team narrowed their search further and further until they found not one but six common genetic variants tied to ASDs. This sextet probably affects the activity of genes that connect nerve cells to one another in the developing human brain.
To all appearances, this looks like the skull of any human child. But there are two very special things about it. The first is that its owner was clearly deformed; the skull's asymmetry is a sign of a medical condition called craniosynostosis that's associated with mental retardation. The second is that the skull is about half a million years old. It belonged to a child who lived in the Middle Pleistocene period.
The skull was uncovered in Atapuerca, Spain, by Ana Gracia, who has named it Cranium 14. It's a small specimen, but it contains enough evidence to suggest that the deformity was present from birth and that the child was about 5-8 years old. The remains of 28 other humans have been recovered from the same site, and none of them showed any signs of deformity.
These facts strongly suggest that prehistoric humans cared for children with physical and mental deformities that would almost certainly have prevented them from caring for themselves. Without such assistance, it's unlikely that the child would have survived for as long as it did.