It’s difficult to deny that humans began as Homo sapiens, an evolutionary offshoot of the primates. Nevertheless, for most of what is properly called “human history” (that is, the history starting with the invention of writing), most of Homo sapiens have not qualified as “human”—and not simply because they were too young or too disabled.
In sociology, we routinely invoke a trinity of shame—race, class, and gender—to characterize the gap that remains between the normal existence of Homo sapiens and the normative ideal of full humanity. Much of the history of social science can be understood as either directly or indirectly aimed at extending the attribution of humanity to as much of Homo sapiens as possible. It’s for this reason that the welfare state is reasonably touted as social science’s great contribution to politics in the modern era. But perhaps membership in Homo sapiens is neither sufficient nor even necessary to qualify a being as “human.” What happens then?
In his 1879 account of wanderings in the Orient, the travel writer James Hingston describes how, in West Java, he was treated to a bizarre experience:
I am taken by my kind host around his garden, and shown, among other things, a flower, a red orchid, that catches and feeds upon live flies. It seized upon a butterfly while I was present, and enclosed it in its pretty but deadly leaves, as a spider would have enveloped it in network.
What Hingston had seen was not a carnivorous orchid, as he thought. But the reality is no less weird or fascinating. He had seen – and been fooled by – an orchid mantis, Hymenopus coronatus, not a plant but an insect.
We have known about orchid mantises for more than 100 years. Famous naturalists such as Alfred Russel Wallace have speculated about their extraordinary appearance. Eschewing the drab green or brown of most mantises, the orchid mantis is resplendent in white and pink. The upper parts of its legs are greatly flattened and heart-shaped, looking uncannily like petals. On a leaf it would be highly conspicuous – but when sitting on a flower, it is extremely hard to see. In photos, the mantis appears in or next to a flower, challenging the reader to spot it.
When we fall ill we visit a clinic or a pharmacy. Our ancestors, however, didn’t have that luxury. Instead, early humans likely observed and learned from sick animals that healed themselves by eating certain plants. Yet only in the past two decades have biologists and chemists begun to recognize that animals do self-medicate: they select and use substances specifically to cure themselves of parasites and ailments.
Early accounts of animal self-medication came in the late 1980s from Michael Huffman, a primatologist at Kyoto University. His decades-long research on chimpanzees, which revealed that they use plant compounds to rid themselves of parasites, helped establish self-medication as a fundamental animal behavior.
“Any animal species alive today is alive in part because of its ability to adapt and to fight off diseases,” Huffman says. Self-medication does not require high intelligence; it likely began as a simple reaction to relieve an ailing symptom and evolved into strategies for expelling parasites. “Self-medication is a very basic behavior that’s important to the survival of so many species,” he says.
And animal self-medication points to a treasure larger than mere fascination. By following the animals’ lead, we tap into a medicine vault furnished by millions of years of natural selection. The world’s best bio-prospectors – the animals themselves – may very well show us new pharmaceuticals to improve the health of our livestock and ourselves.
What would it take for an animal to be considered a person? In a landmark court case that reached its conclusion in a New York State appellate court yesterday, a five-judge panel refused to grant legal personhood to a chimpanzee named Tommy. Their unanimous decision: He’s not a person, in spite of the best arguments put forward by a group called the Nonhuman Rights Project (NhRP).
Tommy’s owner keeps the chimp in a wire-mesh cage, inside a nondescript warehouse, in upstate New York. That’s not illegal, because it’s not illegal to own a chimpanzee in New York State. In the eyes of the law, Tommy isn’t a person – he’s property.
Tommy, the court ruled, “is not a ‘person’ entitled to the rights and protections afforded by the writ of habeas corpus” – the legal term for a petition urging a court to halt the unlawful detention of a prisoner.
The court decision ends this particular battle, but the legal wrangling, and the larger philosophical questions that swirl around human-animal relations, are sure to continue.
Why does it take so long for human children to grow up? A male chimp and a male human, for example, end up at roughly the same body weight, but they grow very differently: at year one the human weighs twice as much as the chimp, yet by eight the chimp weighs twice as much as the human. The chimp then reaches its adult weight by 12 – six years before the human. A male gorilla is also a faster-growing primate: a 330-pound male gorilla weighs 110 pounds by its fifth birthday and 265 pounds by its tenth.
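To make that contrast concrete, here is a minimal back-of-envelope tally in Python. Only the ratios at ages one and eight, and the gorilla figures, come from the text above; the absolute human and chimp weights are round assumptions chosen merely to satisfy those ratios.

```python
# Illustrative sketch only: the ratios at ages 1 and 8 are from the text;
# the absolute human/chimp pound figures are assumed round numbers.
milestones = [
    # (age in years, human lb, chimp lb)
    (1, 22, 11),   # human weighs twice the chimp
    (8, 55, 110),  # chimp weighs twice the human
]

for age, human, chimp in milestones:
    print(f"age {age}: human ~{human} lb, chimp ~{chimp} lb, "
          f"human/chimp = {human / chimp:.2f}")

# Gorilla figures quoted above: 110 lb at 5, 265 lb at 10, 330 lb as an adult.
for age, lb in [(5, 110), (10, 265)]:
    print(f"gorilla at {age}: {lb} lb, {lb / 330:.0%} of adult weight")
```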
Clues to the answer can be found in the young human brain’s need for energy. Radioactive tracers allow scientists to measure glucose use in different areas of the brain, but the procedure is used only rarely, when investigating neurological problems justifies it. Still, the few cases we do have reveal how radically different the childhood brain is from the adult or infant brain.
From about the age of four until puberty, the young brain guzzles glucose: the cerebral cortex, its largest part, uses nearly double (or even more than double) the glucose it consumes earlier or later in life. This creates a problem. A child’s body is a third the size of an adult’s, but its brain is nearly adult-sized. Calculated as a share of the body’s energy budget, a child’s brain takes up half of all the energy the child uses.
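A rough calculation shows how those two facts combine. The sketch below, in Python, assumes the adult brain takes about 20% of resting energy (a commonly cited ballpark) and that the rest of the body’s energy use scales linearly with its size; neither number comes from the studies above.

```python
# Back-of-envelope sketch, not measured data: why a nearly adult-sized,
# glucose-hungry brain can claim about half of a child's energy budget.
ADULT_TOTAL = 100.0        # adult resting energy use, in arbitrary units
ADULT_BRAIN_SHARE = 0.20   # assumption: adult brain uses ~20% of resting energy

adult_brain = ADULT_TOTAL * ADULT_BRAIN_SHARE   # 20 units
child_brain = 2.0 * adult_brain                 # childhood cortex burns ~double (from text)
child_body = (ADULT_TOTAL - adult_brain) / 3.0  # body ~1/3 adult size; energy assumed to scale with size

share = child_brain / (child_brain + child_body)
print(f"child brain share of total energy: {share:.0%}")  # ~60%, the same ballpark as the half quoted above
```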
Mixed breed. Mongrel. Roadside setter. A something-something. Dogs of uncertain provenance get called a lot of things. When the animal arrives at a shelter, staff usually can make only an educated guess about the dog’s parentage.
Most of the dogs at my local animal control are assessed as “pit mixes” upon arrival — including the three I’ve adopted over the past two years. But a pit bull isn’t a breed: it’s just a type of dog characterized by a short coat, muscular frame and broad, oversized head.
All three of my dogs clearly — at least to my eyes — showed signs of specific breeds somewhere in their heritage: Tall, snow-white Pullo looks like the breed standard for an American Bulldog. Tyche’s body is svelte like a boxer’s and inky black like some Labs’. And lanky, long-limbed Waldo sometimes bays like a hound, especially when treeing squirrels.
Guessing my dogs’ breeds was a fun parlor game, but I wanted more definitive answers. So I turned to science. And, well, let’s just say it’s a good thing I didn’t place any bets on what was in my dogs’ family trees.
In 2003, two young biology students called Justin Yeager and Mark Pepper were in Costa Rica studying poison dart frogs when their guide presented them with a pair of beautiful orange-yellow and black frogs. They were left speechless, because in front of them was a species that was no longer meant to exist.
The Variable Harlequin Frog, Atelopus varius, had disappeared from cool streams across Costa Rica and Panama in the early 1990s, leaving not even a corpse to mark its existence. Its vanishing, alongside myriad other frogs including the famed golden toad, was later attributed to the wave-like spread of a pandemic pathogen – a fungus responsible for the greatest disease-driven loss of biodiversity in our times – against a backdrop of a changing climate and dwindling and damaged habitats.
In the wake of such carnage, was the variable harlequin frog a lone survivor? Could it increase our understanding of the current mass extinction and help us stem the hemorrhaging of life from our planet?
The harlequin frog, it would turn out, was not alone. Five years after its rediscovery, herpetologist Robert Puschendorf was crashing through the dry forests of northern Australia when he found a small population of the Armored Mist Frog, Litoria lorica, living with the very same chytrid fungus believed to have wiped it out 17 years earlier. The following year, in New South Wales, the Yellow-spotted Bell Frog, Litoria castanea, hopped back to life after 30 years without a trace. Back in the Americas, Lazarus frogs were reappearing in Ecuador, Venezuela, Colombia and Costa Rica, years and even decades after they were thought to have been wiped out.
When you take a sip of water it doesn’t just slake your thirst. It literally becomes you. The water that runs down your gullet will, within minutes and without processing of any kind, become some of the dominant fluid in your veins and your flesh. Most of your blood is simply tap water with cells, salts, and organic molecules floating in it. Some of the rubbery squishiness of your earlobe poured out of a bottle or a can just a short time ago. And much of the moisture in your eyes only recently fell from rainclouds.
Your mouth is the portal through which water normally enters your body, but you are quite a leaky vessel. A hydrogen isotope study published in the British Journal of Sports Medicine reported that the sedentary men under examination consumed and lost about seven pints of body water per day, with four pints leaving through urine and two or three pints through sweat and breath moisture. Vigorous exercise can boost non-urine water losses to one or two pints per hour.
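For a sense of scale, here is a quick tally of that budget, sketched in Python; the 2.5-pint figure simply splits the study’s ‘two or three pints’ estimate.

```python
# Quick illustrative tally of the daily water budget described above (pints).
losses = {"urine": 4.0, "sweat and breath moisture": 2.5}  # 2.5 = midpoint of "two or three"

print(f"baseline daily turnover: ~{sum(losses.values()):.1f} pints")

# Vigorous exercise can add one to two pints per hour of non-urine loss.
for hours in (1, 2):
    print(f"after {hours} h of hard exercise: "
          f"+{1 * hours} to +{2 * hours} pints above baseline")
```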
Now let’s see what logic can do with those facts. Nearly two-thirds of your weight comes from water, and your body is an eddy in a stream of that common fluid. Surely the liquid that you slurp from a fountain is not alive, and you don’t consider it murder to stomp on a puddle of water. Therefore most of you is not alive at all, nor is it even permanent or unique enough to merit a personal name.
The past few decades have seen enormous progress in synthetic biology – the idea that simple biological parts can be tweaked to do our bidding. One of the main targets has been hacking the biological machinery that nature uses to produce chemicals. The hope is that, once we understand enough, we might be able to design processes that convert cheap feedstocks, such as sugars and amino acids, into drugs or fuels. These production lines could then be installed in microbes, effectively turning living cells into factories.
Taking a leap in that direction, researchers from Stanford University have created a version of baker’s yeast (Saccharomyces cerevisiae) that contains genetic material from the opium poppy (Papaver somniferum), bringing the morphine microbial factory one step closer to reality. These results, published in the journal Nature Chemical Biology, represent a significant scientific success, but eliminating the need to grow poppies may still be years away.
You are not alone. Your body is a collection of microbes, fungi, viruses… and even other animals. In fact, you aren’t even the only animal using your face. Right now, in the general vicinity of your nose, there are at least two species of microscopic mites living in your pores. You would expect scientists to know quite a lot about these animals (given that we share our faces with them), but we don’t.
Here is what we do know: Demodex mites are microscopic arachnids (relatives of spiders and ticks) that live in and on the skin of mammals – including humans. They have been found on every mammal species where we’ve looked for them, except the platypus and its odd egg-laying relatives.
Often mammals appear to host more than one species, with some poor field mouse housing four mite species on its face alone. Generally, these mites live out a benign coexistence with their hosts. But if that fine balance is disrupted, they are known to cause mange amongst our furry friends, and skin ailments like rosacea and blepharitis in humans. Most of us are simply content – if unaware – carriers of these spindly, eight-legged pore-dwellers.
Scientists from NC State, the North Carolina Museum of Natural Sciences, and the California Academy of Sciences have just published a study that uncovers new truths about these little-known mites – all while providing a glimpse into even bigger mysteries yet to be solved.