This article was originally published on The Conversation.
The Earth seems to have been smoking a lot recently. Volcanoes are currently erupting in Iceland, Hawaii, Indonesia and Mexico. Others, in the Philippines and Papua New Guinea, erupted recently but seem to have calmed down. Many of these have threatened homes and forced evacuations. But for those watching the eruptions from a safe distance, they may have raised a question: Is there such a thing as a season for volcanic eruptions?
Surprisingly, this may be a possibility. While volcanoes may not have “seasons” as we know them, scientists have started to discern intriguing patterns in their activity.
It’s popular to talk about how the original Star Trek, set in the 23rd century, predicted many devices that we’re using already here in 2014. It started with communicators that manifested as flip-open cell phones that many already consider too primitive, moved through computers that talk and recognize human voices and provide instant translation (all of which are constantly improving), to medical applications such as needle-free injection, anti-radiation drugs, and a medical tricorder.
But looking at the more exotic Star Trek technologies, it’s harder to find credible reports that we’re close to a Trek-like world. This is true for Star Trek’s transporter: Despite some success in “quantum teleportation,” which could have applications for computers and possibly communication technology, no experts are saying that this is about to lead to a technology for beaming humans or any other objects from place to place.
It’s also true for space travel. Star Trek depicted a world where people would move between planets and star systems (at least nearby systems) frequently and very swiftly. The United Federation of Planets contains worlds separated by dozens of light-years, which ordinary Earthlings regularly traverse over time periods measured in days to weeks.
Clearly that’s one aspect of Star Trek technology that is far from being a reality in the present day. But the topic isn’t just in the realm of sci-fi: Scientists are taking various approaches to try to create the next generation of space propulsion, beyond the chemical rockets that require most of the mass of the ship to be fuel.
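The mass problem mentioned above can be made concrete with the Tsiolkovsky rocket equation, Δv = vₑ ln(m₀/m_f). The figures below are illustrative assumptions, not numbers from the article: roughly 9.4 km/s of delta-v to reach low Earth orbit, and about 4.4 km/s of exhaust velocity for an efficient hydrogen-oxygen engine.

```python
import math

# Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / mf)
# Assumed, illustrative values (not from the article):
delta_v = 9.4   # km/s, approximate delta-v needed to reach low Earth orbit
v_e = 4.4       # km/s, approximate exhaust velocity of a hydrogen-oxygen engine

# Rearranging gives the required ratio of launch mass to final (dry) mass
mass_ratio = math.exp(delta_v / v_e)
propellant_fraction = 1 - 1 / mass_ratio

print(f"Mass ratio (launch mass / dry mass): {mass_ratio:.1f}")
print(f"Propellant fraction at launch: {propellant_fraction:.0%}")
```

Under these assumptions the rocket must be close to 90 percent propellant by mass at launch, which is why the payload of a chemical rocket is such a small fraction of the whole vehicle.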
If we want spaceflight to become as routine for humans as aviation has, we’ll need major innovations. Are any just around the corner?
The past few decades have seen enormous progress in synthetic biology – the idea that simple biological parts can be tweaked to do our bidding. One of the main targets has been hacking the biological machinery that nature uses to produce chemicals. The hope is that, once we understand enough, we might be able to design processes that convert cheap feedstock, such as sugar and amino acids, into drugs or fuels. These production lines can then be installed into microbes, effectively turning living cells into factories.
Taking a leap in that direction, researchers from Stanford University have created a version of baker’s yeast (Saccharomyces cerevisiae) that contains genetic material of the opium poppy (Papaver somniferum), bringing the morphine microbial factory one step closer to reality. These results, published in the journal Nature Chemical Biology, represent a significant scientific success, but eliminating the need to grow poppies may still be years away.
The collective space vision of all the world’s countries at the moment seems to be Mars, Mars, Mars. The U.S. has two operational rovers on the planet; a NASA probe called MAVEN and an Indian Mars orbiter will both arrive in Mars orbit later this month; and European, Chinese and additional NASA missions are in the works. Meanwhile Mars One is in the process of selecting candidates for the first-ever Martian colony, and NASA’s heavy launch vehicle is being developed specifically to launch human missions into deep space, with Mars as one of the prime potential destinations.
But is the Red Planet really the best target for a human colony, or should we look somewhere else? Should we pick a world closer to Earth, namely the moon? Or a world with a surface gravity close to Earth’s, namely Venus?
To explore this issue, let’s be clear about why we’d want an off-world colony in the first place. It’s not because it would be cool to have people on multiple worlds (although it would). It’s not because Earth is becoming overpopulated with humans (although it is). It’s because off-world colonies would improve the chances of human civilization surviving in the event of a planetary disaster on Earth. Examining things from this perspective, let’s consider what an off-world colony would need, and see how those requirements mesh with different locations.
We humans like to think ourselves pretty advanced – and with no other technology-bearing beings to compare ourselves to, our back-patting doesn’t have to take context into account. After all, we harnessed fire, invented stone tools and the wheel, developed agriculture and writing, built cities, and learned to use metals.
Then, a mere few moments ago from the perspective of cosmic time, we advanced even more rapidly, developing telescopes and steam power; discovering gravity and electromagnetism and the forces that hold the nuclei of atoms together.
Meanwhile, the age of electricity was transforming human civilization. You could light up a building at night, speak with somebody in another city, or ride in a vehicle that needed no horse to pull it, and humans were very proud of themselves for achieving all of this. In fact, by the year 1899, purportedly, these developments prompted U.S. patent office commissioner Charles H. Duell to remark, “Everything that can be invented has been invented.”
We really have come a long way from the cave, but how far can we still go? Is there a limit to our technological progress? Put another way, if Duell was dead wrong in the year 1899, might his words be prophetic for the year 2099, or 2199? And what does that mean for humanity’s distant future?
In 1971—16 years after Einstein’s death—the definitive experiment to test Einstein’s relativity was finally carried out. It required not a rocket launch but eight round-the-world plane tickets that cost the United States Naval Observatory, funded by taxpayers, a total of $7,600.
The brainchild of Joseph Hafele (Washington University in St. Louis) and Richard Keating (United States Naval Observatory) was the “Mr. Clocks,” passengers on four round-the-world flights. (Since the Mr. Clocks were quite large, they were required to purchase two tickets per flight. The accompanying humans, however, took up only one seat each as they sat next to their attention-getting companions.)
The Mr. Clocks had all been synchronized with the atomic clock standards at the Naval Observatory before flight. They were, in effect, the “twins” (or quadruplets, in this case) from Einstein’s famous twin paradox, wherein one twin leaves Earth and travels nearly at the speed of light. Upon returning home, the traveling twin finds that she is much younger than her earthbound counterpart.
In fact, a twin traveling at 80 percent the speed of light on a round-trip journey to the Sun’s nearest stellar neighbor, Proxima Centauri, would arrive home fully four years younger than her sister. Although it was impossible to make the Mr. Clocks travel at any decent percentage of the speed of light for such a long time, physicists could get them going at jet speeds—about 300 meters (0.2 mile) per second, or a millionth the speed of light—for a couple of days. In addition, they could get the Mr. Clocks out of Earth’s gravitational pit by about ten kilometers (six miles) relative to sea level. And with the accuracy that the Mr. Clocks were known to be capable of, the time differences should be easy to measure.
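The twin-paradox arithmetic above can be checked directly. This sketch assumes a one-way distance of 4.24 light-years to Proxima Centauri (a standard figure, not stated in the passage); with distances in light-years and speeds as fractions of the speed of light, travel time in years is simply distance divided by speed.

```python
import math

distance_ly = 4.24   # assumed one-way distance to Proxima Centauri, light-years
v = 0.8              # traveling twin's speed as a fraction of c

# Time elapsed on Earth for the round trip
earth_time = 2 * distance_ly / v

# Lorentz factor and the proper time experienced by the traveler
gamma = 1 / math.sqrt(1 - v**2)
traveler_time = earth_time / gamma

print(f"Earth clock:    {earth_time:.2f} years")
print(f"Traveler clock: {traveler_time:.2f} years")
print(f"Age difference: {earth_time - traveler_time:.2f} years")
```

The trip takes 10.6 years by Earth clocks but only about 6.4 years for the traveler, so she returns roughly four years younger than her sister, as the passage says.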
This post originally appeared at The Abstract.
You are not alone. Your body is a collection of microbes, fungi, viruses… and even other animals. In fact, you aren’t even the only animal using your face. Right now, in the general vicinity of your nose, there are at least two species of microscopic mites living in your pores. You would expect scientists to know quite a lot about these animals (given that we share our faces with them), but we don’t.
Here is what we do know: Demodex mites are microscopic arachnids (relatives of spiders and ticks) that live in and on the skin of mammals – including humans. They have been found on every mammal species where we’ve looked for them, except the platypus and its odd egg-laying relatives.
Often mammals appear to host more than one species, with some poor field mouse housing four mite species on its face alone. Generally, these mites live out a benign coexistence with their hosts. But if that fine balance is disrupted, they are known to cause mange amongst our furry friends, and skin ailments like rosacea and blepharitis in humans. Most of us are simply content – if unaware – carriers of these spindly, eight-legged pore-dwellers.
Scientists from NC State, the North Carolina Museum of Natural Sciences, and the California Academy of Sciences have just published a study that uncovers some previously unknown truths regarding these little-known mites – all the while providing a glimpse into even bigger mysteries that have yet to be solved.
We’re getting more stupid. That’s one point made in a recent article in the New Scientist, reporting on a gradual decline in IQs in developed countries such as the UK, Australia and the Netherlands. Such research feeds into a long-held fascination with testing human intelligence. Yet such debates are too focused on IQ as a lifelong trait that can’t be changed. Other research is beginning to show the opposite.
The concept of testing intelligence was first successfully devised by French psychologists in the early 1900s to help describe differences in how well and quickly children learn at school. But it is now frequently used to explain that difference – that we all have a fixed and inherent level of intelligence that limits how fast we can learn.
Defined loosely, intelligence refers to our ability to learn quickly and adapt to new situations. IQ tests measure our vocabulary, our ability to problem-solve, reason logically and so on.
But what many people fail to understand is that if IQ tests measured only our skills at these particular tasks, no one would be interested in our score. The score is interesting only because it is thought to be fixed for life.
Official response to the Ebola outbreak reached new heights today, as the World Health Organization declared the Ebola outbreak a Public Health Emergency of International Concern – a status that allows it to issue recommendations for travel restrictions. “We’re going to see death tolls in numbers that we can’t imagine now,” Ken Isaacs, a vice president at the NGO Samaritan’s Purse, told a congressional hearing yesterday.
The attention on Ebola, and the urgent need for solutions, has focused attention on experimental treatments waiting in the wings – and ignited an ethical debate about whether giving untested drugs to patients is the best course of action.
Based on the most recent official reports, 1,712 people have been infected in the current outbreak. Nearly all of these cases have been in Sierra Leone, Liberia, and Guinea, but another West African country, Nigeria, has reported nine infected people, one of whom died after flying from Liberia. Also, Saudi Arabia reported a likely case after a Saudi man died following a trip to Sierra Leone. And now, the two infected Americans, both stricken with the virus while helping victims in affected areas, have been flown to Atlanta to receive treatment. Their care will take place under special quarantine conditions at Emory University Hospital, where their body fluids will be handled using biosafety level 4 laboratory precautions, in which scientists wear outfits resembling spacesuits.
It’s got lots of the trappings of similar science fiction plotlines, such as TNT’s The Last Ship, the topic of my previous post. On that series a viral pandemic, whose symptom profile looks eerily similar to that of Ebola, has killed off 80 percent of humanity. The fictional virus has managed this because it’s 100 percent contagious, nearly 100 percent fatal, and because the fictional scientists and physicians on the series have insufficient knowledge of the virus and no way to treat or even slow the disease. Such extreme situations facilitate nail-biting drama.
Atlanta, Georgia, prides itself on being a world class city, but in 6,000 years it may be remembered for one thing only: a massive time capsule buried in its midst. The waterproof, airtight, hermetically sealed time capsule, called the Crypt of Civilization, was locked and bolted shut on May 25, 1940 – making it the first ever time capsule in history. Its lofty ideal was to preserve a snapshot of all of civilization up until 1940, with strict orders not to be opened until the year 8113.
The crypt was the brainchild of Oglethorpe University president Thornwell Jacobs, who, like many others of his time, was deeply moved when the tombs of the Egyptian pyramids were opened in the 1920s. But those tombs told us little about Egyptian daily life, and Jacobs decided that future civilizations might want a record of ours. And so he invented the time capsule, which has since been imitated around the world, in capsules ranging from the intimate to the immense. The International Time Capsule Society (ITCS) estimates there are now 10,000-15,000 capsules worldwide. However, most of them are forever lost to humanity, their whereabouts forgotten and their records misplaced over the years.
That makes it all the more remarkable that the Crypt persists, unopened but watched over by the university whose grounds it inhabits. Crafted out of a basement room that once held a swimming pool, the Crypt is twenty feet long and ten feet wide with ten-foot ceilings. It’s set in Appalachian granite bedrock under a stone roof seven feet thick, lined with enamel plates embedded in pitch. The only visible marker of its existence above ground is a tiny x carved in a flagstone outside the university’s Phoebe Hearst Memorial.