As BP’s oil gushed into the Gulf of Mexico week after week last summer, we got accustomed to wildly different estimates for how quickly the oil was leaking and how much entered the gulf. Now, 10 months after the mess began, government and independent scientists have wildly different estimates for how much of the oil remains.
Oceanographer Samantha Joye, speaking at the American Association for the Advancement of Science annual conference in Washington this weekend, revealed the findings of her trips to the Gulf to study the seafloor. In December she dove to areas around the site of BP’s well blowout, finding—and photographing—layers of gunky hydrocarbons. The oil was inches thick in places.
“Magic microbes consumed maybe 10 percent of the total discharge, the rest of it we don’t know,” Joye said, later adding: “there’s a lot of it out there.” [AP]
To explain how so much oil got down to the seafloor, Joye’s team did an experiment when they got back to the lab. Joye put a dab of oil from the BP well into a vial of water taken from nearby in the Gulf, then watched.
After just one day, naturally occurring microbes in the water began growing on the oil. After a week, the cells formed blobs, held together by spit, that were so heavy they began sinking to the bottom of a jar. Two weeks later, large streamers of microbial slime and cells were evident. Brown dots visible inside the mix were emulsified oil. “This is the mechanism that we propose deposited oil to the [Gulf’s] bottom,” Joye said. [Science News]
NASA’s big astrobiology news last week had nothing to do with E.T., of course—the team behind a study in Science announced the discovery of a bacterium that appears to thrive in arsenic and can even use it in place of phosphorus in the backbone of its DNA double helix. But after the big announcement finally happened and squelched the more imaginative rumors, scientists started asking some hard questions about the study online.
Over at Slate, DISCOVER blogger Carl Zimmer rounded up expert critiques from biologists, and many didn’t hold back.
Almost unanimously, they think the NASA scientists have failed to make their case. “It would be really cool if such a bug existed,” said San Diego State University’s Forest Rohwer, a microbiologist who looks for new species of bacteria and viruses in coral reefs. But, he added, “none of the arguments are very convincing on their own.” That was about as positive as the critics could get. “This paper should not have been published,” said Shelley Copley of the University of Colorado. [Slate]
Back in August, a study in Nature attempted to push back the date of human ancestors’ first known tool use by 800,000 years—from 2.6 million years ago to 3.4 million years ago. The evidence was a set of scratches on animal bones, which—according to the scientists behind the study—show evidence that the hominid species Australopithecus afarensis used cutting tools.
Not so fast, some anthropologists say. At the time of the Nature paper, researchers including the scientists behind the 2.6-million-year-old find said the newly found markings could have been caused by other means, including trampling by other animals. Now, in a study appearing in today’s edition of the Proceedings of the National Academy of Sciences, a team of anthropologists makes a full case that the 3.4-million-year-old scratches are not evidence of tool use.
They argue that similar cuts can be produced when bones are gnawed by animals, trampled into rough ground, or even eroded by plants and fungi. Their conclusion: the marks on the Dikika bones were probably created by trampling and their age is uncertain. To them, the best evidence for butchery by human ancestors comes from stone tools recovered in Gona, Ethiopia, which are just 2.6 million years old.
Check out the rest of this post at Not Exactly Rocket Science.
Not Exactly Rocket Science: Human ancestors carved meat with stone tools almost a million years earlier than expected
80beats: Lucy’s Species May Have Used Stone Tools 3.4 Million Years Ago
DISCOVER: How Loyal Was Lucy?
A group of Swiss astronomers announced yesterday at the International Astronomical Union’s annual meeting in Turin, Italy, that they couldn’t detect the “goldilocks” exoplanet found by U.S. researchers a few weeks ago. News of that planet, dubbed Gliese 581g, had generated much excitement, since researchers said it was only about three times the mass of Earth, and it appeared to lie in the habitable zone where liquid water could exist on the surface.
It didn’t take long for cold water to be thrown on the excitement of the astronomical community and the space-loving public. Presenter Francesco Pepe and his colleagues claim that it will be years before the data are clear enough to confirm or rule out such a planet.
“We do not see any evidence for a fifth planet … as announced by Vogt et al.,” Pepe wrote Science in an e-mail from the meeting. On the other hand, “we can’t prove there is no fifth planet.” No one yet has the required precision in their observations to prove the absence of such a small exoplanet, he notes. [ScienceNOW].
Such small planets are very hard to find. Astronomers detect them by measuring how a planet’s gravity tugs on the star it orbits, making the star wiggle ever so slightly. The American team that identified the planet a few weeks ago saw the wiggles when analyzing a combination of two data sets.
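To get a feel for why such a planet is so hard to confirm, here is a minimal sketch of the standard semi-amplitude formula for that stellar wiggle, assuming a circular orbit and an edge-on view; the planet and star parameters are approximate published values for Gliese 581g, used purely for illustration.

```python
# Rough scale of the radial-velocity "wiggle" a planet induces on its star.
# Standard semi-amplitude formula (circular orbit, sin i = 1); the planet's
# mass is assumed negligible next to the star's.
M_EARTH_IN_MJUP = 1 / 317.8  # Earth masses per Jupiter mass

def semi_amplitude(m_planet_earth, m_star_sun, period_years):
    """Stellar wobble K in m/s."""
    m_p_jup = m_planet_earth * M_EARTH_IN_MJUP
    return 28.4329 * m_p_jup * m_star_sun ** (-2 / 3) * period_years ** (-1 / 3)

# ~3 Earth masses, ~0.31-solar-mass red dwarf, ~36.6-day orbit:
k = semi_amplitude(3.1, 0.31, 36.6 / 365.25)
print(round(k, 2), "m/s")  # on the order of 1 m/s
```

A wobble of roughly a meter per second is near the limit of current spectrographs, which is why two teams with different data sets can reasonably disagree about whether the signal is real.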
Astronomer Paul Butler, a member of the U.S. team who is at the Carnegie Institution for Science in Washington, D.C., says he can’t comment on the Swiss work because he wasn’t at the meeting and the data are unpublished. He notes, however, that more observations will likely be needed to solidify the existence of Gliese 581g. “I would expect that on the time scale of a year or two this should be settled.” [ScienceNOW].
There will be more information available when the Swiss team releases its data and methods, but for now you might want to unpack your bags.
Bad Astronomy: Possible earthlike planet found in the Goldilocks zone of a nearby star!
Discoblog: So, How Long Would It Take to Travel to That Exciting New Exoplanet?
80beats: New Telescope Could Reveal a Milky Way Packed With Habitable Planets
Bad Astronomy: HUGE NEWS: first possibly Earthlike extrasolar planet found!
80beats: Don’t Pack Your Bags Yet—New Planet-Finder Hobbled by Electronic Glitch
“This is probably going to wind up being the first salvo in a pretty significant debate.” That’s what political scientist Cullen Hendrix told New Scientist in November of last year, when a study came out proclaiming that climate change would spur an uptick in civil wars in Africa. He was correct. This week, another study, in press at the same journal—Proceedings of the National Academy of Sciences—says there is no proof to back up such a connection.
The argument for a link between global warming and war came from UC-Berkeley economist Marshall Burke, who said that food shortages and drought brought on by climate change could cause 50 percent more armed conflict by 2030 under the scenarios that climate models predict. However, Norwegian political scientist Halvard Buhaug looked at sub-Saharan civil war over the last half century for this week’s study. When he compared the records of military conflict with the records of temperature and rainfall, he saw no correlation between the two.
[Buhaug] found that there was a strong correlation between civil wars and traditional factors, such as economic disparity, ethnic tensions, and historic political and economic instability. [BBC News]
When it comes to explaining why the woolly mammoths died out, “death from above” could be down for the count.
Nearly 13,000 years ago, North American megafauna like the mammoths and giant sloths—and even human groups like the people of the Clovis culture—disappeared as the climate entered a cold snap. As DISCOVER has noted before, there’s been a controversial hypothesis bubbling up saying that a comet impact caused it all, but other scientists have been shooting holes in that idea over the last couple of years. In a study in this week’s Proceedings of the National Academy of Sciences, a team led by Tyrone Daulton pooh-poohs what may be the last major piece of evidence supporting the impact idea.
That evidence takes the shape of nanodiamonds in ancient sediment layers, a material said to form only during impacts.
The sun is breaking the known rules of physics—so said headlines that made the rounds of the Web this week.
That claim came from a press release about a new study by researchers Jere Jenkins and Ephraim Fischbach of Purdue, and Peter Sturrock of Stanford. The work suggests that the rates of radioactive decay in isotopes—thought to be constant, and used to date archaeological objects—could vary oh-so-slightly, and that interaction with neutrinos from the sun could be the cause. Neutrinos are those neutral particles that pass through matter and rarely interact with it; trillions of neutrinos are thought to pass through your body every second.
In the release itself, the researchers say that it’s a wild idea: “‘It doesn’t make sense according to conventional ideas,’ Fischbach said. Jenkins whimsically added, ‘What we’re suggesting is that something that doesn’t really interact with anything is changing something that can’t be changed.'”
Could it possibly be true? I consulted with Gregory Sullivan, professor and associate chair of physics at the University of Maryland who formerly did some of his neutrino research at the Super-Kamiokande detector in Japan, and with physicist Eric Adelberger of the University of Washington.
We can predict your chances of living exceptionally long, with 77 percent accuracy, by looking at 150 tiny genetic variants. That’s what researchers claimed in a Science paper that we described last week. Those predictive powers have left some feeling a little uneasy–and not just about what futures are buried in their genomes. Where the paper’s authors saw correlations, some experts are now seeing errors from DNA testing chips.
No DNA chip is perfect; it can get things wrong as it sorts through hundreds of thousands of genetic variants. In fact, certain chips might even make the same error repeatedly. That could cause problems, because what looks like a genetic variant common to a group of people could instead just be an echoed flaw in one chip’s testing capabilities.
Newsweek, which broke this story, reports that the Boston University researchers who led the study did, in fact, use different chips, but not enough of them to rule out this potential error. They used two types of DNA chip to test the centenarian group (about 1,000 people whose ages ranged from 95 to 119): a 370 chip that examines 370,000 genetic variants and a 610-Quad that examines 610,000 variants. The control group (about 1,200 younger people) was tested with those two chips and a few others, so an error shared by the centenarians’ chips might not show up in the controls and could masquerade as a longevity-linked variant.
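A toy simulation makes the worry concrete. Everything below is invented for illustration (the allele frequency, the error rate, the sample sizes loosely echo the study's group sizes); it is not the BU team's data or method, just a sketch of how a chip-specific calling error can look like a trait association.

```python
import random

random.seed(0)

TRUE_FREQ = 0.10      # real carrier frequency in everyone (hypothetical)
CHIP_A_ERROR = 0.05   # chip A miscalls an extra 5% of samples (hypothetical)

def genotype(n, on_chip_a):
    """Simulate variant calls for n people, with an optional chip artifact."""
    calls = []
    for _ in range(n):
        carrier = random.random() < TRUE_FREQ
        if on_chip_a and random.random() < CHIP_A_ERROR:
            carrier = True  # systematic false positive on chip A
        calls.append(carrier)
    return calls

cases = genotype(1000, on_chip_a=True)      # "centenarians," chip A only
controls = genotype(1200, on_chip_a=False)  # "controls," other chips

case_freq = sum(cases) / len(cases)
control_freq = sum(controls) / len(controls)
print(case_freq, control_freq)  # the cases look enriched for the "variant"
```

The variant is equally common in both groups by construction; the apparent enrichment in cases comes entirely from the chip artifact, which is exactly the confound the critics are raising.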
What struck down ancient Egypt’s King Tutankhamen at the tender age of 19?
Just this winter, Egyptian researchers seemed to think they had a definitive answer. After years of genetic tests and CT scans, they concluded that royal incest had produced a sickly boy with a bone disorder, and argued that a malaria-bearing parasite finished him off. But now a team of German researchers is arguing that the observations actually point to death from the inherited blood disorder sickle cell disease (SCD).
People with SCD carry a mutation in the gene for haemoglobin which causes their red blood cells to become rigid and sickle-shaped. A single copy of the sickle-cell gene confers increased immunity to malaria, so it tends to be common in areas where the infection is endemic – such as ancient Egypt. People with two copies of the gene suffer severe anaemia and often die young. [New Scientist]
It makes for a good movie: 12,900 years ago, a comet slams into Earth, igniting forest fires across North America and plunging the planet into a thousand-year cold spell, killing off mammoths, giant sloths, and a bunch of other big mammals. But scientists have fiercely debated whether such a movie, about the cause of the planet-wide cooling period called the Younger Dryas, should be a documentary or science fiction. According to a paper recently published in Geophysical Research Letters, new evidence–or rather, refuted old evidence–points to science fiction.
Those who think a comet hit the planet cite “carbonaceous spherules” and nanodiamonds found in sediment from the period of the suspected impact. They argue that these particles formed in the intense heat of the collision.
The lead author of the new study, Andrew Scott of the University of London in Egham, suspects those spherules are not from a comet collision but are bug poop, fungal spores, or charcoal pellets.
Using a test that measures how much light the spherules reflect, Scott’s team determined that the spherules were slow-roasted at low temperatures (perhaps in natural wildfires) rather than in the intense heat of a comet impact. As shown in the figure, the researchers compared the charred spherules to fungal sclerotia, emergency balls of cells created by stressed fungi that can germinate after a bad growing period is over, and saw a striking similarity.
Some of the more elongate particles are “certainly fecal pellets, probably from termites,” says Scott…. “There’s certainly no evidence [that any of these particles are] related to intense fire from a comet impact,” says Scott. Part of the problem, he says, is that “there was nobody [among impact proponents] who ever worked on charcoal deposits, modern or ancient. If you’re not familiar with the material, you can make mistakes.” [Science Now]