Human beings are the peak of evolution, right? Our advanced brains allow us to poke one another on Facebook, send rockets to the moon, and order complex drinks at Starbucks. We can even fall in love. How are we able to do all of that? NPR’s Science Podcast has been doing a running series, “The Human Edge,” in which they discuss the various traits that make us, well, human. NPR’s Jon Hamilton tackled brain evolution and how we humans still carry parts of other ancestral animal brains within us. Feel that pebble in your shoe? Thank a jellyfish. Ever duck before a rogue Frisbee collides with your noggin? Thank a lizard. Remember where you left your keys? Thank a mouse.
Hamilton interviewed David Linden of Johns Hopkins University, who explained that our brain is the way it is because evolution is “the ultimate tinkerer and cheapskate.” Evolution built our brain by taking simpler brains and just piling more brains on top, like adding scoops of ice cream to an ice cream cone. Hence, the pieces of our brain inherited from these other creatures are largely unchanged. The result is that our advanced, intricate gray matter is spectacularly inefficient, poorly designed, and ill-suited to many of our daily needs. On the flip side, evolution’s Frankensteinian cobbling together of various animal brains is precisely why human beings can experience higher emotions like love. Our ice-cream-cone brain has created some of our best, and most uniquely human, attributes. After the jump is an illustrated guide to how the forces of natural selection shaped your mighty mind from distant relatives of jellyfish, lizards, and mice.
Climate scientists predict an above-average number of hurricanes for 2010 (so far we’re well below normal, but hurricane season isn’t over). Hurricanes, with their 75+ mile-per-hour winds, torrential rains, and associated tornado activity, are frightening. For Earth.
The recent storms that have brought such devastating floods to China and Iowa, as well as storms depicted in cinema like Twister, The Perfect Storm, and The Day After Tomorrow (all based upon real events; well, two out of three anyway), reveal Nature’s fury at its full force, right? Absolutely! For Earth.
There are, however, places in the Solar System where Earth’s most violent maelstroms would be considered puny, and where Earth’s most violent winds would rate as a gentle breeze. The Great Red Spot of Jupiter, for example, is a hurricane-like storm roughly two and a half times the size of Earth that has been raging with winds up to 400 mph, and has been observed since at least the 1800s (with possible sightings dating back to the 1600s). In recent years, Jupiter has developed a second red spot (nicknamed “Little Red”) that began as a “perfect storm” when three jovian storms collided back in 2000, and turned red in 2006.
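To put those wind speeds in perspective: the force a wind exerts on a surface (its dynamic pressure) scales with the square of the wind speed, so the gap between a minimal 75 mph hurricane and a 400 mph jovian gust is far bigger than the raw numbers suggest. A back-of-envelope comparison, using the speeds quoted above and assuming equal air density (itself a big simplification for Jupiter):

```python
# Rough comparison of Earth-hurricane winds vs. Great Red Spot winds.
# Dynamic pressure q = 0.5 * rho * v**2, so at equal density the force
# on a surface scales with the square of the wind speed.
earth_hurricane_mph = 75    # minimal hurricane threshold, from the text
great_red_spot_mph = 400    # peak winds quoted for Jupiter's Great Red Spot

pressure_ratio = (great_red_spot_mph / earth_hurricane_mph) ** 2
print(f"A {great_red_spot_mph} mph wind exerts roughly {pressure_ratio:.0f}x "
      f"the dynamic pressure of a {earth_hurricane_mph} mph wind.")
```

That works out to nearly thirty times the punch, before even accounting for the storm being wider than our whole planet.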
The jovian planets of the outer Solar System are where one can truly view the full force of Nature’s climatic fury. Recent observations of Saturn reveal superstorms (at right) and mega lightning bolts (even a giant blizzard) that put the terrestrial equivalents to shame.
As we explore planets in other star systems, particularly the “hot Jupiters,” we may find superstorms in their atmospheres so huge and violent that they make those on the jovian planets of the Solar System seem as puny in comparison as the storms of Earth are relative to those on, say, Saturn.
Think of the most complicated thing you’ve written. Maybe it was a report for your employer, or an essay in college. It could even be a computer program. Whatever it was, think of all the stuff you packed into it. Now, pause for a moment to imagine creating all that without a word processor, or paper and pen, or really anything at all to externalize thought to something outside of your head. It seems impossible. What we get with this technology, ancient as it is, is an amplification of our brain power. Besides their gorgeous techy looks, do interactive holographics like those shown in Iron Man 2, reminiscent of the interfaces in Minority Report, offer up some of the same brain amping?
Science fiction movies and TV shows are perpetually trying to see through things: everyone from Superman to last year’s KITT reboot has used some method or other to see through walls and clothing. Since we already live in the future, see-through technology exists in myriad forms, not the least of which is airport full-body scanning. These scanners are so good at seeing past clothing that they might violate child pornography laws in the United Kingdom. So now we’re in the position of trying to find ways to make see-through-stuff technology worse.
Enter non-ionizing terahertz-frequency radiation. The terahertz range sits betwixt the infrared and microwave bands of the electromagnetic spectrum. Pretty much everything on the planet emits it, and different objects emit different frequencies. Without any need for an emitter, a receiver could be designed to take pictures in the terahertz range. The images wouldn’t have sharp lines, since terahertz radiation has a short range and the emissions vary depending on the object; people would appear as hazy silhouettes. The radiation passes through wood, ceramics, cloth, and paper, but not metal or water.
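For a sense of where this band actually sits, the terahertz range is commonly quoted as roughly 0.1–10 THz (the exact boundaries vary by source; these figures are an assumption for illustration). Converting frequency to wavelength with λ = c / f shows why it lands between microwaves (millimeter wavelengths) and infrared (tens of micrometers):

```python
# Free-space wavelengths for the terahertz band, via lambda = c / f.
# The 0.1-10 THz boundaries are commonly cited approximations, not hard limits.
C = 3.0e8  # speed of light, m/s

def wavelength_mm(freq_thz):
    """Return the free-space wavelength in millimeters for a frequency in THz."""
    return C / (freq_thz * 1e12) * 1e3  # meters -> millimeters

low_edge = wavelength_mm(0.1)   # microwave-like end of the band
high_edge = wavelength_mm(10)   # infrared-like end of the band
print(f"0.1 THz -> {low_edge:.2f} mm; 10 THz -> {high_edge:.3f} mm")
```

So the band spans wavelengths from about 3 mm down to 0.03 mm (30 micrometers), which is exactly the gap between the microwave and infrared regions the text describes.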
In a short-range situation, like an airport security scanner, a receiver could be installed to watch for the pattern of terahertz radiation. A person’s silhouette would show up fine, but a metal knife or handgun would appear as a black outline on the screen. There are already two companies with equipment like this ready to sell, and at least one CEO claims the technology can be tuned to pick up radiation from drugs or other contraband a person might be carrying.
Not only would the new technology be safer and avoid privacy concerns, it might also make an airport security guard’s job a little better.
Meeting the press during a recent visit to Tokyo, NASA Astronaut Alan Poindexter — Commander of recent Discovery ISS resupply mission STS-131 — was asked if there had been sex in space. His reply was succinct and left no room for ambiguity (though this photo does look pretty chummy):
We are a group of professionals. We treat each other with respect and we have a great working relationship. Personal relationships are not … an issue. We don’t have them and we won’t.
Hang on a second. I’m not sure that the concepts of “sex in space” and “professional” are mutually exclusive. I’m sure, given what we’ve learned about human physiology because of spaceflight, that there are any number of cardiologists, internists, endocrinologists, OB/GYNs, and a whole host of other health-care professionals and researchers who would love to have physiological data taken from a couple before, during, and after a union in a microgravity environment. These researchers would be the Masters and Johnsons, Kinseys, and perhaps even the Shere Hites of their time.
The neurons of a patient suffering from Alzheimer’s.
You may not be consciously aware of it, but at any given time your brain is playing host to billions of simultaneous conversations (and no, I’m not talking about those voices). I speak, of course, of the conversations between your neurons—the incessant neural jabbering that makes it possible for you to move your limbs, learn, remember, and feel pain. Every time we experience a new sensation or form a memory, millions of electrical and chemical signals are propagated across dense networks of axons and jump from one synapse to the next, building new neuronal connections or strengthening existing ones. And they are constantly changing—forming and reforming associations with other neurons in response to how the brain perceives and processes new bits of information.
Despite being central to our understanding of how the brain functions, these neural chats remain largely a mystery to scientists. What exactly are the individual neurons “saying” to each other? And how do these electrical and chemical “messages” become translated into actions, memories, or a range of other complex behaviors? To help decipher these discussions, a team of researchers from the University of Calgary led by bioengineer Naweed Syed has built a silicon microchip embedded with large networks of brain cells. The idea is to get the brain cells to “talk” to the millimeter-square chip—and then have the chip talk to the scientists through a computer interface.
The concept of the Dattoo arose in response to current trends toward increasing connectivity and technology as self-expression. Realizing a state of constant, seamless connectivity and computability requires the convergence of technology and self: the body would need to literally become the interface. Computers and communication devices require physical space, surfaces, and energy. The idea of DNA tattoos (Dattoos) is to use the body itself as the hardware and interaction platform, through the use of minimally invasive, recyclable materials.
The picture reminds me of the Buzz Lightyear/Turanga Leela-style forearm computer. That seems like a pretty practical place to put a Dattoo. I have a few other ideas:
“Who wants to live forever?” Freddie Mercury asks on behalf of the Highlander. Michio Kaku (whom you should be reading because he’s wonderful) has started a two-part investigation over at Big Think on just that query. The cliché question comes from the basic problem of living a long time: no one wants to die, but no one wants to get old either. Pulitzer Prize winner Jonathan Weiner‘s new book Long For This World examines the science and scientists of gerontology, the study of aging. Stanford University professor of internal medicine Abraham Verghese reviewed Long For This World in The New York Times and was inspired by Weiner’s discussion of longevity. Verghese reflects on his own experience with terminally ill patients:
As a young physician caught up in the early years of the H.I.V. epidemic, I was struck by my patients’ will to live, even as their quality of life became miserable and when loved ones and caregivers would urge the patient to let go. I thought it remarkable that patients never asked me to help end their lives (and found it strange that Dr. Kevorkian managed to encounter so many who did). My patients were dying young and felt cheated out of their best years. They did not want immortality, just the chance to live the life span that their peers could expect. What de Grey and other immortalists seem to have lost sight of is that simply living a full life span is a laudable goal. Partial success in extending life might simply extend the years of infirmity and suffering — something that to some degree is already happening in the West.
I cannot get over the logic Verghese displays here. He notes the will of people to live in spite of suffering and lowered quality of life. The patients merely wanted “the chance to live the life span that their peers could expect.” Does he mean the life span that science and civilization have already artificially extended fifty years beyond biological design? How does one differentiate between a 30-year-old who wants to be healthy enough to live to 50 and a 90-year-old who wants to be healthy enough to live past 100? Verghese is unable to reconcile the desire to live with a terminally low quality of life. The goal of anti-aging is not simply to increase the number of years a person spends alive; instead, the goal is to make every year, even into mid and late life, as healthy and youthful as possible.
Science fiction without science is merely fiction. There are gray levels in how well the science is portrayed in television and cinema, however. For the third straight year, Discover Magazine and the National Academy of Sciences’ Science and Entertainment Exchange hosted a science-of-science-fiction panel at San Diego Comic-Con, and this year’s theme was “Abusing Science in Science Fiction.” Each panelist provided two video clips from sci-fi television or cinema: one of science done right, and one where the science, well, wasn’t done right.
I’ve always maintained that in science fiction TV and cinema, good science should be jettisoned in deference to drama only as a last resort, and then only when you have all your other ducks in a row. If the science is solid in the bulk of your work, we’ll make the leap with you when you get a bit more… speculative. Some works stick to grounded science well; some do not.
Therefore, for my clips, I chose two instances of the same type of event, the impact of a comet or asteroid with Earth: one done well (Deep Impact), and one that could have been done better (Armageddon).
Christopher Nolan’s Inception is a film about a time when we have the power to enter into each other’s dreams, and actively steer the dream’s course to implant an idea in the dreamer.
The film raises the issue of how much we understand about the neuroscience of dreams. Due to its need for invasive experiments, neuroscience typically works with non-human animals, which raises a significant difficulty: how do you know that a rat is dreaming? You can’t wake it up from REM sleep and ask. (Well, you can, but don’t expect a cogent response.) There’s no accepted objective indicator that a person or animal is having a dream, as opposed to sleeping dreamlessly. But we can still learn something useful by looking at the neuroscience of sleep.