Is a Little Radiation Good For You? Trump Admin Steps Into Shaky Science

By Nathaniel Scharping | October 5, 2018 4:40 pm

A technician scans for radiation with a Geiger counter. (Credit: PRESSLAB/Shutterstock)

For decades, studies have shown that even low doses of radiation are harmful to humans.

This week, the Associated Press reported that the Trump administration may be reconsidering that. The Environmental Protection Agency seemed to be looking at raising the levels of radiation considered dangerous to humans based on a controversial theory rejected by mainstream scientists. The theory suggests that a little radiation might actually be good for our bodies. In April, an EPA press release announced the proposal and included supporting comments from a vocal proponent of the hypothesis, known as hormesis. It prompted critical opinion pieces and sparked worry among radiation safety advocates.

Those comments back in April were made by Edward Calabrese, a toxicologist at the University of Massachusetts, Amherst, who also testified before Congress on the issue this week. And in the initial release, Calabrese hailed the EPA’s decision to move away from the radiation dose model widely accepted by the scientific mainstream. But by Friday, the EPA backed away from Calabrese’s stance in comments to Discover.


How We Know Ancient Humans Believed In the Afterlife

By Bridget Alex | October 5, 2018 3:11 pm

A burial at Sungir covered in beads. (Credit: José-Manuel Benito Álvarez/Wikimedia Commons)

Some 34,000 years ago, two boys and a middle-aged man were buried in fantastic style. They were laid to rest wearing over 13,000 mammoth ivory beads, hundreds of perforated fox canine teeth and other adornments. Discovered in the 1960s at the site of Sungir, Russia, the burials also contained spears, figurines and the hollowed-out shaft of a woman’s femur, packed with red ochre. Archaeologists estimate the ivory beads alone would have taken 2,500 hours of labor to produce.

We’ll never know what particular beliefs these ancient people held. But such elaborate, time-intensive burials strongly suggest they conceived of an afterlife and spiritual forces. It’s something that we see today in nearly every culture on Earth, and it’s usually tied to religious practices and beliefs. In fact, archaeologists often use graves like these as a rough marker for the emergence of religion in human societies.


Could Big Data Unlock the Potential of Predictive Policing?

By Vijay Shankar Balakrishnan | October 2, 2018 12:00 pm

(Credit: Peter Kim/shutterstock)

It’s hard to imagine a nation without an organized police force, but in truth, it’s a fairly modern invention. Crime was once handled locally, often by volunteers and by the will of the ruling power, and it was only in 1829 that the first large-scale, professional force came to be — London’s Metropolitan Police Service, created by then-Home Secretary Sir Robert Peel.

These police, nicknamed “peelers” or “bobbies” after their creator, wore uniforms selected to make them look more like citizens than soldiers, followed clear guiding principles and strove not only to fight crimes but also to prevent them from happening. The US followed suit less than two decades later, when the nation’s first metropolitan police department was created in New York City in 1844, based on the London model.

Law enforcement has evolved a lot since then, of course. And in recent decades, information technology has emerged as a significant player in policing. The September 11 attacks in 2001 led to a radical modernization in American policing that included the advent of so-called Big Data — analysis of large datasets to discover hidden patterns.

Knowable asked criminologist and statistician Greg Ridgeway of the University of Pennsylvania how computers — Big Data in particular — are changing policing in the US. Ridgeway is the author of a 2017 article on the topic in the Annual Review of Criminology. This conversation has been edited for length and clarity.


Criminologist and statistician Greg Ridgeway of the University of Pennsylvania. (Credit: JAMES PROVOST)



Nutrition Science Has a Credibility Problem. We Might Blame Gut Microbes

By Anna Groves | October 1, 2018 5:00 pm

(Credit: T. L. Furrer/shutterstock)

After decades of research have yielded nothing but flip-flopping dietary advice sprinkled with the occasional scientific scandal, many health-conscious people are placing their trust elsewhere.

Just walk down any aisle of a Whole Foods to find health claims ranging from the benignly naturopathic to the pseudo-scientific to the outright anti-scientific. It seems people would rather put faith in a fad diet promoted by a blogger, some guy with a book deal, or their cousin’s sister-in-law on Facebook than in advice from doctors and nutrition researchers.

We can’t blame them. Just last month, one of the most prominent researchers in diet and nutrition, Brian Wansink, was removed from his positions at Cornell University after facing allegations of research misconduct. You might remember hearing that your plate size affects how full you feel, or that people duped with a refilling trick bowl eat way more soup than they do when they have to ask for a refill. That guy.


Ultra-Processed Food: It’s Not Just What We Eat, It’s How It’s Made

By Anna Groves | September 28, 2018 5:38 pm

Sticking to whole, unprocessed foods might help you get along better with your gut microbes. (Credit: Niloo/shutterstock)

In a time when ephemeral diet advice bombards us from every Instagrammer and morning show, it’s tempting to ignore the latest scientific report claiming to have a helpful idea about obesity.

In a new review, researchers suggest that the consumption of ultra-processed foods could cause obesity and related health problems because of the way the foods feed our gut microbes.

This isn’t the first time we’ve heard about processed foods, nor the first time we’ve heard about gut microbes. There’s research both supporting and disputing the impact of each on our health.

But rather than try to home in on a single component of the obesity-prone Western diet – carbohydrates, or fats, or overconsumption – the new report takes aim at a bigger picture, formulating a hypothesis that allows for messy contributions from “all of the above.” The mechanism that underlies it all, they say, is the bacteria that live in our gut, helping us digest the food we eat.

The result is a review paper that makes as much of a statement about nutrition research as it does about processed foods.

Ultra-processed Foods

Many ultra-processed foods fall into the traditional “junk food” category – candy, for example. But the scientific term has more to do with how the foods are made.

There’s a difference between “processed” and “ultra-processed” food, though where to draw the line is still a topic of debate. Just salting cocktail peanuts earns them a “processed” label. But “ultra-processed” is reserved for the worst offenders: foods where almost all ingredients are refined. If it has ingredients like hydrogenated oils, preservatives, emulsifiers, and natural and artificial flavorings, it’s a good hint that it’s “ultra” processed.

To make ultra-processed foods, materials are broken down into their component parts and the pieces reassembled with added sugars, salts and fats to maximize shelf life and taste while minimizing costs.

They’re things like cookies, breakfast cereal, pre-prepared frozen meals, packaged breads, and sugar-sweetened beverages.

Though the best way to scientifically classify foods by their degree of processing is an ongoing debate, the general idea seems clear enough.

Inge Lindseth, a clinical nutrition physiologist at the Balder Clinic in Oslo, and his colleague Marit Zinöcker at Bjørknes University College propose that the way these foods are made could be what makes them feed the gut microbiota differently – for better or for worse.

Actually, just worse.

Let Us Count the Ways

“Eating pizza or a rice dish isn’t that different to a hungry human being, provided the energy content is the same in each meal,” says Lindseth. “But to the microbiota it could mean a world of a difference.”

Imagine a rice and vegetable stir-fry. These main ingredients have their original plant cells structurally intact.

But what if you ground the rice into fine flour and made rice noodles, and topped it with sauce constructed with natural vegetable flavor? Now there are no whole cells left, only the acellular compounds that were once in the foods. This makes the nutrients in the dish, in a way, already partly digested.

For bacteria living in the digestive tract, this means nutrients are available for them to eat sooner and in greater quantities since they don’t have to break down any cell walls or membranes. A constant influx of insta-nutrients could trigger an increase in the bacteria’s growth, an expansion of their territory, a change in their composition, or a change in their behavior – like what they eat or what by-products they produce.

This potential gut disaster could be exacerbated further by an overload of sugar or carbohydrates (which are, of course, both also common in ultra-processed foods). Some studies have shown that if we exceed the sugar-uptake potential of our small intestines, what’s left over can also create a harmful breeding ground for microbes.

There’s more: from certain food additives like emulsifiers, to fat content, to lack of dietary fiber – there’s a laundry list of ways ultra-processed foods could mess with gut microbes. If any of these hypotheses turns out to be unsupported after further research, a dozen more possibilities are waiting in the ranks.

Gut Reaction

Yet the impact of the gut microbiome on obesity and metabolic health is still debated.

A 2016 review wove together evidence from 94 scientific publications to explain how gut microbes shape health. The team concluded that the evidence suggests gut microbes are a strong contributing factor in obesity and metabolic diseases, particularly when the composition of the microbiota leads to inflammation in the gut.

But others disagree, suggesting that microbes might matter a little for obesity, but that their effect is small compared to factors like total calorie consumption. One of these critics is Pat Schloss, a professor in the department of microbiology and immunology at the University of Michigan Medical School.

Schloss also reviewed the microbiome-obesity literature and found that not only are the effects small, but the immense variation in microbial communities between people makes these questions difficult to study without enormous sample sizes.

“When any field is brand new there’s tons of enthusiasm, and then there’s maybe a bit of course correction to say, okay, what do we really think about this,” says Schloss. “I think there’s something there, but we do need to be careful about what we say about what’s linked and what’s not linked.”

Seriously, Though, Go Eat Some Whole Foods

Zinöcker and Lindseth conclude their report with a refreshingly honest message: While we work on this gut microbe thing, and even if we’re wrong, there’s really already enough evidence to suggest we should all be eating mostly whole, unprocessed foods.

Other researchers agree. Although the reasons why are still debated, the epidemiological links between ultra-processed foods and things like obesity, high cholesterol and hypertension risk are strong.


Chimps Know Death When They See It

By Bridget Alex | September 28, 2018 4:52 pm

Noel, a chimpanzee, used a grass stem to pick debris from the teeth of a dead chimp in a sanctuary in Zambia. (van Leeuwen, Cronin, and Haun; Scientific Reports Volume 7, March 2017)

After Rosie’s mother died, she accompanied the lifeless body throughout the night, in apparent mourning. When Noel lost her adopted son, she picked his teeth clean with a grass stem. And Jire carried her infant’s corpse for 68 days after the one-year-old succumbed to a respiratory infection.

Rosie, Noel and Jire are chimpanzees, whose responses to death were documented by researchers. Their behavior makes one wonder: Do chimps and other animals understand death, or are humans the only species conscious of mortality?

To completely answer this, we’d need to read animal minds. Short of that, scientists try to infer animals’ inner thoughts from their outward behaviors. Based on such observations, here’s what we know, about what chimps know, about death.

How Animals Deal with Death

Some animals treat the corpses of community members in specific ways. For example, in social insects, like ants and termites, the dead are eaten, buried or removed from the colony, depending on the cause and context of the death. Corvid birds, including magpies and crows, gather around their deceased, as if holding a funeral. And mothers have been seen carrying dead infants for hours to weeks in diverse species, like elephants, whales and dingos. But it’s unclear from these behaviors exactly how these animals conceive of death.

A chimpanzee named Jire carried the mummified remains of her dead infant for 68 days after its death in Bossou, Guinea. (Credit: Biro et al; Current Biology Volume 20, Issue 8, April 2010)

If any creature gets death, it may be chimpanzees. The two species, common chimpanzees (Pan troglodytes) and bonobos (Pan paniscus), are our closest evolutionary relatives, sharing ancestors with humans roughly 8 million years ago. And some of the cognitive abilities humans use to understand death might have existed in this common ancestor and been retained through both human and chimp evolution.

Back in 1973, anthropologist Geza Teleki published the first account of wild chimps reacting to an accidental death. He recounted the aftermath of a chimp’s fatal fall from a tree in Tanzania. The 16 group members present erupted into “raucous calling … slapping and stamping the ground, tearing and dragging vegetation, and throwing large stones.” Individuals frantically embraced one another (and two copulated), before the group eventually calmed and abandoned the scene.

Since then, more reports of chimp behaviors surrounding death have slowly accumulated, especially in the past decade. “Almost every new observation can help us to build some kind of picture of what’s going on,” says James Anderson, a psychologist who has worked to establish Pan thanatology, the scientific study of death among chimps.


Chimps inspect the corpse of one of their group members (Credit: van Leeuwen et al; American Journal of Primatology Volume 78, 2016)

The compiled observations reveal a variety of behaviors. When witnessing the death of their own kind, chimps have responded with everything from agitated frenzy to quiet contemplation, embodied by watching, sniffing and prodding the corpse. In some cases, chimps have tenderly groomed the bodies or covered them with plants. In other cases they’ve attacked and cannibalized them. One bonobo mom groomed her dead infant immediately before eating it with other members of the group. As primatologists Watson and Matsuzawa put it, “the juxtaposition of care and cannibalism is puzzling.” Indeed.

Thoughts Behind the Behaviors

Considering chimps behave so variably towards death, can we make any generalizations about their understanding of it? Like human children, chimps probably learn about death. But unlike children, who can be told about death, chimps must learn from direct observations. Therefore, chimps’ variable responses are likely due to their differing prior experiences as well as relationships to the deceased.

But there may be a limit to what chimps can learn about death. To assess that limit, Anderson considered four concepts that human kids must grasp in order to understand death: the ideas that dead individuals cannot think or feel (non-functionality), and that this state is permanent (irreversibility), inevitable (universality) and due to a specific cause (causality).

After reviewing behavioral observations, Anderson contends mature and experienced chimps can grasp death’s irreversibility and non-functionality. “I think they are clearly capable of understanding that these dead individuals aren’t going to suddenly come back to life, and they cannot feel or hear or see anything, regardless of what gets done to them,” says Anderson, a professor at Kyoto University, Japan.

Supporting this conclusion is the fact that chimpanzees deliberately kill other animals and other chimps. Furthermore, chimps in Côte d’Ivoire have been seen grooming and licking the wounds of injured, but not dead, companions.

Regarding causality, chimps likely understand violent deaths, especially the ones they inflict on other animals. But there’s no evidence they comprehend invisible causes of death, such as disease and old age.

Universality is the most difficult component to judge from behavior. Anderson makes the distinction between can and will: chimps seem to understand that animals can die, but it’s unclear if they know all animals — including themselves — will die.

“How can we ask them, ‘do you know that every individual is going to die?’” says Anderson.

For better or worse, humans may be the only animals that do know this.


Decades After Bomb-making, the Radioactive Waste Remains Dangerous

By Valerie Brown | September 28, 2018 3:34 pm

An atmospheric nuclear test carried out on April 18, 1953. Such bombs made use of plutonium-239, and the government is still trying to figure out what to do with the waste created by its production. (Credit: National Nuclear Security Administration/Nevada Site Office)

(Inside Science) — Nearly 30 years ago, the state of Washington and two federal agencies agreed to clean up the Hanford Nuclear Reservation, a 586-square-mile chunk of sagebrush desert where the U.S. produced plutonium for nuclear weapons starting 75 years ago. In the process, half a trillion gallons of chemically toxic and radioactive waste were dumped on the ground or injected into groundwater. Some of it has reached the Columbia River. Another 56 million gallons of concentrated, radioactive sludge and crystallized salts sit corroding within 177 steel-and-concrete underground tanks.

Although the tank waste is only a fraction of the total, its safe disposal is one of the site’s most urgent priorities, especially to the policymakers and residents of Washington and Oregon. Eighteen years ago, workers began constructing a plant for “immobilizing” the remaining waste by vitrifying it — a process whereby it is mixed with molten glass, cooled and encased in stainless steel canisters for long-term storage underground in an as yet undesignated location.


Searching for Chocolate’s Roots, and Enemies, in Colombia’s Wilderness

By Lindzi Wessel | September 27, 2018 11:30 am

Cacao farmer Gildardo Ramirez removes a withering pod from one of his trees. He suspects the pod is infected with black pod rot. (Credit: Lindzi Wessel)

With a machete, Gildardo Ramirez lops twelve pods off one of his cacao trees, letting them fall to its base. The long, brown pods look like twisted and deflated footballs. Each cacao pod usually encases about 40 beans — the source of cocoa powder and chocolate. The beans are the main commodity that Ramirez produces on his farm in San Francisco, Colombia, some 70 miles southeast of the city of Medellín. On Ramirez’s land, cacao’s red and green leaves fill the sloping hillside, overlooked by lush green mountains. But these twelve pods will never make chocolate. The healthy white, sweet pulp that normally encases the beans has turned dank and discolored. The pods are diseased — infected, he suspects, with a mold-like attacker called black pod rot. It’s just one of many threats that plague the region’s chocolate farms.



These Spray-on Antennas Could Be the Future of Communication


Spraying an antenna onto a flat surface. (Credit: Drexel University Nanomaterials Lab, CC BY-ND)

A version of this article originally appeared on The Conversation.

Hear the word “antenna” and you might think about rabbit ears on the top of an old TV or the wire that picks up radio signals for a car. But an antenna can be much smaller – even invisible. No matter its shape or size, an antenna is crucial for communication, transmitting and receiving radio signals between devices. As portable electronics become increasingly common, antennas must, too.

Wearable monitors, flexible smart clothes, industrial sensors and medical sensors will be much more effective if their antennas are lightweight and flexible – and possibly even transparent. We and our collaborators have developed a type of material that offers many more options for connecting antennas to devices – including spray-painting them on walls or clothes.


A Grocery List in 1968 Changed Computer History Forever

By Margaret O'Mara, University of Washington | September 24, 2018 5:04 pm

A scene from Doug Engelbart’s groundbreaking 1968 computer demo. (Credit: Doug Engelbart Institute)

A version of this article originally appeared on The Conversation.

On a crisp California afternoon in early December 1968, a square-jawed, mild-mannered Stanford researcher named Douglas Engelbart took the stage at San Francisco’s Civic Auditorium and proceeded to blow everyone’s mind about what computers could do. Sitting down at a keyboard, this computer-age Clark Kent calmly showed a rapt audience of computer engineers how the devices they built could be utterly different kinds of machines – ones that were “alive for you all day,” as he put it, immediately responsive to your input, and which didn’t require users to know programming languages in order to operate.

Engelbart typed simple commands. He edited a grocery list. As he worked, he skipped the computer cursor across the screen using a strange wooden box that fit snugly under his palm, with small wheels underneath and a cord dangling from its rear. Engelbart dubbed it a “mouse.”

The 90-minute presentation went down in Silicon Valley history as the “mother of all demos,” for it previewed a world of personal and online computing utterly different from 1968’s status quo. It wasn’t just the technology that was revelatory; it was the notion that a computer could be something a non-specialist individual user could control from their own desk.


The prototype computer mouse Doug Engelbart used in his demo. (Credit: Michael Hicks)

Shrinking the Massive Machines

In the America of 1968, computers weren’t at all personal. They were refrigerator-sized behemoths that hummed and blinked, calculating everything from consumer habits to missile trajectories, cloistered deep within corporate offices, government agencies and university labs. Their secrets were accessible only via punch card and teletype terminals.

The Vietnam-era counterculture already had made mainframe computers into ominous symbols of a soul-crushing Establishment. Four years before, the student protesters of Berkeley’s Free Speech Movement had pinned signs to their chests that bore a riff on the prim warning that appeared on every IBM punch card: “I am a UC student. Please don’t bend, fold, spindle or mutilate me.”

Earlier in 1968, Stanley Kubrick’s trippy “2001: A Space Odyssey” mined moviegoers’ anxieties about computers run amok with the tale of a malevolent mainframe that seized control of a spaceship from its human astronauts.

Voices rang out on Capitol Hill about the uses and abuses of electronic data-gathering, too. Missouri Senator Ed Long regularly delivered floor speeches he called “Big Brother updates.” North Carolina Senator Sam Ervin declared that mainframe power posed a threat to the freedoms guaranteed by the Constitution. “The computer,” Ervin warned darkly, “never forgets.” As the Johnson administration unveiled plans to centralize government data in a single national database, New Jersey Congressman Cornelius Gallagher declared that it was just another grim step toward scientific thinking taking over modern life, “leaving as an end result a stack of computer cards where once were human beings.”

The zeitgeist of 1968 helps explain why Engelbart’s demo so quickly became a touchstone and inspiration for a new, enduring definition of technological empowerment. Here was a computer that didn’t override human intelligence or stomp out individuality, but instead could, as Engelbart put it, “augment human intellect.”

While Engelbart’s vision of how these tools might be used was rather conventionally corporate – a computer on every office desk and a mouse in every worker’s palm – his overarching notion of an individualized computer environment hit exactly the right note for the anti-Establishment technologists coming of age in 1968, who wanted to make technology personal and information free.

Over the next decade, technologists from this new generation would turn what Engelbart called his “wild dream” into a mass-market reality – and profoundly transform Americans’ relationship to computer technology.

Government Involvement

In the decade after the demo, the crisis of Watergate and revelations of CIA and FBI snooping further seeded distrust in America’s political leadership and in the ability of large government bureaucracies to be responsible stewards of personal information. Economic uncertainty and an antiwar mood slashed public spending on high-tech research and development – the same money that once had paid for so many of those mainframe computers and for training engineers to program them.

Enabled by the miniaturizing technology of the microprocessor, the size and price of computers plummeted, turning them into affordable and soon indispensable tools for work and play. By the 1980s and 1990s, instead of being seen as machines made and controlled by government, computers had become ultimate expressions of free-market capitalism, hailed by business and political leaders alike as examples of what was possible when government got out of the way and let innovation bloom.

Therein lies the great irony in this pivotal turn in American high-tech history. For even though “the mother of all demos” provided inspiration for a personal, entrepreneurial, government-is-dangerous-and-small-is-beautiful computing era, Doug Engelbart’s audacious vision would never have made it to keyboard and mouse without government research funding in the first place.

Engelbart was keenly aware of this, flashing credits up on the screen at the presentation’s start listing those who funded his research team: the Defense Department’s Advanced Research Projects Agency, later known as DARPA; the National Aeronautics and Space Administration; the U.S. Air Force. Only the public sector had the deep pockets, the patience and the tolerance for blue-sky ideas without any immediate commercial application.

Although government funding played a less visible role in the high-tech story after 1968, it continued to function as critical seed capital for next-generation ideas. Marc Andreessen and his fellow graduate students developed their groundbreaking web browser in a government-funded university laboratory. DARPA and NASA money helped fund the graduate research project that Sergey Brin and Larry Page would later commercialize as Google. Driverless car technology got a jump-start after a government-sponsored competition; so have nanotechnology, green tech and more. Government hasn’t gotten out of Silicon Valley’s way; it has remained there all along, quietly funding the next generation of boundary-pushing technology.

Today, public debate rages once again on Capitol Hill about computer-aided invasions of privacy. Hollywood spins apocalyptic tales of technology run amok. Americans spend days staring into screens, tracked by the smartphones in our pockets, hooked on social media. Technology companies are among the biggest and richest in the world. It’s a long way from Engelbart’s humble grocery list.

But perhaps the current moment of high-tech angst can once again gain inspiration from the mother of all demos. Later in life, Engelbart described his life’s work as a quest to “help humanity cope better with complexity and urgency.” His solution was a computer that was remarkably different from the others of that era, one that was humane and personal, that augmented human capability rather than boxing it in. And he was able to bring this vision to life because government agencies funded his work.

Now it’s time for another mind-blowing demo of the possible future, one that moves beyond the current adversarial moment between big government and Big Tech. It could inspire people to enlist public and private resources and minds in crafting the next audacious vision for our digital future.

This article originally appeared on The Conversation. Read the original.
