Black hole born without stellar parent, evidence suggests

A remote galaxy might harbor a type of black hole that arises directly from a massive cloud of gas rather than forming after the death of a star. This rare specimen could explain how some galaxies built gargantuan black holes in the first billion years or so after the Big Bang.

The galaxy, known as CR7, is unusual (SN: 7/25/2015, p. 8). It blasts out more ultraviolet radiation than other galaxies that lived at the same time, roughly 13 billion years ago (about 800 million years after the Big Bang). The gas in CR7 also appears to lack elements such as carbon and oxygen, which are forged within stars and then ejected into space. One idea is that CR7 is giving birth to first-generation stars, similar to the first stars ever created in the universe. Another hypothesis is that CR7 harbors the first known “direct collapse” black hole, one that forms when a blob of interstellar gas collapses under its own weight without first forming stars.
A black hole is more likely, suggest Aaron Smith of the University of Texas at Austin and colleagues in the Aug. 11 Monthly Notices of the Royal Astronomical Society. The researchers developed computer simulations that explore how interstellar gas interacts with the harsh radiation from primordial stars or a large black hole. Smith and colleagues find that the light from a cache of hot, young stars can’t explain why a parcel of gas is racing away from CR7 at about 580,000 kilometers per hour. What can push the gas, they report, is radiation from a superheated disk of debris swirling around a black hole roughly 100,000 times as massive as the sun.

If CR7 does host a black hole, it would be the first evidence of one forming out of clouds that haven’t given birth to stars yet. Astronomers struggle to explain how some supermassive black holes could form in about 1 billion years out of just smaller black holes merging together. “There’s just not enough time to do that,” Smith says. A direct collapse black hole, however, creates a massive seed all in one go, jump-starting the growth of a behemoth that will eventually weigh as much as several billion suns.

“This is definitely a good step forward,” says David Sobral, an astrophysicist at Lancaster University in England who discovered CR7 in 2015. But it’s too early to say whether a black hole or a group of stars is powering CR7, he says. “I’ve tried to stay a bit away from it and argue that what we need is new observations instead of taking sides.”

With the data that are available, it’s hard to distinguish between stars and a black hole, says Sobral. That’s why he and colleagues have reserved time with the Hubble Space Telescope in January and are awaiting new data from the Atacama Large Millimeter/submillimeter Array in Chile. Data from both observatories will help researchers look for traces of heavy elements in CR7. If these more sensitive data still show no sign of atoms such as carbon, says Sobral, then CR7 probably hosts a nest of first-generation stars. A black hole, on the other hand, probably would have formed long enough ago that there would be enough time for stars to form and pollute CR7 with a smidgen of heavy elements, he says.

A growing census of similar locales will help as well. “We’re now finding that CR7 is not alone,” Sobral says. He and his colleagues have since found four other galaxies comparable to CR7 in the early universe, results presented June 27 at the National Astronomy Meeting in Nottingham, England. “We don’t have to discuss one single thing,” he says, “but we can put [CR7] into a broader picture.”

Smart mice have better odds of survival

Pinky and the Brain’s smarts might not be so far-fetched. Some mice are quicker on the uptake than others. While it might not lead to world domination, wits have their upside: a better shot at staying alive.

Biologists Audrey Maille and Carsten Schradin of the University of Strasbourg in France tested reaction time and spatial memory in 90 African striped mice (Rhabdomys pumilio) over the course of a summer. For this particular wild rodent, surviving harsh summer droughts means making it to mating season in the early fall.

The team saw some overall trends: Females were more likely to survive if they had quick reflexes, and males were more likely to survive if they had good spatial memory. Cognitive traits like reacting quickly and remembering the best places to hide are key to eluding predators during these tough times but may come with trade-offs for males and females. The results show that an individual mouse’s cognitive strengths are linked to its survival odds, suggesting that the pressure to survive can shape basic cognition, Maille and Schradin write August 3 in Biology Letters.

Mix of brain training, physical therapy can help paralyzed patients

Training the brain could give paraplegics more control over their bodies.

After a year working with devices that link machine to brain, people paralyzed by spinal cord injuries were able to regain some movement and feeling in their legs, Miguel Nicolelis and colleagues report August 11 in Scientific Reports.

The training included an assortment of therapies with futuristic-looking gizmos, including virtual reality goggles and robotic exoskeletons that fasten over the body. All patients were better able to sense pain and touch, and half had their diagnosis upgraded from complete to partial paralysis.
“This is a very key milestone in the field of brain-machine interfaces,” Nicolelis said in a news briefing August 9.

But some scientists aren’t convinced that it’s such an advance.

Researchers have already proposed (and demonstrated) the benefits of brain-machine interface technology before, says Brendan Allison, a neuroscientist at the University of California, San Diego. While the new work is impressive and stands out for such long-term brain training, he says, “it’s simply not a breakthrough.”

Paralyzing spinal cord injuries can sever the bundle of nerves that carries messages from the brain to the body. So even if the brain tells the toes to move, the message dead-ends when it hits the injury site. For people with these severe injuries, there’s not much doctors can do to help, said Nicolelis, of Duke University. “They basically just try to get you adapted to life in a wheelchair.”

In 2012, he and colleagues started a project to help paraplegics walk, with assistance. With a stretchy cap placed over their heads to capture brain signals, patients would use their thoughts to control an exoskeleton, a robotic walker that held their bodies upright. The device would give paraplegics some mobility again — at least that was the goal. What the team found was far more exciting, Nicolelis said.
For one year, Nicolelis’ team worked with eight people who had been paralyzed for between three and 13 years. Twice a week for one hour, patients trained on a variety of rehabilitation tools. They learned how to control an avatar in virtual reality by imagining moving their legs, and later, how to use the brain-controlled exoskeleton. Patients also did hours of traditional physical therapy, including strengthening and stretching exercises.
About seven months into the project, the researchers noticed that all patients were beginning to regain control of one or more muscles below their spinal cord injury. And after 12 months, all patients experienced improvements in sensing touch and pain. One woman even gave birth to a child, and could feel the contractions during the delivery, Nicolelis said.

Though all patients had severe spinal cord injuries, he thinks some nerve cells must have survived and that the training rekindled their activity.

The amount of clinical recovery “is almost like a dream,” Nicolelis said. It may be enough to upgrade brain-machine interfaces, he said, from an assistive therapy (something that helps people walk — like crutches or a cane) to something that actually helps people recover.

Again, that’s not a new idea, says Donatella Mattia, a neurophysiologist at the Institute for Research and Health Care in Rome. Other scientists working with paralyzed stroke patients have already shown improvements in arm mobility after using brain-machine interfaces.

What’s more, teasing out cause and effect in the new study isn’t easy, says University of Houston neural engineer José Contreras-Vidal. Patients underwent a complicated blend of therapies, with different lengths of training. Despite the caveats, though, he thinks the work is a step forward. And in clinical research, he says, “even incremental is good.”

Bacteria-sized molecules created in lab

Scientists have created giant molecules — the size of bacteria — that may be useful in future quantum computers.

The molecules of unusual size are formed from pairs of Rydberg atoms — atoms with an electron that has been boosted into a high-energy state. Such electrons orbit far from their atom’s nucleus and, as a result, can feel the influence of faraway atoms.

To create the molecules, researchers cooled cesium atoms nearly to absolute zero, hitting them with lasers to form Rydberg atoms that bound together in pairs. These molecules are about one thousandth of a millimeter in size — a thousand times the size of a typical molecule — scientists report August 19 in Physical Review Letters.
“I think it’s fundamentally interesting and important because it’s such a curious thing,” says physicist David Petrosyan of the Institute of Electronic Structure & Laser at the Foundation for Research and Technology–Hellas in Heraklion, Greece. “The size of these molecules is huge.”

This is not the first time such molecules have been created, but the previous evidence was not clear-cut. “Before, maybe it wasn’t clear if this is really a molecule in the sense that it’s vibrating and rotating. It could have been just two atoms sitting there with very weak interactions or no interactions,” says Johannes Deiglmayr, a physicist at ETH Zürich and a coauthor of the study.

Deiglmayr and collaborators measured the molecules’ binding energies — the energy that holds the two atoms together. Additionally, the scientists made detailed calculations to predict the molecules’ properties. These calculations were “extensive and seemed to match really well with their measurements,” says physicist Phillip Gould of the University of Connecticut in Storrs.

The result has practical implications, Petrosyan notes. In quantum computers that use atoms as quantum bits, scientists perform computations by allowing atoms to interact. Rydberg atoms can interact with their neighbors over long distances, and when bound together, the atoms stay put at a consistent distance from one another — a feature that may improve the accuracy of calculations.

Previously, researchers have used rubidium atoms to make another type of large molecule, formed from Rydberg atoms bonded with normal atoms. But these wouldn’t be useful for quantum computation, Petrosyan says, as they rely on a different type of bonding mechanism.

How one scientist’s gut microbes changed over a year

Where you live and what you eat can rapidly affect the types of friendly bacteria inhabiting your body. To see how the microbes that inhabit the mouth and intestines change over time, Duke University computational biologist Lawrence David zealously chronicled his microbiome for an entire year. (For more on David and this experiment, see “Lawrence David’s gut check gets personal.”)

A stream plot (below, top graph) shows the ebb and flow of phyla of bacteria in his gut over time. The thickness of each stream indicates a bacterial group’s relative abundance in daily fecal samples.
David peered closer at the data in a horizon plot (above, bottom graph; colored squares at left indicate the phylum of the bacteria represented in each row). He first determined each type of bacteria’s normal abundance in his gut, then calculated how much they differed from the median abundance. Warmer colors (red, orange, yellow) indicate that bacteria in that group increased in abundance, and cooler colors (blue, green) indicate a decrease in abundance. Living abroad from day 71 to day 122 had a dramatic — but short-lived — effect on David’s microbiome.
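The transform behind a horizon plot like this one is straightforward to sketch. Below is a minimal illustration in Python, using made-up daily abundance values (not David’s actual data), of the deviation-from-median calculation described above:

```python
import numpy as np

# Hypothetical daily relative abundances (fractions) for one bacterial
# phylum over a year of samples; these values are invented for illustration.
rng = np.random.default_rng(0)
abundance = 0.4 + 0.1 * rng.standard_normal(365)

# Horizon-plot style transform: each day's deviation from the phylum's
# median abundance over the whole series.
median = np.median(abundance)
deviation = abundance - median

# Positive deviations would map to warm colors (red, orange, yellow),
# negative deviations to cool colors (blue, green).
above = deviation > 0
print(f"median abundance: {median:.3f}")
print(f"days above median: {above.sum()}, days below: {(~above).sum()}")
```

Coloring each row by these signed deviations, rather than raw abundance, is what makes short-lived shifts (like the travel period) stand out against each group’s baseline.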

Sugar industry sought to sugarcoat causes of heart disease

Using records unearthed from library storage vaults, researchers recently revealed that the sugar industry paid nutrition experts from Harvard University to downplay studies linking sugar and heart disease. Although the incident happened in the 1960s, it appears to have helped redirect the scientific narrative for decades.

The documents — which include correspondence, symposium programs and annual reports — show that the Sugar Research Foundation (as it was named at the time) paid professors who wrote a two-part review in 1967 in the New England Journal of Medicine. That report was highly skeptical of the evidence linking sugar to cardiovascular problems but accepting of the role of fat. The now-deceased professors’ overall conclusion left “no doubt” that reducing the risk of heart disease was a matter of reducing saturated fat and cholesterol, according to researchers from the University of California, San Francisco, who published their report online September 12 in JAMA Internal Medicine.

“Why does it matter today? The sugar industry helped deflect the way the research was developing,” says study coauthor Cristin Kearns, a dentist at UCSF’s Institute for Health Policy Studies. The Harvard team’s scientific favoritism had a role in directing research and policy attention toward fat and cholesterol. And in fact, the first dietary guidelines published by the federal government in 1980 said there was no convincing evidence that sugar causes heart disease, stating “the major health hazard from too much sugar is tooth decay.”
Following the publication of the Harvard report, fat and cholesterol went on to hijack the scientific agenda for decades, and even led to a craze of low-fat foods that often added sugar. Kearns points out that it was only in 2015 that dietary guidelines finally made a strong statement to limit sugar. Researchers writing this year in Progress in Cardiovascular Diseases note that current studies estimate that diets high in added sugars carry a three times higher risk of death from cardiovascular disease. (For its part, the Sugar Association says in a statement on its website that “the last several decades of research have concluded that sugar does not have a unique role in heart disease.”)

The level at which the food industry continues to influence nutrition research is still a much-debated topic. The Sugar Association’s statement acknowledged the secret deal occurred, but pointed out that “when the studies in question were published, funding disclosures and transparency standards were not the norm they are today.” Journals now require all authors to list conflicts of interest, especially funding from a source that has a vested interest in the outcome.

That doesn’t mean that trade groups and industry associations no longer have an influence on scientists, says Andy Bellatti, cofounder and strategic director of Dietitians for Professional Integrity, which has campaigned to push the Academy of Nutrition and Dietetics to sever its ties with industry. While a modern researcher could not take corporate money, even for speaking fees, without disclosure, the influences may be more subtle, he says. “We’re not talking about making up data, but perhaps influencing how a research question is framed.”

In a commentary published with the JAMA study, Marion Nestle, a nutrition researcher at New York University, wrote that industry influence has not disappeared. She cited recent New York Times investigations of Coca-Cola–sponsored research and Associated Press stories revealing that a candy trade group sponsored research attempting to show that children who eat sweets have a healthy body weight.

Bellatti says that researchers don’t necessarily want to be cozy with industry, but sometimes turn to commercial sources because non-biased research money is lacking. “The reason the food industry is able to do this is because there is such little public funding for nutrition and disease,” Bellatti says.
For that reason, the scientific community should not reject industry money wholesale, says John Sievenpiper, a physician and nutrition researcher at the University of Toronto. A study of his was once ridiculed on Nestle’s blog because the disclosures covered two full pages. He believes that any scientist who takes industry money should adhere to an even higher standard of openness, including releasing study protocols ahead of time so reviewers can make sure the research question was not changed midstream to favor a certain conclusion.

While many parallels have been made between the food and tobacco industries, Sievenpiper believes those comparisons miss the complicated nature of the human diet. Tobacco is always bad, never good. Sugar, fat, cholesterol and other components of diet are some of both, making research into their effects much more nuanced, he says. And unlike with tobacco, the solution can’t be to never eat them. He believes solutions won’t involve turning single nutrients like fat or sugar into villains, but promoting better overall patterns of eating, like the Mediterranean diet.

Kearns, who has spent the past 10 years looking into the sugar industry’s influence on science, isn’t finished yet. She says her curiosity first arose during a conference on gum disease and diabetes in 2007, when she noticed a lack of scientific discussion of sugar. She started out simply Googling industry influences. The trail eventually led her to scour library archives, until she came across dusty boxes of records from a closed sugar company in Colorado. “The first page I looked at in that archive had a confidential memo,” she says. “I knew I had something no one else had ever talked about before.”

She doesn’t see the research going sour any time soon. “This was their 226th project in 1965,” she says. “There’s a lot more to the story.”

CT scans show first X-rayed mummy in new light

X-rays were the iPhone 7 of the 1890s. Months after X-rays were discovered in late 1895, German physicist Walter Koenig put the latest in tech gadgetry to the test by scanning 14 objects, including the mummified remains of an ancient Egyptian child. Koenig’s image of the child’s knees represented the first radiographic investigation of a mummy.

At the time, details on the mummy itself were scant. Originally collected by explorer-naturalist Eduard Rueppell in 1817, the specimen lacked any sort of decoration that might link it to a particular dynasty or time period. Koenig’s X-ray image of the mummy served less to fill in any of those blanks and more to demonstrate the technology’s potential. Since then, radiographic images have revealed hidden artifacts, elucidated embalming techniques and even pinpointed health issues and diseases in mummies.
Now, biological anthropologist and Egyptologist Stephanie Zesch of the Reiss Engelhorn Museum in Mannheim, Germany, and colleagues have examined the mummy with modern imaging techniques. CT scans show that the child was a boy. His teeth suggest that he was 4 to 5 years old when he died. Radiocarbon dating places him in the Ptolemaic period, between 378 and 235 B.C., the researchers report online July 22 in the European Journal of Radiology Open.
The team also diagnosed a slew of health conditions: a common chest wall deformity called pectus excavatum, or sunken chest; bone density marks called Harris lines in his leg bones that indicate physiological stress; and an enlarged liver. The team attributes the distended liver to a parasitic infection like schistosomiasis, which is common in Egypt and sometimes lethal. Without any obvious signs of trauma, however, “it’s impossible to determine cause of death,” Zesch says.

Even with the all-seeing power of today’s CT scans, the culprit behind the boy’s demise remains under wraps.

There’s a new way to stop an earthquake: put a volcano in its path

Editor’s note: Science has retracted the study described in this article. The May 3, 2019, issue of the journal notes that a panel of outside experts convened by Kyoto University in Japan concluded in March 2019 that the paper contained falsified data, manipulated images and instances of plagiarism, and that these were the responsibility of lead author Aiming Lin, a geophysicist at Kyoto University. In agreement with the investigation’s recommendation, the authors withdrew the report.

A titanic volcano stopped a mega-sized earthquake in its tracks.

In April, pent-up stress along the Futagawa-Hinagu Fault Zone in Japan began to unleash a magnitude 7.1 earthquake. The rupture traveled about 30 kilometers along the fault until it reached Mount Aso, one of Earth’s largest active volcanoes. That’s where the quake met its demise, geophysicist Aiming Lin of Kyoto University in Japan and colleagues report online October 20 in Science. The quake moved across the volcano’s caldronlike crater and abruptly stopped, the researchers found.

Geophysical evidence suggests that a region of rising magma lurks beneath the volcano. This magma chamber created upward pressure plus horizontal stresses that acted as an impassable roadblock for the seismic slip powering the quake, the researchers propose. This rare meetup, the researchers warn, may have undermined the structural integrity surrounding the magma chamber, increasing the likelihood of an eruption at Aso.

Young planets carve rings and spirals in the gas around their suns

Growing planets carve rings and spiral arms out of the gas and dust surrounding their young stars, researchers report in three papers to be published in Astronomy & Astrophysics. And dark streaks radiating away from the star in one of the planet nurseries appear to be shadows cast onto the disk by the clumps of planet-building material close to the star. This isn’t the first time that astronomers have spied rings around young stars, but the new images provide a peek at what goes into building diverse planetary systems.

The three stars — HD 97048, HD 135344B and RX J1615.3-3255 — are all youthful locals in our galaxy. They sit between 460 and 600 light-years away; the oldest is a mere 8 million years old. All the stars have been studied before. But now three teams of researchers have used a new instrument at the Very Large Telescope in Chile to see extra-sharp details in the planet construction zone around each star.

The new instrument, named SPHERE, was designed to record images, spectra and polarimetry (the orientations of light waves) of young exoplanet families. Flexible mirrors within the instrument adapt to atmospheric turbulence above the telescope, and a tiny disk blocks light from the star, allowing faint details around the star to come into view.

There’s something cool about Arctic bird poop

Seabird poop helps the Arctic keep its cool, new research suggests.

The droppings release ammonia into the atmosphere, where it reacts with other chemicals in the air to form small airborne particles. Those particles form the heart of cloud droplets that reflect sunlight back into space, researchers propose November 15 in Nature Communications.

Even though the poop’s presence provides only modest cooling, understanding the effect could help scientists better predict how the region will fare under future climate change, says study coauthor Greg Wentworth. “The humor is not lost on me,” says Wentworth, an atmospheric chemist at Alberta Environment and Parks in Canada. “It’s a crucial connection, albeit somewhat comical.”
Arctic air temperatures are rising about twice as fast as temperatures in lower latitudes (SN: 12/26/15, p. 8), a shift that could threaten ecosystems and alter global weather patterns. Scientists still don’t fully understand Arctic climate, though.

Earlier this year, Wentworth and colleagues reported finding surprisingly abundant ammonia in Arctic air. They linked the chemical to the guano of the tens of millions of seabirds that flock to the frigid north each summer. Bacteria in the Arctic dine on the feces and release about 40,000 metric tons of ammonia annually. (The smell, Wentworth says, is awful.)

Once in the atmosphere, that ammonia reacts with sulfuric acid and water to form small particles that increase the number of cloud droplets, the researchers now propose. A cloud made up of a lot of smaller droplets will have more surface area and reflect more sunlight than a cloud made up of fewer but larger droplets.

This effect causes on average about 0.5 watts of summertime cooling per square meter in the Arctic, with more than a watt of cooling per square meter in some areas, the researchers estimate using a simulation of the Arctic’s atmospheric chemistry. For comparison, the natural greenhouse effect causes about 150 watts of warming per square meter worldwide. On top of that, carbon dioxide from human activities currently contributes about 1.6 watts per square meter of warming on average.

“Birds are in the equation now” when it comes to cloud formation, says Ken Carslaw, an atmospheric scientist at the University of Leeds in England. Understanding how climate change and human activities in the Arctic impact seabirds could be important to forecasting future temperature changes in the region, he says.