If you think the Amazon jungle is completely wild, think again

Welcome to the somewhat civilized jungle. Plant cultivation by native groups has shaped the landscape of at least part of South America’s Amazon forests for more than 8,000 years, researchers say.

Of dozens of tree species partly or fully domesticated by ancient peoples, 20 kinds of fruit and nut trees still cover large chunks of Amazonian forests, say ecologist Carolina Levis of the National Institute for Amazonian Research in Manaus, Brazil, and colleagues. Numbers and variety of domesticated tree species increase on and around previously discovered Amazonian archaeological sites, the scientists report in the March 3 Science.
Domesticated trees are “a surviving heritage of the Amazon’s past inhabitants,” Levis says.

The new report, says archaeologist Peter Stahl of the University of Victoria in Canada, adds to previous evidence that “resourceful and highly developed indigenous cultures” intentionally altered some Amazonian forests.

Southwestern and northwestern Amazonian forests contain the greatest numbers and diversity of domesticated tree species, Levis’ team found. Large stands of domesticated Brazil nut trees remain crucial resources for inhabitants of southwestern forests today.

Over the past 300 years, modern Amazonian groups may have helped spread some domesticated tree species, Levis’ group says. For instance, 17th century voyagers from Portugal and Spain established cacao plantations in southwestern Amazonian forests that exploited trees already cultivated by local communities, the scientists propose.

Their findings build on a 2013 survey of forest plots all across the Amazon led by ecologist and study coauthor Hans ter Steege of the Naturalis Biodiversity Center in Leiden, the Netherlands. Of approximately 16,000 Amazonian tree species, just 227 accounted for half of all trees, the 2013 study concluded.
Of that number, 85 species display physical features signaling partial or full domestication by native Amazonians before European contact, the new study finds. Studies of plant DNA and plant remains from the Amazon previously suggested that domestication started more than 8,000 years ago. Crucially, 20 domesticated tree species — five times more than the number expected by chance — dominate their respective Amazonian landscapes, especially near archaeological sites and rivers where ancient humans likely congregated, Levis’ team says.

Archaeologists, ecologists and crop geneticists have so far studied only a small slice of the Amazon, which covers an area equivalent to about 93 percent of the contiguous United States.

Levis and her colleagues suspect ancient native groups domesticated trees and plants throughout much of the region. But some researchers, including ecologist Crystal McMichael of the University of Amsterdam, say it’s more likely that ancient South Americans domesticated trees just in certain parts of the Amazon. In the new study, only Brazil nut trees show clear evidence of expansion into surrounding forests from an area of ancient domestication, McMichael says. Other tree species may have mainly been domesticated by native groups or Europeans in the past few hundred years, she says.

Most Americans like science — and are willing to pay for it

Americans don’t hate science. Quite the contrary. In fact, 79 percent of Americans think science has made their lives easier, a 2014 Pew Research Center survey found. More than 60 percent of people also believe that government funding for science is essential to its success.

But should the United States spend more money on scientific research than it already does? A layperson’s answer to that question depends on how much that person thinks the government already spends on science, a new study shows. When people find out just how much — or rather, how little — of the federal budget goes to science, support for more funding suddenly jumps.

To see how people’s opinions of public science spending were influenced by accurate information, mechanical engineer Jillian Goldfarb and political scientist Douglas Kriner, both at Boston University, embedded a small experiment in the 2014 Cooperative Congressional Election Study. The online survey was given to 1,000 Americans, carefully selected to represent the demographics of the United States. The questions were designed to be nonpartisan, and the survey was conducted long before the 2016 election.

The survey was simple. First, participants were asked to estimate what percentage of the federal budget was spent on scientific research. Once they’d guessed, half of the participants were told the actual amount that the federal government allocates for nondefense spending on research and development. In 2014, that figure was 1.6 percent of the budget, or about $67 billion. Finally, all the participants were asked if federal spending on science should be increased, decreased or kept the same.

The majority of participants had no idea how much money the government spends on science, and most wildly overestimated the actual amount. About half of the respondents put federal research spending somewhere between 5 and 20 percent of the budget. Another quarter estimated the figure at more than 20 percent of the budget, one very hefty chunk of change. Only the remaining quarter guessed that 1 to 2 percent of federal spending went to science.

When participants received no information about how much the United States spent on research, only about 40 percent of them supported more funding. But when they were confronted with the real numbers, support for more funding leapt from 40 to 60 percent.

Those two numbers hover on either side of 50 percent, but Kriner notes, “media coverage [would] go from ‘minority’ to ‘majority’” in favor of more funding — a potentially powerful message. What’s more, the support for science was present in Democrats and Republicans alike, Kriner and Goldfarb report February 1 in Science Communication.
“I think it contributes to our understanding of the aspects of federal spending that people don’t understand very well,” says Brendan Nyhan, a political scientist at Dartmouth College in Hanover, N.H. It’s not surprising that most people don’t know how much the government is spending on research. Nyhan points out that most people probably don’t know how much the government spends on education or foreign aid either.

When trying to gather more support for science funding, Goldfarb says, “tell people how little we spend, and how much they get in return.” Science as a whole isn’t that controversial. No one wants to stop exploring the universe or curing diseases, after all.

But when people — whether politicians or that guy in your Facebook feed — say they want to cut science funding, they won’t be speaking about science as a whole. “There’s a tendency to overgeneralize” and use a few controversial or perceived-to-be-wasteful projects to stand in for all of science, Nyhan warns.

When politicians want to cut funding, he notes, they focus on specific controversial studies or areas — such as climate change, stem cells or genetically modified organisms. They might highlight studies that seem silly, such as those that ended up in former Senator Tom Coburn’s “Wastebook” (a mantle now taken up by Senator Jeff Flake). Take those issues to the constituents, and funding might end up in jeopardy anyway.

Kriner hopes their study’s findings might prove useful even for controversial research areas. “One of the best safeguards against cuts is strong public support for a program,” he explains. “Building public support for science spending may help insulate it from budget cuts — and our research suggests a relatively simple way to increase public support for scientific research.”

But he worries that public support may not stay strong if science becomes too much of a political pawn. The study showed that both Republicans and Democrats supported more funding for science when they knew how little was spent. But “if the current administration increases its attacks on science spending writ large … it could potentially politicize federal support for all forms of scientific research,” Kriner says. And the stronger the politics, the more people on both sides of the aisle grow resistant to hearing arguments from the other side.

Politics aside, Goldfarb and Kriner’s data show that Americans really do like and support science. They want to pay for it. And they may even want to shell out some more money, when they know just how little they already spend.

Trackers may tip a warbler’s odds of returning to its nest

Strapping tiny trackers called geolocators to the backs of birds can reveal a lot about where the birds go when they migrate, how they get there and what happens along the way. But ornithologists are finding that these cool backpacks could have not-so-cool consequences.

Douglas Raybuck of Arkansas State University and his colleagues outfitted some cerulean warblers (Setophaga cerulea) with geolocators and some with simple color tags to test the effects the locators might have on breeding and reproduction. This particular species globe-trots from its nesting grounds in the eastern United States to wintering grounds in South America and back each year. While the backpacks didn’t affect reproduction, birds wearing the devices were less likely than those wearing tags to return to the same breeding grounds the next year. The birds may have gotten off track, cut their trips short or died, possibly due to extra weight or drag from the backpack, the team reports May 3 in The Condor.

The study adds to conflicting evidence that geolocators affect some birds in negative ways, such as altering their breeding biology. At best, potential downsides vary from bird to bird and backpack to backpack. But that shouldn’t stop researchers from using geolocators to study migrating birds, the researchers argue, because the devices pinpoint areas crucial to migrating birds and can aid in conservation efforts.

Flight demands may have steered the evolution of bird egg shape

The mystery of why birds’ eggs come in so many shapes has long been up in the air. Now new research suggests adaptations for flight may have helped shape the orbs.

Stronger fliers tend to lay more elongated eggs, researchers report in the June 23 Science. The finding comes from the first large analysis of the way egg shape varies across bird species, from the almost perfectly spherical egg of the brown hawk owl to the raindrop-shaped egg of the least sandpiper.
“Eggs fulfill such a specific role in birds — the egg is designed to protect and nourish the chick. Why there’s such diversity in form when there’s such a set function was a question that we found intriguing,” says study coauthor Mary Caswell Stoddard, an evolutionary biologist at Princeton University.

Previous studies have suggested many possible advantages for different shapes. Perhaps cone-shaped eggs are less likely to roll out of the nest of cliff-dwelling birds; spherical eggs might be more resilient to damage in the nest. But no one had tested such hypotheses across a wide spectrum of birds.

Stoddard and her team analyzed almost 50,000 eggs from 1,400 species, representing about 14 percent of known bird species. The researchers boiled each egg down to its two-dimensional silhouette and then used an algorithm to describe each egg using two variables: how elliptical versus spherical the egg is and how asymmetrical it is — whether it’s pointier on one end than the other.
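The two descriptors can be illustrated with a toy calculation on an idealized silhouette. This is a simplified sketch of the idea, not the study's actual algorithm; the width-profile representation and the exact formulas here are assumptions for illustration only.

```python
# Toy egg-shape descriptors: ellipticity and asymmetry from a 2-D silhouette.
# The silhouette is represented as cross-sectional widths sampled evenly
# along the egg's long axis. Simplified illustration, not the published method.

def egg_descriptors(widths, length):
    """Return (ellipticity, asymmetry) for a silhouette of given length."""
    max_width = max(widths)
    # Ellipticity: how elongated the egg is
    # (0 for a sphere, larger for longer-than-wide eggs).
    ellipticity = length / max_width - 1.0
    # Asymmetry: how far the widest point sits from the midpoint of the
    # long axis, as a fraction of the axis (0 for a symmetric ellipse,
    # larger for eggs that are pointier on one end).
    widest_index = widths.index(max_width)
    midpoint = (len(widths) - 1) / 2
    asymmetry = abs(widest_index - midpoint) / (len(widths) - 1)
    return ellipticity, asymmetry

# A sphere-like egg: widest exactly in the middle, length equals max width.
round_egg = [2.0, 3.5, 4.0, 3.5, 2.0]
# A raindrop-like egg: elongated, with the widest point shifted to one end.
pointy_egg = [2.0, 4.0, 3.5, 2.5, 1.0]

e1, a1 = egg_descriptors(round_egg, length=4.0)
e2, a2 = egg_descriptors(pointy_egg, length=6.0)
```

On these toy silhouettes, the sphere-like egg scores near zero on both axes, while the raindrop-like egg scores high on both, mirroring the brown hawk owl versus least sandpiper contrast above.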

Next, the researchers looked at the way these two traits vary across the bird family tree. One pattern jumped out: Species that are stronger fliers, as measured by wing shape, tend to lay more elliptical or asymmetrical eggs, says study coauthor L. Mahadevan, a mathematician and biologist at Harvard University.
Mahadevan cautions that the data show only an association, but the researchers propose one possible explanation for the link between flying and egg shape. Adapting to flight streamlined bird bodies, perhaps also narrowing the reproductive tract. That narrowing would have limited the width of an egg that a female could lay. But since eggs provide nutrition for the chick growing inside, shrinking eggs too much would deprive the developing bird. Elongated eggs might have been a compromise, keeping egg volume up without increasing girth, Stoddard suggests. Asymmetry can increase egg volume in a similar way.

Testing a causal connection between flight ability and egg shape is tough “because of course we can’t replay the whole tape of life again,” says Claire Spottiswoode, a zoologist at the University of Cambridge who wrote a commentary accompanying the study. Still, Spottiswoode says the evidence is compelling: “It’s a very plausible argument.”

Santiago Claramunt, associate curator of ornithology at the Royal Ontario Museum in Toronto, isn’t convinced that flight adaptations played a driving role in the evolution of egg shape. “Streamlining in birds is determined more by plumage than the shape of the body — high performing fliers can have rounded, bulky bodies,” he says, which wouldn’t give elongated eggs the same advantage over other egg shapes. He cites frigate birds and swifts as examples, both of which make long-distance flights but have fairly broad bodies. “There’s certainly more going on there.”

Indeed, some orders of birds showed a much stronger link between flying and egg shape than others did. And while other factors — like where birds lay their eggs and how many they lay at once — weren’t significantly related to egg shape across birds as a whole, they could be important within certain branches of the bird family tree.

Baby-led weaning won’t necessarily ward off extra weight

When my younger daughter was around 6 months old, we gave her mashed-up prunes. She grimaced and shivered a little, appearing to be absolutely disgusted. But then she grunted and reached for more.

Most babies are ready for solid food around 6 months of age, and feeding them can be fun. One of the more entertaining approaches does not involve a spoon. Called baby-led weaning, it involves allowing babies to feed themselves appropriate foods.

Proponents of the approach say that babies become more skilled eaters when allowed to explore on their own. They’re in charge of getting food into their own mouths, gumming it and swallowing it down — all skills that require muscle coordination. When the right foods are provided (yes to soft steamed broccoli; no to whole grapes), babies who feed themselves are no more likely to choke than their spoon-fed peers.

Some baby-led weaning proponents also suspected that the method might ward off obesity, and a small study suggested as much. The idea is that babies allowed to feed themselves might better learn how to regulate their food intake, letting hunger and fullness guide them to a reasonable calorie count. But a new study that looked at the BMIs of babies who fed themselves and those who didn’t found that babies grew similarly with either eating style.

A clinical trial of about 200 mother-baby pairs in New Zealand tracked two different approaches to eating and their impact on weight. Half of the moms were instructed to feed their babies as they normally would, which for most meant spoon-feeding their babies purees, at least early on. The other half were told to give their babies only breast milk or formula until 6 months of age, and after that, to encourage the babies to feed themselves. These mothers also received breastfeeding support.

At the 1- and 2-year marks, the babies’ average BMI z-scores were similar, regardless of feeding method, researchers report July 10 in JAMA Pediatrics. (A BMI z-score takes age and sex into account.) Baby-led weaning actually produced slightly more overweight babies than the other approach, but not enough for the difference to be meaningful. At age 2, 10.3 percent of baby-led weaning babies were considered overweight, compared with 6.4 percent of traditionally fed babies. The two groups of babies seemed to take in about the same energy from food, analyses of the nutritional value and amount of food eaten revealed.
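A BMI z-score is simply a child's BMI expressed in standard deviations from a reference average for that child's age and sex. A minimal sketch of the calculation, using made-up reference values (real studies use published growth standards with tables for each age in months and each sex):

```python
# Standardize a BMI against a reference population for a given age and sex.
# The reference table below is invented for illustration; it is NOT real
# growth-standard data.

REFERENCE = {
    # (age_years, sex): (mean BMI, standard deviation) -- hypothetical numbers
    (2, "F"): (16.0, 1.5),
    (2, "M"): (16.3, 1.5),
}

def bmi_z_score(bmi, age_years, sex):
    """How many standard deviations a BMI sits from the reference mean."""
    mean, sd = REFERENCE[(age_years, sex)]
    return (bmi - mean) / sd

# A 2-year-old girl with a BMI of 17.5 sits one standard deviation above
# the (hypothetical) reference mean:
z = bmi_z_score(17.5, 2, "F")  # -> 1.0
```

Because the score is standardized this way, averages can be compared fairly across groups of babies with different ages and sex mixes.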

The trial found a few other differences between the two groups. Babies who did baby-led weaning exclusively breastfed for longer, a median of about 22 weeks. Babies in the other group were exclusively breastfed for a median of about 17 weeks. Babies in the baby-led weaning group were also more likely to have held off on solid food until 6 months of age.

While baby-led weaning may not protect babies against being overweight, the study did uncover a few perks of the approach. Parents reported that babies who fed themselves seemed less fussy about foods. These babies also reportedly enjoyed eating more (though my daughter’s prune fake-out face is evidence that babies’ inner opinions can be hard to read). Even so, these data seem to point toward a more positive experience all around when using the baby-led weaning approach. That’s ideal for both experience-hungry babies and the parents who get to savor watching them eat.

This newfound hermit crab finds shelter in corals, not shells

A new species of hermit crab discovered in the shallow waters of southern Japan has been enjoying the perks of living like a peanut worm. Like the worms, the 7- to 8-millimeter-long hermit crab uses corals as a covering, researchers report September 20 in PLOS ONE.

Other kinds of hermit crabs live in coral reefs, but typically move in and out of a series of mollusk shells as the crabs grow. Diogenes heteropsammicola is the first hermit crab known to form a mutually beneficial relationship with two species of mobile corals called walking corals. Unlike more familiar coral species, these walking corals don’t grow in colonies and aren’t attached to the seafloor. Instead, each host coral grows with and around a crab, forming a cavity in the coral skeleton that provides a permanent home for the crustacean. In exchange, the crab helps the coral “walk.”
Walking corals are already known to be in a symbiotic relationship with a different sea creature — flexible, marine peanut worms called sipunculids. A symbiotic shift between such distantly related species as the worms and the crab is rare because organisms in a mutualistic relationship tend to be specialized and completely dependent on one another, says study coauthor Momoko Igawa, an ecologist at Kyoto University in Japan.
But similar to the worms, D. heteropsammicola appears to be well-adapted to live in the corals. Its extra slim body can slip inside the corals’ narrow cavity. And unlike other hermit crabs — whose tails curve to the right to fit into spiral shells — D. heteropsammicola’s tail is symmetrical and can curl either way, just like the corals’ opening.
“Being able to walk around in something that is going to grow larger as you grow larger, that’s a big plus,” says Jan Pechenik, a biologist at Tufts University in Medford, Mass., who was not involved in the study. A typical hermit crab that can’t find a larger shell to move into “really is in trouble.”

D. heteropsammicola’s relationship with walking corals may begin in a similar way as it does with sipunculan worms, Igawa says. A walking coral larva latches onto a tiny mollusk shell containing a juvenile hermit crab and starts to grow. When the hermit crab outgrows the shell, the crustacean moves into the readily available host coral’s crevice, and the shell remains encapsulated in the coral.

By observing the hermit crab in an aquarium, Igawa and coauthor Makoto Kato, also an ecologist at Kyoto University, determined that the crab provides the corals with the same services as the worms: transportation and preventing the corals from being overturned by currents or buried in sediment.

Igawa hopes to search for this new hermit crab in Indonesia, a region where walking corals are normally found. Plus, because walking coral fossils are easy to come by in Japan, she also wants “to reveal the evolutionary history of the symbioses of walking corals [with] sipunculans and hermit crabs by observing these fossils.”

Seeing an adult struggle before succeeding inspires toddlers to persevere too

I recently wrote about the power that adults’ words can have on young children. Today, I’m writing about the power of adults’ actions. Parents know, of course, that their children keep a close eye on them. But a new study provides a particularly good example of a watch-and-learn moment: Toddlers who saw an adult struggle before succeeding were more likely to persevere themselves.

Toddlers are “very capable learners,” says study coauthor Julia Leonard, a cognitive developmental psychologist at MIT. Scientists have found that these youngsters pick up on abstract concepts and new words after just a few exposures. But it wasn’t clear whether watching adults’ actions would actually change the way toddlers tackle a problem.

To see whether toddlers could soak up an adult’s persistence, Leonard and her colleagues tested 262 13- to 18-month-olds (the average age was 15 months). Some of the children watched an experimenter try to retrieve a toy stuck inside a container. In some cases, the experimenter quickly got the toy out three times within 30 seconds — easy. Other times, the experimenter struggled for the entire 30 seconds before finally getting the toy out. The experimenter then repeated the process for a different problem, removing a carabiner toy from a keychain. Some kids didn’t see any experimenter demonstration.

Just after watching an adult struggle (or not), the toddlers were given a light-up cube. It had a big, useless button on one side. Another button — small and hidden — actually controlled the lights. The kids knew the toy could light up, but didn’t know how to turn the lights on.

Though the big button did nothing, that didn’t stop the children from poking it. But here’s the interesting part: Compared with toddlers who had just watched an adult succeed effortlessly, or not watched an adult do anything at all, the toddlers who had seen the adult struggle pushed the button more. These kids persisted, even though they never found success.

The sight of an adult persevering nudged the children toward trying harder themselves, the researchers conclude in the Sept. 22 Science. Leonard cautions that it’s hard to pull parenting advice from a single laboratory-based study, but still, “there may be some value in letting children see you work hard to achieve your goals,” she says.

Observing the adults wasn’t the only thing that determined the toddlers’ persistence, not by a long shot. Some kids might simply be more tenacious than others. In the experiments, some of the children who didn’t see an experimenter attempt a task, or who saw an experimenter quickly succeed, were “incredibly gritty,” Leonard says. And some of the kids who watched a persistent adult still gave up quickly themselves. That’s not to mention the fact that these toddlers were occasionally tired, hungry and cranky, all of which can affect whether they give up easily. Despite all of this variation, the copycat effect remained: Kids were more likely to persist when they had just seen a persistent adult.

As Leonard says, this is just one study and it can’t explain the complex lives of toddlers. Still, one thing is clear, and it’s something that we would all do well to remember: “Infants are watching your behavior attentively and actively learning from what you do,” Leonard says.

These spiders may have the world’s fastest body clocks

WASHINGTON, D.C. — If it takes you a while to recover from a few lost hours of sleep, be grateful you aren’t an orb weaver.

Three orb-weaving spiders — Allocyclosa bifurca, Cyclosa turbinata and Gasteracantha cancriformis — may have the shortest natural circadian rhythms discovered in an animal thus far, researchers reported November 12 at the Society for Neuroscience’s annual meeting.

Most animals have natural body clocks that run closer to the 24-hour day-night cycle, plus or minus a couple of hours, and light helps reset the body’s timing each day. But the three orb weavers’ body clocks average about 17.4, 18.5 and 19 hours, respectively. This means the crawlers must shift their cycle of activity and inactivity — the spider equivalent of wake and sleep cycles — by about five hours each day to keep up with the normal solar cycle.
“That’s like flying across more than five time zones, and experiencing that much jet lag each day in order to stay synchronized with the typical day-night cycle,” said Darrell Moore, a neurobiologist at East Tennessee State University in Johnson City.

“Circadian clocks actually keep us from going into chaos,” he added. “Theoretically, [the spiders] should not exist.”
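The jet-lag comparison is simple arithmetic: the shift each spider needs every day is the gap between the 24-hour solar day and its free-running clock. A quick check of the figures reported above:

```python
# Daily phase shift each orb weaver needs to stay synchronized with a
# 24-hour day, given the free-running circadian periods (in hours)
# reported by Moore's team.
SOLAR_DAY = 24.0

clock_periods = {
    "Allocyclosa bifurca": 17.4,
    "Cyclosa turbinata": 18.5,
    "Gasteracantha cancriformis": 19.0,
}

# Gap between the solar day and each spider's internal clock.
daily_shift = {name: SOLAR_DAY - period
               for name, period in clock_periods.items()}
# Shifts of roughly 6.6, 5.5 and 5.0 hours -- the "more than five time
# zones" of daily jet lag that Moore describes.
```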

For most animals, internal clocks help them perform recurring daily activities, like eating, sleeping and hunting, at the most appropriate time of day. Previous studies have shown that animals out of sync with the 24-hour solar cycle are usually less likely to produce healthy offspring than those in sync.
Moore and his colleagues were surprised to find the short circadian clocks while studying aggression and passivity in an orb weaver. The team was trying to see if there was a circadian component to these predatory and preylike behaviors, when they discovered C. turbinata’s exceptionally short circadian cycles. “I was looking at this thing thinking, ‘That can’t be right,’” said Moore.

To measure spiders’ natural biological clocks without the resetting effect of the sun, the researchers placed 18 species of spiders in constant darkness and monitored their motion. Three orb weaver species in particular had incredibly short cycles of activity and inactivity.

As far as the researchers know, the short cycle does not seem to be a problem for these spiders. In fact, it might even be useful. Short clocks may help these orb weavers avoid becoming the proverbial “worm” to the earliest birds, the researchers hypothesize. Since the spiders become more active at dusk and begin spinning their webs three to five hours before dawn, they can avoid predators that hunt in the day.

Throughout the day, the spiders remain motionless on their web, prepared to pounce on their next meal. By midday, the spiders’ truncated circadian clocks should have reset, sparking a new round of activity. But in the five to seven hours of daylight left, the orb weavers remain inactive. It’s hard to say whether the spiders are actually resting, Moore said. The researchers suspect that light may delay the onset of another short circadian cycle each day, helping the spiders stay synchronized with the 24-hour environmental cycle.

“The method or molecular mechanism will be really fascinating to figure out,” said Sigrid Veasey, a neuroscientist at the University of Pennsylvania’s Perelman School of Medicine. She is curious to know how the spiders can be so far off from “normal” circadian periods, and still be able to match their activity to the 24-hour light and dark cycle.

Determining the differences between short and normal period clocks in spiders may help researchers find out why and how different circadian clocks are suited to the particular environmental challenges of each species, Moore said.

Blowflies use drool to keep their cool

SAN FRANCISCO — Blowflies don’t sweat, but they have raised cooling by drooling to a high art.

In hot times, sturdy, big-eyed Chrysomya megacephala flies repeatedly release — and then retract — a droplet of saliva, Denis Andrade reported January 4 at the annual meeting of the Society for Integrative and Comparative Biology. This process isn’t sweating. Blowfly droplets put the cooling power of evaporation to use in a different way, said Andrade, who studies ecology and evolution at the Universidade Estadual Paulista in Rio Claro, Brazil.
As saliva hangs on a fly’s mouthparts, the droplet starts to lose some of its heat to the air around it. When the fly droplet has cooled a bit, the fly then slurps it back in, Andrade and colleagues found. Micro-CT scanning showed the retracted droplet in the fly’s throatlike passage near the animal’s brain. The process eased temperatures in the fly’s body by about four degrees Celsius below ambient temps. That may be preventing dangerous overheating, he proposed. The same droplet seemed to be released, cooled, drawn back in and then released again several times in a row.

Andrade had never seen a report of this saliva droplet in-and-out before he and a colleague noticed it while observing blowfly temperatures for other reasons. But in 2012, Chloé Lahondère and a colleague described how Anopheles stephensi mosquitoes exude a liquid droplet that dangles and cools, but at the other end of the animal.

Mosquitoes, which let their body temperatures float with that of their environment, can get a heat rush when drinking from warm-blooded mammals. While drinking, the insects release a blood-tinged urine droplet, which dissipates some of the heat. There’s some fluid movement within the droplet, says Lahondère, now at Virginia Tech in Blacksburg, but whether any of the liquid gets recaptured by the body the way fly drool is, she can’t say.

Somewhere in the brain is a storage device for memories

People tend to think of memories as deeply personal, ephemeral possessions — snippets of emotions, words, colors and smells stitched into our unique neural tapestries as life goes on. But a strange series of experiments conducted decades ago offered a different, more tangible perspective. The mind-bending results have gained unexpected support from recent studies.

In 1959, James Vernon McConnell, a psychologist at the University of Michigan in Ann Arbor, painstakingly trained small flatworms called planarians to associate a shock with a light. The worms remembered this lesson, later contracting their bodies in response to the light.
One weird and wonderful thing about planarians is that they can regenerate their bodies — including their brains. When the trained flatworms were cut in half, they regrew either a head or a tail, depending on which piece had been lost. Not surprisingly, worms that kept their heads and regrew tails retained the memory of the shock, McConnell found. Astonishingly, so did the worms that grew replacement heads and brains. Somehow, these fully operational, complex arrangements of brand-spanking-new nerve cells had acquired the memory of the painful shock, McConnell reported.

In subsequent experiments, McConnell went even further, attempting to transfer memory from one worm to another. He tried grafting the head of a trained worm onto the tail of an untrained worm, but he couldn’t get the head to stick. He injected trained planarian slurry into untrained worms, but the recipients often exploded. Finally, he ground up bits of the trained planarians and fed them to untrained worms. Sure enough, after their meal, the untrained worms seemed to have traces of the memory — the cannibals recoiled at the light.

The implications were bizarre, and potentially profound: Lurking in that pungent planarian puree must be a substance that allowed animals to literally eat one another’s memories.

These outlandish experiments aimed to answer a question that had been needling scientists for decades: What is the physical basis of memory? Somehow, memories get etched into cells, forming a physical trace that researchers call an “engram.” But the nature of these stable, specific imprints is a mystery.
Today, McConnell’s memory transfer episode has largely faded from scientific conversation. But developmental biologist Michael Levin of Tufts University in Medford, Mass., and a handful of other researchers wonder if McConnell was onto something. They have begun revisiting those historical experiments in the ongoing hunt for the engram.

Applying powerful tools to the engram search, scientists are already challenging some widely held ideas about how memories are stored in the brain. New insights haven’t yet revealed the identity of the physical basis of memory, though. Scientists are chasing a wide range of possibilities. Some ideas are backed by strong evidence; others are still just hunches. In pursuit of the engram, some researchers have even searched for clues in memories that persist in brains that go through massive reorganization.
Synapse skeptics
One of today’s most entrenched explanations puts engrams squarely within the synapses, connections where chemical and electrical messages move between nerve cells, or neurons. These contact points are strengthened when bulges called synaptic boutons grow at the ends of message-sending axons and when hairlike protrusions called spines decorate message-receiving dendrites.

In the 1970s, researchers described evidence that as an animal learned something, these neural connections bulked up, sprouting more contact points: synaptic boutons and dendritic spines. With stronger connection points, cells could fire in tandem when the memory needed to be recalled. Stronger connections mean stronger memory, as the theory goes.

This process of bulking up, called long-term potentiation, or LTP, was thought to offer an excellent explanation for the physical basis of memory, says David Glanzman, a neuroscientist at UCLA who has studied learning and memory since 1980. “Until a few years ago, I implicitly accepted this model for memory,” he says. “I don’t anymore.”

Glanzman has good reason to be skeptical. In his lab, he studies the sea slug Aplysia, an organism that he first encountered as a postdoctoral student in the Columbia University lab of Nobel Prize–winning neuroscientist Eric Kandel. When shocked in the tail, these slugs protectively withdraw their siphon and gill. After multiple shocks, the animals become sensitized and withdraw faster. (The distinction between learning and remembering is hazy, so researchers often treat the two as similar actions.)

After neurons in the sea slugs had formed the memory of the shock, the researchers saw stronger synapses between the nerve cells that sense the shock and the ones that command the siphon and gill to move. When the memory was made, the synapse sprouted more message-sending bumps, Glanzman’s team reported in eLife in 2014. And when the memory was weakened with a drug that prevents new proteins from being made, some of these bumps disappeared. “That made perfect sense,” Glanzman says. “We expected the synaptic growth to revert and go back to the original, nonlearned state. And it did.”

But the answer to the next logical question was a curve ball. If the memory was stored in bulked-up synapses, as researchers thought, then the new contact points that appear when a memory is formed should be the same ones that vanish when the memory is subsequently lost, Glanzman reasoned. That’s not what happened — not even close. “We found that it was totally random,” Glanzman says. “Completely random.”
Research from Susumu Tonegawa, a Nobel Prize–winning neuroscientist at MIT, turned up a result in mice that was just as startling. His project relies on sophisticated tools, including optogenetics. With this technique, the scientists activated specific memory-storing neurons with light. The researchers call those memory-storing cells “engram cells.”

In a parallel to the sea slug experiment, Tonegawa’s team conditioned mice to fear a particular cage and genetically marked the cells that somehow store that fear memory. After the mice learned the association, the researchers saw more dendritic spines in the neurons that store the memory — evidence of stronger synapses.

When the researchers caused amnesia with a drug, that newly formed synaptic muscle went away. “We wiped out the LTP, completely wiped it out,” says neuroscientist Tomás Ryan, who conducted the study in Tonegawa’s lab and reported the results with Tonegawa and colleagues in 2015 in Science.

And yet the memory wasn’t lost. With laser light, the researchers could still activate the engram cells — and the memory they somehow still held. The memory, then, must be stored in something other than the strength of the synapses.

The surprising results suggest that researchers may have been sidetracked, focusing too hard on synaptic strength as a memory storage system, says Ryan, now at Trinity College Dublin. “That approach has produced some 12,000 papers on the topic, but it hasn’t been very successful in explaining how memory works.”

Silent memories
The finding that memory storage and synaptic strength aren’t always tied together points to an important distinction that may help untangle the research. The cellular machinery required for calling up memories is not necessarily the same machinery that stores them. The search for what stores memories may have been muddled by results on how cells call those memories up.

Ryan, Tonegawa and Glanzman all think that LTP, with its bulked-up synapses, is important for retrieving memories, but not the thing that actually stores them. It’s quite possible to have stored memories that aren’t readily accessible, what Glanzman calls “occult memories” and Tonegawa refers to as “silent engrams.” Both think the concept applies more broadly than in just the sea slugs and mice they study.

Glanzman explains the situation by considering a talented violin player. “If you cut off my hands, I’m not able to play the violin,” he says. “But it doesn’t mean that I don’t know how to play the violin.” The analogy is overly simple, he says, “but that’s how I think of synapses. They enable the memory to be expressed, but they are not where the memory is.”

In a paper published last October in Proceedings of the National Academy of Sciences, Tonegawa and colleagues created silent engrams of a cage in which mice received a shock. The mice didn’t seem to remember the room paired with the shock, suggesting that the memory was silent. But the memory could be called up after a genetic tweak beefed up synaptic connections by boosting the number of synaptic spines specifically among the neurons that stored the memory.

Those results add weight to the idea that synaptic strength is crucial for memory recall, but not storage, and they also hint that, somehow, the brain stores many inaccessible memory traces. Tonegawa suspects that these silent engrams are quite common.

Finding and reactivating silent engrams “tells us quite a bit about how memory works,” Tonegawa says. “Memory is not always active for you. You learn it, you store it,” but depending on the context, it might slip quietly into the brain and remain silent, he says. Consider an old memory from high school that suddenly pops up. “Something triggered you to recall — something very specific — and that probably involves the conversion of a silent engram to an active engram,” Tonegawa says.

But engrams, silent or active, must still be holding memory information somehow. Tonegawa thinks that this information is stored not in synapses’ strength, but in synapses’ very existence. When a specific memory is formed, new connections are quickly forged, creating anatomical bridges between constellations of cells, he suspects. “That defines the content of memory,” he says. “That is the substrate.”

These newly formed synapses can then be beefed up, leading to the memory burbling up as an active engram, or pared down and weakened, leading to a silent engram. Tonegawa says this idea requires less energy than the LTP model, which holds that memory storage requires constantly revved up synapses full of numerous contact points. Synapse existence, he argues, can hold memory in a latent, low-maintenance state.
“The brain doesn’t even have to recognize that it’s a memory,” says Ryan, who shares this view. Stored in the anatomy of arrays of neurons, memory is “in the shape of the brain itself,” he says.
A temporary vessel
Tonegawa is confident that the very existence of physical links between neurons stores memories. But other researchers have their own notions.

Back in the 1950s, McConnell suspected that RNA, cellular material that can help carry out genetic instructions but can also carry information itself, might somehow store memories.

This unorthodox idea, that RNA is involved in memory storage, has at least one modern-day supporter in Glanzman, who plans to present preliminary data at a meeting in April suggesting that injections of RNA can transfer memory between sea slugs.

Glanzman thinks that RNA is a temporary storage vessel for memories, though. The real engram, he suggests, is the folding pattern of DNA in cells’ nuclei. Changes to how tightly DNA is packed can govern how genes are deployed. Those changes, part of what’s known as the epigenetic code, can be made — and even transferred — by roving RNA molecules, Glanzman argues. He is quick to point out that his idea, memory transfer by RNA, is radical. “I don’t think you could find another card-carrying Ph.D. neuroscientist who believes that.”

Other researchers, including neurobiologist David Sweatt of Vanderbilt University in Nashville, also suspect that long-lasting epigenetic changes to DNA hold memories, an idea Sweatt has been pursuing for 20 years. Because epigenetic changes can be stable, “they possess the unique attribute necessary to contribute to the engram,” he says.

And still more engram ideas abound. Some results suggest that a protein called PKM-zeta, which helps keep synapses strong, preserves memories. Other evidence suggests a role for structures called perineuronal nets, rigid sheaths that wrap around neurons. Holes in these nets allow synapses to peek through, solidifying memories, the reasoning goes (SN: 11/14/15, p. 8). A different line of research focuses on proteins that incite others to misfold and aggregate around synapses, strengthening memories. Levin, at Tufts, has his own take. He thinks that bioelectrical signals, detected by voltage-sensing proteins on the outside of cells, can store memories, though he has no evidence yet.

Beyond the brain
Levin’s work on planarians, reminiscent of McConnell’s cannibal research, may even prod memory researchers to think beyond the brain. Planarians can remember the texture of their terrain, even using a new brain, Levin and Tal Shomrat, now at the Ruppin Academic Center in Mikhmoret, Israel, reported in 2013 in the Journal of Experimental Biology. The fact that memory somehow survived decapitation hints that signals outside of the brain may somehow store memories, even if temporarily.

Memory clues may also come from other animals that undergo extreme brain modification over their lifetimes. As caterpillars transition to moths, their brains change dramatically. But a moth that had learned as a caterpillar to avoid a certain odor paired with a shock holds onto that information, despite having a radically different brain, researchers have found.

Similar results come from mammals, such as the Arctic ground squirrel, which massively prunes its synaptic connections to save energy during torpor in the winter. Within hours of the squirrel waking up, the pruned synaptic connections grow back. Remarkably, some old memories seem to survive the experience. The squirrels have been shown to remember familiar squirrels, as well as how to perform motor feats such as jumping between boxes and crawling through tubes. Even human babies retain taste and sound memories from their time in the womb, despite a very changed brain.

These extreme cases of memory persistence raise lots of basic questions about the nature of the engram, including whether memories must always be stored in the brain. But it’s important that the engram search isn’t restricted to the most advanced forms of humanlike memory, Levin says. “Memory starts, evolutionarily, very early on,” he says. “Single-celled organisms, bacteria, slime molds, fish, these things all have memory.… They have the ability to alter their future behavior based on what’s happened before.”

The diversity of ideas — and of experimental approaches — highlights just how unsettled the engram question remains. After decades of work, the field is still young. Even if the physical identity of the engram is eventually discovered and universally agreed on, a bigger question still looms, Levin says.

“The whole point of memory is that you should be able to look at the engram and immediately know what it means,” he says. Somehow, the physical message of a memory, in whatever form it ultimately takes, must be translated into the experience of a memory by the brain. But no one has a clue how this occurs.

At a 2016 workshop, a small group of researchers gathered to discuss engram ideas that move beyond synapse strength. “All the rebels came together,” Glanzman says. The two-day debate didn’t settle anything, but it was “very valuable,” says Ryan, who also attended. He coauthored a summary of the discussion that appeared in the May 2017 Annals of the New York Academy of Sciences. “Because the mind is part of the natural world, there is no reason to believe that it will be any less tangible and ultimately comprehensible than other components,” Ryan and coauthors optimistically wrote.

For now, the field hasn’t been able to explain memories in tangible terms. But research is moving forward, in part because of its deep implications. The hunt for memories gets at the very nature of identity, Levin says. “What does it mean to be a coherent individual that has a coherent bundle of memories?” The elusive identity of the engram may prove key to answering that question.