
    Brain study finds circuits that may help you keep your cool

    This confocal microscopy image of the locus coeruleus region of the mouse brain displays noradrenergic neurons in red and GABAergic neurons in cyan. A noradrenergic neuron recorded in the study is highlighted in white.

    The big day has come: You are taking your road test to get your driver’s license. As you start your mom’s car with a stern-faced evaluator in the passenger seat, you know you’ll need to be alert but not so excited that you make mistakes. Even if you are simultaneously sleep-deprived and full of nervous energy, you need your brain to moderate your level of arousal so that you do your best.

    Now a new study by neuroscientists at MIT’s Picower Institute for Learning and Memory might help to explain how the brain strikes that balance.

    “Human beings perform optimally at an intermediate level of alertness and arousal, where they are attending to appropriate stimuli rather than being either anxious or somnolent,” says Mriganka Sur, the Paul and Lilah E. Newton Professor in the Department of Brain and Cognitive Sciences. “But how does this come about?”

    Postdoc Vincent Breton-Provencher brought this question to the lab and led the study published Jan. 14 in Nature Neuroscience. In a series of experiments in mice, he shows how connections from around the mammalian brain stimulate two key cell types in a region called the locus coeruleus (LC) to moderate arousal in two different ways. A region particularly involved in exerting one means of this calming influence, the prefrontal cortex, is a center of executive function, which suggests there may indeed be a circuit for the brain to attempt conscious control of arousal.

    “We know, and a mouse knows, too, that to counter anxiety or excessive arousal one needs a higher level cognitive input,” says Sur, the study’s senior author.

    By explaining more about how the brain keeps arousal in check, Sur says, the study might also provide insight into the neural mechanisms contributing to anxiety or chronic stress, in which arousal appears insufficiently controlled. It might also provide greater mechanistic understanding of why cognitive behavioral therapy can help patients manage anxiety, Sur adds.

    Crucial characters in the story are neurons that release the neurotransmitter GABA, which has an inhibitory effect on the activity of receiving neurons. Before this study, according to Breton-Provencher and Sur, no one had ever studied the location and function of these neurons in the LC, which neurons connect to them, and how they might inhibit arousal. But because Breton-Provencher came to the Sur lab curious about how arousal is managed, he was destined to learn much about LC-GABA neurons.

    One of the first things he observed was that LC-GABA neurons were located within the LC in close proximity to neurons that release noradrenaline (NA), which stimulates arousal. He was also able to show that the LC-GABA neurons connect to the LC-NA neurons. This suggested that GABA may inhibit NA release.

    Breton-Provencher tested this directly by making a series of measurements in mice. Watching the LC work under a two-photon microscope, he observed, as expected, that LC-NA neuron activity precedes arousal, which was indicated by the pupil size of the mice — the more excited the mouse, the wider the pupil. He was even able to take direct control of this by engineering LC-NA cells to be controlled with pulses of light, a technique called optogenetics. He also took over LC-GABA neurons this way and observed that if he cranked those up, then he could suppress arousal, and therefore pupil size.

    The next question was which cells in which regions of the brain provide input to these LC cells. Using neural circuit tracing techniques, Breton-Provencher saw that cells in nearly 50 regions connected into the LC cells, and most of them connected to both the LC-NA and the LC-GABA neurons. But there were variations in the extent of overlap that turned out to be crucial.

    Breton-Provencher continued his work by exposing mice to arousal-inducing beeps of sound, while he watched activity among the cells in the LC. Making detailed measurements of the correlation between neural activity and arousal, he was able to see that the LC is actually home to two different kinds of inhibitory control.

    One type came about from those inputs — for instance from sensory processing circuits — that simultaneously connected into LC-GABA and LC-NA neurons. In that case, optogenetically inducing LC-GABA activity would moderate the mouse’s pupil dilation response to the loudness of the stimulating beep. The other type came about from inputs, notably including from the prefrontal cortex, that only connected into LC-GABA, but not LC-NA neurons. In that case, LC-GABA activity correlated with an overall reduced amount of arousal, independent of how startling the individual beeps were.

    In other words, input into both LC-NA and LC-GABA neurons by simultaneous connections kept arousal in check during a specific stimulus, while input just to LC-GABA neurons maintained a more general level of calm.

    In new research, Sur and Breton-Provencher say they are interested in examining the activity of LC-NA cells in other behavioral situations. They are also curious to learn whether early life stress in mouse models affects the development of the LC’s arousal control circuitry such that individuals could become at greater risk for chronic stress in adulthood.

    The study was funded by the National Institutes of Health, postdoctoral fellowship funding from the Fonds de recherche du Québec, the Natural Sciences and Engineering Research Council of Canada, and the JPB Foundation.


    How the brain distinguishes between objects

    Study shows that a brain region called the inferotemporal cortex is key to differentiating bears from chairs.

    As visual information flows into the brain through the retina, the visual cortex transforms the sensory input into coherent perceptions. Neuroscientists have long hypothesized that a part of the visual cortex called the inferotemporal (IT) cortex is necessary for the key task of recognizing individual objects, but the evidence has been inconclusive.

    In a new study, MIT neuroscientists have found clear evidence that the IT cortex is indeed required for object recognition; they also found that subsets of this region are responsible for distinguishing different objects.

    In addition, the researchers have developed computational models that describe how these neurons transform visual input into a mental representation of an object. They hope such models will eventually help guide the development of brain-machine interfaces (BMIs) that could be used for applications such as generating images in the mind of a blind person.

    “We don’t know if that will be possible yet, but this is a step on the pathway toward those kinds of applications that we’re thinking about,” says James DiCarlo, the head of MIT’s Department of Brain and Cognitive Sciences, a member of the McGovern Institute for Brain Research, and the senior author of the new study.

    Rishi Rajalingham, a postdoc at the McGovern Institute, is the lead author of the paper, which appears in the March 13 issue of Neuron.


    Distinguishing objects

    In addition to its hypothesized role in object recognition, the IT cortex also contains “patches” of neurons that respond preferentially to faces. Beginning in the 1960s, neuroscientists discovered that damage to the IT cortex could produce impairments in recognizing non-face objects, but it has been difficult to determine precisely how important the IT cortex is for this task.

    The MIT team set out to find more definitive evidence for the IT cortex’s role in object recognition, by selectively shutting off neural activity in very small areas of the cortex and then measuring how the disruption affected an object discrimination task. In animals that had been trained to distinguish between objects such as elephants, bears, and chairs, they used a drug called muscimol to temporarily turn off subregions about 2 millimeters in diameter. Each of these subregions represents about 5 percent of the entire IT cortex.

    These experiments, which represent the first time that researchers have been able to silence such small regions of IT cortex while measuring behavior over many object discriminations, revealed that the IT cortex is not only necessary for distinguishing between objects, but it is also divided into areas that handle different elements of object recognition.  

    The researchers found that silencing each of these tiny patches produced distinctive impairments in the animals’ ability to distinguish between certain objects. For example, one subregion might be involved in distinguishing chairs from cars, but not chairs from dogs. Each region was involved in 25 to 30 percent of the tasks that the researchers tested, and regions that were closer to each other tended to have more overlap between their functions, while regions far away from each other had little overlap.

    “We might have thought of it as a sea of neurons that are completely mixed together, except for these islands of ‘face patches.’ But what we’re finding, which many other studies had pointed to, is that there is large-scale organization over the entire region,” Rajalingham says.

    The features that each of these regions respond to are difficult to classify, the researchers say. The regions are not specific to objects such as dogs, nor to easy-to-describe visual features such as curved lines.

    “It would be incorrect to say that because we observed a deficit in distinguishing cars when a certain neuron was inhibited, this is a ‘car neuron,’” Rajalingham says. “Instead, the cell is responding to a feature that we can’t explain that is useful for car discriminations. There has been work in this lab and others that suggests that the neurons are responding to complicated nonlinear features of the input image. You can’t say it’s a curve, or a straight line, or a face, but it’s a visual feature that is especially helpful in supporting that particular task.”

    Bevil Conway, a principal investigator at the National Eye Institute, says the new study makes significant progress toward answering the critical question of how neural activity in the IT cortex produces behavior.

    “The paper makes a major step in advancing our understanding of this connection, by showing that blocking activity in different small local regions of IT has a different selective deficit on visual discrimination. This work advances our knowledge not only of the causal link between neural activity and behavior but also of the functional organization of IT: How this bit of brain is laid out,” says Conway, who was not involved in the research.

    Brain-machine interface

    The experimental results were consistent with computational models that DiCarlo, Rajalingham, and others in their lab have created to try to explain how IT cortex neuron activity produces specific behaviors.

    “That is interesting not only because it says the models are good, but because it implies that we could intervene with these neurons and turn them on and off,” DiCarlo says. “With better tools, we could have very large perceptual effects and do real BMI in this space.”

    The researchers plan to continue refining their models, incorporating new experimental data from even smaller populations of neurons, in hopes of developing ways to generate visual perception in a person’s brain by activating a specific sequence of neuronal activity. Technology to deliver this kind of input to a person’s brain could lead to new strategies to help blind people see certain objects.

    “This is a step in that direction,” DiCarlo says. “It’s still a dream, but that dream someday will be supported by the models that are built up by this kind of work.”

    The research was funded by the National Eye Institute, the Office of Naval Research, and the Simons Foundation.


    Too much structured knowledge hurts creativity, study shows

    Structure organizes human activities and helps us understand the world with less effort, but it can be the killer of creativity, concludes a study from the University of Toronto's Rotman School of Management.

    While most management research has supported the idea that giving structure to information makes it easier to cope with its complexity and boosts efficiency, the paper says those benefits come as a double-edged sword.

    "A hierarchically organized information structure may also have a dark side," warns Yeun Joon Kim, a PhD student who co-authored the paper with Chen-Bo Zhong, an associate professor of organizational behaviour and human resource management at the Rotman School.

    The researchers showed in a series of experiments that participants displayed less creativity and cognitive flexibility when asked to complete tasks using categorized sets of information, compared to those asked to work with items that were not ordered in any special way. Those in the organized information group also spent less time on their tasks, suggesting reduced persistence, a key ingredient for creativity.

    The researchers ran three experiments. In two, study participants were presented with a group of nouns that were either organized into neat categories or not, and then told to make as many sentences as they could with them.

    The third experiment used LEGO® bricks. Participants were asked to make an alien out of a box of bricks organized by colour and shape or, in a scenario familiar to many parents, out of a box of unorganized bricks. Participants in the organized category were prohibited from dumping the bricks out onto a table.

    The findings may have application for leaders of multi-disciplinary teams, which tend to show inconsistent rates of innovation, perhaps because team members may continue to organize their ideas according to functional similarity, area of their expertise, or discipline.

    "We suggest people put their ideas randomly on a whiteboard and then think about some of their connections," says Kim. What those working in creative industries need to be most on guard about, the researchers say, is our tendency to categorize information rather than efficiency itself.

    Source: University of Toronto, Rotman School of Management

    Brain waves may focus attention and keep information flowing

    Studies suggest the oscillations created by nerve cell activity have roles of their own.

    We can’t see it, but brains hum with electrical activity. Brain waves created by the coordinated firing of huge collections of nerve cells pinball around the brain. The waves can ricochet from the front of the brain to the back, or from deep structures all the way to the scalp and then back again.

    Called neuronal oscillations, these signals are known to accompany certain mental states. Quiet alpha waves ripple soothingly across the brains of meditating monks. Beta waves rise and fall during intense conversational turns. Fast gamma waves accompany sharp insights. Sluggish delta rhythms lull deep sleepers, while dreamers shift into slightly quicker theta rhythms.

    Researchers have long argued over whether these waves have purpose, and what those purposes might be. Some scientists see waves as inevitable but useless by-products of the signals that really matter — messages sent by individual nerve cells. Waves are simply a consequence of collective neural behavior, and nothing more, that view holds. But a growing body of evidence suggests just the opposite: Instead of by-products of important signals, brain waves are key to how the brain operates, routing information among far-flung brain regions that need to work together.

    MIT’s Earl Miller is among the neuroscientists amassing evidence that waves are an essential part of how the brain operates. Brain oscillations deftly route information in a way that allows the brain to choose which signals in the world to pay attention to and which to ignore, his recent studies suggest.

    Other research supports this view, too. Studies on people with electrodes implanted in their brains suggest brain waves, and their interactions, help enable emotion, language, vision and more.

    When these waves are abnormal, brainpower suffers, studies find. Detailed looks at how the brain uses these waves raise the possibility of tweaking the signals with electrical nudges — interventions that could lead to therapies that can correct memory problems and mental illness, for instance. Already, early attempts have led to improvements in people’s memory.

    Types of brain waves

    Scientists are studying how oscillations generated by nerve cells affect brain function. Although the boundaries between different wave types can be fuzzy, these oscillations can be grouped by frequency.

    Gamma (30 to 80 Hz): Fast waves linked to states of high attention.

    Beta (12 to 30 Hz): May be involved in movement and in complex tasks such as memory and decision-making.

    Alpha (8 to 12 Hz): The first neuronal oscillations discovered; they appear when a relaxed person closes their eyes.

    Theta (4 to 8 Hz): May help the brain sort information essential for navigation.

    Delta (1.5 to 4 Hz): Slow waves that mark deep sleep and anesthesia.

    Graphic: C. Chang
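    The conventional bands described above can be sketched as a simple lookup. This is an illustrative snippet only; as the article notes, the boundaries between wave types are fuzzy in practice, and the cutoff values simply follow the list:

    ```python
    # Map an oscillation frequency (Hz) to its conventional band name.
    # Boundaries follow the list above; real bands overlap and are fuzzy.
    BANDS = [
        ("delta", 1.5, 4),
        ("theta", 4, 8),
        ("alpha", 8, 12),
        ("beta", 12, 30),
        ("gamma", 30, 80),
    ]

    def band_of(freq_hz):
        for name, lo, hi in BANDS:
            if lo <= freq_hz < hi:
                return name
        return "outside conventional bands"

    print(band_of(10))  # alpha
    print(band_of(40))  # gamma
    ```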

    These insights about brain waves coincide with a shift in neuroscience away from a view that reduces the brain down to the behavior of single nerve cells, or neurons. That’s like thinking of the brain as “a giant clock, and if you figure out each gear, you’ll figure out the brain,” Miller says. But “it’s not just individual neurons in a giant clock. It’s networks interacting in a very dynamic, fluid way.”

    Central to those interactions, Miller and others think, are coordinated brain waves. “The oscillations are the most powerful signal in the brain,” Miller says. “How could evolution not have taken advantage of that?”

    In three recent papers, Miller and colleagues argue that two different types of brain waves — beta and gamma — work together to selectively choose the information that makes it into working memory. Gamma waves that cycle 30 to 80 times per second (30 to 80 hertz) help coordinate information streaming in from our senses — what we feel, see and smell. In contrast, slower 12 to 30 Hz beta waves are the messages that help keep us on task by guiding the brain toward the sensory signals worth paying attention to.

    These two types of brain oscillations engage in a neural seesaw: When beta waves are strong, akin to a stereo blasting, gamma waves are weak, as if the volume had been dialed down, and vice versa. Miller and colleagues saw this push-and-pull action in the brains of monkeys with implanted electrodes as the animals completed a tricky memory task, one that required the monkeys to hold several pieces of information in their minds at the same time. The results were described January 26 in Nature Communications. “At all these complex decision points, you can see the beta and gamma doing this complex dance in a way that you’d expect if they’re controlling working memory,” Miller says.

    These two types of waves were generated in different parts of the brain, offering spatial clues about how the brain focuses itself, the researchers also found. Sensory information, organized by gamma waves, skims the superficial layers of the brain, experiments on monkeys showed. But slower, more goal-directed waves, a mix of alpha and beta waves, are deeper in the brain. And those slower, deeper waves could actually dial down the strength of the gamma waves that rippled along the outer brain. The deeper waves were selecting which sensory information to pay attention to, the researchers proposed in the Jan. 30 Proceedings of the National Academy of Sciences.

    A third paper, in the Feb. 7 Neuron, shows similar interactions between gamma and beta waves while monkeys matched patterns of dots on a computer screen. Some of the patterns were clearly different but still belonged to the same category, an easy task akin to knowing that both a dog and a cat are types of animals. Other times, the patterns were harder to classify and required more sophisticated mental work, similar to knowing that trains and bicycles are both types of transportation. Gamma waves were present when the monkeys were puzzling out an easy category. But when higher-level categorization was required, beta waves started to roll.  

    These interactions between gamma and beta waves might be how the brain solves an information overload problem, Miller suspects. Incoming sensory input constantly bombards the brain, and much of it is meaningless. The brain needs a way to figure out if it should ignore the feeling of a scratchy shirt, but pay attention to the ringing phone. These two rhythms may offer a way for “volitional control over what you think about,” Miller says, allowing a person to consciously choose what information to bring to mind. 

    Oscillations may also shape visual information as it travels through the brain, says Charles Schroeder, a neuroscientist at the Nathan S. Kline Institute for Psychiatric Research in Orangeburg, N.Y. He and colleagues are studying a different ebb and flow of oscillations from the one Miller recently described. This one probably involves a host of different kinds of waves, including theta waves, and happens in the split second when your eyes dwell on a scene — a pause that usually lasts about 200 milliseconds.

    When you look at a scene, the first half of the time it takes to stare is spent on visual information streaming into your brain. But toward the end of that fixation time, “the information flow reverses,” Schroeder says. Different neuronal oscillations carry signals from the brain’s command center, ready to direct the eyes to the next spot. “Literally within a tenth of a second before you move your eyes, there is this incredible flash of network activity in the front of the brain, and then the eyes move,” Schroeder says. “It’s really a dramatic thing.” Schroeder and colleagues have caught this action in monkeys’ brains, and more recently, in people implanted with electrodes as part of epilepsy treatment.

    But some vision researchers still dismiss these oscillations as noise, convinced that the activity of single neurons — and not the collective waves that result from that activity — is the key to understanding the brain, Schroeder says. “It’s still difficult to convince people that brain oscillations are functional.”

    Researchers may argue over the function of brain waves for years to come, says neuroscientist and neurologist Robert Knight of the University of California, Berkeley. He believes that information, at its core, is held in the signals zipped off by neurons. But work from his lab has convinced him that oscillations help those signals reach the right spot, connecting brain areas in important ways. “You’ve got to have a way to get brain areas communicating,” he says. “What oscillations do is provide a routing mechanism.”

    And oscillations do this quickly, he says. Human brains are incredibly fast. “We’re handling massive amounts of information in subsecond time periods,” Knight says. “And you have to have some way to shape it, to control it.” Waves, he says, give the brain a way to tune out extraneous information by temporarily shutting down unnecessary communication lines.

    Knight and colleagues recently spotted fast gamma waves at work as people did a wide range of tasks, including repeating words, answering questions about themselves and distinguishing male faces from female. A certain gamma wave pattern seemed to predict when people would get the right answer on these tasks, the team reported in December 2017 in Nature Human Behaviour. Gamma waves, the team suspects, link up areas of the brain that are needed to turn goals into action.

    If oscillations are crucial information routers in the brain, then changing them might be beneficial when information is distorted or lost. Altered oscillations have been observed in disorders such as autism, Parkinson’s disease, depression and anxiety, and even in normal aging.

    A study published February 6 in Nature Communications hints at the potential of tweaking these rhythms. Youssef Ezzyat, a neuroscientist at the University of Pennsylvania, and colleagues studied memory abilities in 25 people who had electrodes implanted in their brains as part of their epilepsy treatment.

    As the researchers gave the people lists of words to remember, electrodes monitored neural oscillations. A computer algorithm then figured out which assortment of brain waves indicated when a person was likely to remember the word, an assortment that varied slightly from person to person.

    When those good performance signals were missing, the researchers delivered a short burst of electricity to the brain — “a bit of a nudge to course correct,” Ezzyat says. And these nudges improved performance.
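    The closed-loop procedure described above (monitor brain waves, detect when the good-performance signal is missing, then stimulate) can be sketched as follows. This is a hypothetical illustration, not the team's actual algorithm: the function names, feature format, and threshold are all invented for the sketch:

    ```python
    # Hypothetical sketch of closed-loop stimulation: a per-person
    # classifier scores each monitoring window, and a missing
    # "good performance" signal triggers a brief corrective nudge.
    # Names and the threshold are illustrative, not from the study.

    def classify_recall_probability(features):
        """Stand-in for the per-person classifier trained on the
        brain-wave patterns recorded while words were studied."""
        return features.get("score", 0.5)  # e.g. a score in [0, 1]

    def closed_loop_step(features, threshold=0.4):
        """Decide the action for one monitoring window."""
        p = classify_recall_probability(features)
        if p < threshold:
            return "stimulate"  # good-performance signal missing
        return "observe"        # encoding looks good; do nothing

    print(closed_loop_step({"score": 0.2}))  # stimulate
    print(closed_loop_step({"score": 0.9}))  # observe
    ```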

    Specifically manipulating brain waves to treat brains is a long way off, Ezzyat cautions. But he and colleagues are making progress. In the meantime, his results and others’ are powerful signs that brain waves aren’t just an idle hum.


    Citations

    A. Wutz et al. Different levels of category abstraction by different dynamics in different prefrontal areas. Neuron. Vol. 97, February 7, 2018, p. 1. doi: 10.1016/j.neuron.2018.01.009.

    Y. Ezzyat et al. Closed-loop stimulation of temporal cortex rescues functional networks and improves memory. Nature Communications. Published online February 6, 2018. doi: 10.1038/s41467-017-02753-0.

    M. Lundqvist et al. Gamma and beta bursts during working memory readout suggest roles in its volitional control. Nature Communications. Published online January 26, 2018. doi: 10.1038/s41467-017-02791-8.

    A. Bastos et al. Laminar recordings in frontal cortex suggest distinct layers for maintenance and control of working memory. Proceedings of the National Academy of Sciences. Vol. 115, January 30, 2018, p. 1117. doi: 10.1073/pnas.1710323115.

    M. Haller et al. Persistent neuronal activity in human prefrontal cortex links perception and action. Nature Human Behaviour. Published online December 18, 2017. doi: 10.1038/s41562-017-0267-2.

     

    The Simple Reason Exercise Enhances Your Brain

    Evidence keeps mounting that exercise is good for the brain. It can lower a person's risk for Alzheimer's disease and may even slow brain aging by about 10 years. Now, new research helps illuminate how, exactly, working out improves brain health.

    In one research review published in the British Journal of Sports Medicine, researchers examined 39 studies that looked at the link between exercise and cognitive abilities among people over age 50. They found that aerobic exercise appears to improve a person’s cognitive function, and that resistance training can enhance a person’s executive function and memory. Other exercises like tai chi were also linked to improvements in cognition, though there wasn’t as much available evidence. Ultimately, the researchers concluded that 45 minutes to an hour of moderate-to-vigorous exercise was good for the brain.

    “There is now a wide body of research showing that the benefits to the body with exercise also exist for the brain,” says study author Joe Northey, a PhD candidate at the University of Canberra Research Institute for Sport and Exercise in Australia. “When older adults undertake aerobic or resistance exercise, we see changes to the structure and function of areas of the brain responsible for complex mental tasks and memory function.”

    But how does exercise have these effects? Another new study presented at the American Physiological Society’s annual meeting in Chicago explored one possible way. In the study, researchers from New Mexico Highlands University found that when people walk, the pressure of making impact with the ground sends waves through the arteries, which increase blood flow to the brain (also called cerebral blood flow). Getting enough blood to the brain is important for healthy brain function, since blood flow brings the brain oxygen and nutrients.

    In the small study—which has not yet been published—researchers used ultrasounds to assess arteries and changes in cerebral blood flow in 12 healthy young adults while they were standing, walking and running. The increases in blood flow were greater when the men and women ran, but walking was enough to spur the effect. “[Increased cerebral blood flow] gives the brain more to work with,” says study author Ernest R. Greene, a professor of engineering and biology at New Mexico Highlands University. “It’s another positive aspect of exercise.”

    Scientists are still exploring multiple ways by which fitness improves the brain. But blood flow is a promising path, since it can also help create new brain cells. The protein BDNF (brain-derived neurotrophic factor) also seems to play a role because it helps repair and protect brain cells from degeneration. Exercise can also boost mood by triggering the release of feel-good hormones and chemicals, like endorphins, which can improve brain health. A 2015 study found that exercise may be able to prevent the onset of depressive symptoms. 

    “Each type of exercise seems to have different effects on the growth factors responsible for the growth of new neurons and blood vessels in the brain,” says Northey. “That may indicate why doing both aerobic and resistance training is of benefit to cognitive function.”

    By: Alexandra Sifferlin
