
    News

    Everything You Need to Know about Functional Supplements

    Functional supplements are taken to fill a specific nutrient deficiency or to support a distinct area of health. They come in many forms, including detox, digestion, focus, immune support, and sleep supplements. Potential benefits may include a lower risk of certain medical conditions, overall improvements to health, and better brain function.* When taking functional supplements, follow the directions on the package and research any ingredients on the label that you are unsure of. Take time to learn about functional supplements and how they may benefit your health.

    Brain study finds circuits that may help you keep your cool

    This confocal microscopy image of the locus coeruleus region of the mouse brain displays noradrenergic neurons in red and GABAergic neurons in cyan. A noradrenergic neuron recorded in the study is highlighted in white.

    The big day has come: You are taking your road test to get your driver’s license. As you start your mom’s car with a stern-faced evaluator in the passenger seat, you know you’ll need to be alert but not so excited that you make mistakes. Even if you are simultaneously sleep-deprived and full of nervous energy, you need your brain to moderate your level of arousal so that you do your best.

    Now a new study by neuroscientists at MIT’s Picower Institute for Learning and Memory might help to explain how the brain strikes that balance.

    “Human beings perform optimally at an intermediate level of alertness and arousal, where they are attending to appropriate stimuli rather than being either anxious or somnolent,” says Mriganka Sur, the Paul and Lilah E. Newton Professor in the Department of Brain and Cognitive Sciences. “But how does this come about?”

    Postdoc Vincent Breton-Provencher brought this question to the lab and led the study published Jan. 14 in Nature Neuroscience. In a series of experiments in mice, he shows how connections from around the mammalian brain stimulate two key cell types in a region called the locus coeruleus (LC) to moderate arousal in two different ways. A region particularly involved in exerting one means of this calming influence, the prefrontal cortex, is a center of executive function, which suggests there may indeed be a circuit for the brain to attempt conscious control of arousal.

    “We know, and a mouse knows, too, that to counter anxiety or excessive arousal one needs a higher level cognitive input,” says Sur, the study’s senior author.

    By explaining more about how the brain keeps arousal in check, Sur says, the study might also provide insight into the neural mechanisms contributing to anxiety or chronic stress, in which arousal appears insufficiently controlled. It might also offer a greater mechanistic understanding of why cognitive behavioral therapy can help patients manage anxiety, Sur adds.

    Crucial characters in the story are neurons that release the neurotransmitter GABA, which has an inhibitory effect on the activity of receiving neurons. Before this study, according to Breton-Provencher and Sur, no one had ever studied the location and function of these neurons in the LC, which neurons connect to them, and how they might inhibit arousal. But because Breton-Provencher came to the Sur lab curious about how arousal is managed, he was destined to learn much about LC-GABA neurons.

    One of the first things he observed was that LC-GABA neurons were located within the LC in close proximity to neurons that release noradrenaline (NA), which stimulates arousal. He was also able to show that the LC-GABA neurons connect to the LC-NA neurons. This suggested that GABA may inhibit NA release.

    Breton-Provencher tested this directly by making a series of measurements in mice. Watching the LC work under a two-photon microscope, he observed, as expected, that LC-NA neuron activity precedes arousal, which was indicated by the pupil size of the mice — the more excited the mouse, the wider the pupil. He was even able to take direct control of this by engineering LC-NA cells to be controlled with pulses of light, a technique called optogenetics. He also took over LC-GABA neurons this way and observed that if he cranked those up, then he could suppress arousal, and therefore pupil size.
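    To make the kind of analysis described above concrete, here is a minimal sketch of a lag analysis on simulated traces. Nothing below comes from the study itself: the sampling rate, the half-second lag, and the toy signals are all assumptions made for illustration. The point is only to show how cross-correlating a neural activity trace with a pupil-size trace can reveal that one precedes the other.

        import numpy as np

        rng = np.random.default_rng(1)
        fs = 30                                    # assumed sampling rate (samples/s)
        t = np.arange(0, 60, 1 / fs)               # one minute of simulated recording

        # Toy LC-NA activity trace and a pupil trace that lags it by ~0.5 s.
        lc_na = rng.poisson(2.0, size=t.size).astype(float)
        lag_samples = int(0.5 * fs)
        pupil = np.roll(lc_na, lag_samples) + rng.normal(0, 0.2, size=t.size)

        # Cross-correlate z-scored traces; a peak at a positive lag means
        # LC-NA activity changes before pupil size does.
        a = (lc_na - lc_na.mean()) / lc_na.std()
        b = (pupil - pupil.mean()) / pupil.std()
        xcorr = np.correlate(b, a, mode="full") / t.size
        lags = np.arange(-t.size + 1, t.size) / fs
        print(f"pupil changes lag LC-NA activity by {lags[xcorr.argmax()]:.2f} s")

    On this simulated data the peak falls at the built-in half-second lag; with real recordings, the same calculation asks whether neural activity leads the pupil response, as the study reports.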

    The next question was which cells in which regions of the brain provide input to these LC cells. Using neural circuit tracing techniques, Breton-Provencher saw that cells in nearly 50 regions connected into the LC cells, and most of them connected to both the LC-NA and the LC-GABA neurons. But there were variations in the extent of overlap that turned out to be crucial.

    Breton-Provencher continued his work by exposing mice to arousal-inducing beeps of sound, while he watched activity among the cells in the LC. Making detailed measurements of the correlation between neural activity and arousal, he was able to see that the LC is actually home to two different kinds of inhibitory control.

    One type came about from those inputs — for instance from sensory processing circuits — that simultaneously connected into LC-GABA and LC-NA neurons. In that case, optogenetically inducing LC-GABA activity would moderate the mouse’s pupil dilation response to the loudness of the stimulating beep. The other type came about from inputs, notably including from the prefrontal cortex, that only connected into LC-GABA, but not LC-NA neurons. In that case, LC-GABA activity correlated with an overall reduced amount of arousal, independent of how startling the individual beeps were.

    In other words, input into both LC-NA and LC-GABA neurons by simultaneous connections kept arousal in check during a specific stimulus, while input just to LC-GABA neurons maintained a more general level of calm.

    In new research, Sur and Breton-Provencher say they are interested in examining the activity of LC-NA cells in other behavioral situations. They are also curious to learn whether early life stress in mouse models affects the development of the LC’s arousal control circuitry such that individuals could become at greater risk for chronic stress in adulthood.

    The study was funded by the National Institutes of Health, postdoctoral fellowship funding from the Fonds de recherche du Québec, the Natural Sciences and Engineering Research Council of Canada, and the JPB Foundation.


    How the brain distinguishes between objects

    Study shows that a brain region called the inferotemporal cortex is key to differentiating bears from chairs.

    As visual information flows into the brain through the retina, the visual cortex transforms the sensory input into coherent perceptions. Neuroscientists have long hypothesized that a part of the visual cortex called the inferotemporal (IT) cortex is necessary for the key task of recognizing individual objects, but the evidence has been inconclusive.

    In a new study, MIT neuroscientists have found clear evidence that the IT cortex is indeed required for object recognition; they also found that subsets of this region are responsible for distinguishing different objects.

    In addition, the researchers have developed computational models that describe how these neurons transform visual input into a mental representation of an object. They hope such models will eventually help guide the development of brain-machine interfaces (BMIs) that could be used for applications such as generating images in the mind of a blind person.

    “We don’t know if that will be possible yet, but this is a step on the pathway toward those kinds of applications that we’re thinking about,” says James DiCarlo, the head of MIT’s Department of Brain and Cognitive Sciences, a member of the McGovern Institute for Brain Research, and the senior author of the new study.

    Rishi Rajalingham, a postdoc at the McGovern Institute, is the lead author of the paper, which appears in the March 13 issue of Neuron.


    Distinguishing objects

    In addition to its hypothesized role in object recognition, the IT cortex also contains “patches” of neurons that respond preferentially to faces. Beginning in the 1960s, neuroscientists discovered that damage to the IT cortex could produce impairments in recognizing non-face objects, but it has been difficult to determine precisely how important the IT cortex is for this task.

    The MIT team set out to find more definitive evidence for the IT cortex’s role in object recognition, by selectively shutting off neural activity in very small areas of the cortex and then measuring how the disruption affected an object discrimination task. In animals that had been trained to distinguish between objects such as elephants, bears, and chairs, they used a drug called muscimol to temporarily turn off subregions about 2 millimeters in diameter. Each of these subregions represents about 5 percent of the entire IT cortex.

    These experiments, which represent the first time that researchers have been able to silence such small regions of IT cortex while measuring behavior over many object discriminations, revealed that the IT cortex is not only necessary for distinguishing between objects, but it is also divided into areas that handle different elements of object recognition.  

    The researchers found that silencing each of these tiny patches produced distinctive impairments in the animals’ ability to distinguish between certain objects. For example, one subregion might be involved in distinguishing chairs from cars, but not chairs from dogs. Each region was involved in 25 to 30 percent of the tasks that the researchers tested, and regions that were closer to each other tended to have more overlap between their functions, while regions far away from each other had little overlap.
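    As a purely hypothetical illustration of that kind of comparison (simulated data, not the study's measurements), one could summarize each silenced subregion by the set of tasks it impaired and then ask whether nearby subregions impair more of the same tasks:

        import numpy as np

        rng = np.random.default_rng(0)
        n_regions, n_tasks = 10, 40

        # deficits[i, j] = True if silencing subregion i impaired task j (simulated)
        deficits = rng.random((n_regions, n_tasks)) < 0.27      # ~25-30% of tasks each
        positions = rng.random((n_regions, 2)) * 8.0            # locations on IT, in mm

        overlaps, distances = [], []
        for i in range(n_regions):
            for j in range(i + 1, n_regions):
                shared = np.logical_and(deficits[i], deficits[j]).sum()
                union = np.logical_or(deficits[i], deficits[j]).sum()
                overlaps.append(shared / union if union else 0.0)   # Jaccard overlap
                distances.append(np.linalg.norm(positions[i] - positions[j]))

        # The finding described above corresponds to a negative relationship:
        # the closer two subregions are, the more their deficit patterns overlap.
        r = np.corrcoef(distances, overlaps)[0, 1]
        print(f"distance vs. functional overlap, r = {r:.2f}")

    Because the data here are random, the printed correlation hovers near zero; the study's observation would show up as a clearly negative value.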

    “We might have thought of it as a sea of neurons that are completely mixed together, except for these islands of ‘face patches.’ But what we’re finding, which many other studies had pointed to, is that there is large-scale organization over the entire region,” Rajalingham says.

    The features that each of these regions responds to are difficult to classify, the researchers say. The regions are not specific to objects such as dogs, nor to easy-to-describe visual features such as curved lines.

    “It would be incorrect to say that because we observed a deficit in distinguishing cars when a certain neuron was inhibited, this is a ‘car neuron,’” Rajalingham says. “Instead, the cell is responding to a feature that we can’t explain that is useful for car discriminations. There has been work in this lab and others that suggests that the neurons are responding to complicated nonlinear features of the input image. You can’t say it’s a curve, or a straight line, or a face, but it’s a visual feature that is especially helpful in supporting that particular task.”

    Bevil Conway, a principal investigator at the National Eye Institute, says the new study makes significant progress toward answering the critical question of how neural activity in the IT cortex produces behavior.

    “The paper makes a major step in advancing our understanding of this connection, by showing that blocking activity in different small local regions of IT has a different selective deficit on visual discrimination. This work advances our knowledge not only of the causal link between neural activity and behavior but also of the functional organization of IT: How this bit of brain is laid out,” says Conway, who was not involved in the research.

    Brain-machine interface

    The experimental results were consistent with computational models that DiCarlo, Rajalingham, and others in their lab have created to try to explain how IT cortex neuron activity produces specific behaviors.

    “That is interesting not only because it says the models are good, but because it implies that we could intervene with these neurons and turn them on and off,” DiCarlo says. “With better tools, we could have very large perceptual effects and do real BMI in this space.”

    The researchers plan to continue refining their models, incorporating new experimental data from even smaller populations of neurons, in hopes of developing ways to generate visual perception in a person’s brain by activating a specific sequence of neuronal activity. Technology to deliver this kind of input to a person’s brain could lead to new strategies to help blind people see certain objects.

    “This is a step in that direction,” DiCarlo says. “It’s still a dream, but that dream someday will be supported by the models that are built up by this kind of work.”

    The research was funded by the National Eye Institute, the Office of Naval Research, and the Simons Foundation.


    A Look at Nootropics

    In general, people will always want to improve themselves, whether by becoming faster, smarter, fitter, or better in some other capacity. Because of that, many products are released every year claiming to deliver these improvements. One category that is new to many people is Nootropics, which have quickly become popular among those looking to boost their mental performance.

    Becoming popular in this market is quite an accomplishment: the entire market was valued at $2.3 billion in 2015 and is projected to be worth between $11 billion and $12 billion by 2024. Even with their growing popularity, many people are still not well informed about the background of Nootropics and their benefits. Taking the time to study Nootropics may help you see why so many people are interested in how they may work to improve mental acuity, which touches many areas of life.
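    For a sense of scale, a quick back-of-the-envelope calculation (assuming those figures describe 2015 and 2024, as quoted above) shows the average annual growth rate such a projection implies:

        # Implied compound annual growth rate for the market figures quoted above.
        value_2015 = 2.3e9
        years = 2024 - 2015
        for projected_2024 in (11e9, 12e9):
            cagr = (projected_2024 / value_2015) ** (1 / years) - 1
            print(f"${projected_2024 / 1e9:.0f}B by 2024 implies ~{cagr:.0%} growth per year")

    That works out to roughly 19 to 20 percent compound growth per year.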



    What Are Nootropics?

    Nootropics come in a variety of forms, including caffeine, prescription medications, and supplements. They are used by people who wish to improve various aspects of their cognitive function, and because Nootropics work in numerous ways, people take them for a number of reasons. For example, students are interested in them as a study aid, while business people are interested in how they may help them focus on their work responsibilities. Some Nootropics are also used medically, for example in treating ADD and ADHD.

    Over the past few years, Nootropics have been known by a number of different names, including “smart pills” and “limitless pills,” and they have become increasingly popular among people looking to boost cognitive function over both short and long periods of time.

    The Benefits of Nootropics

    Aside from their use for specific disorders, Nootropics offer a host of potential benefits, which is why their popularity has grown so much in recent years.


    1. Focus

    One of the key reasons people take Nootropics is to help with focus. Similar to coffee, they can stimulate alertness, and many people report faster, more in-depth learning than they experienced before taking a Nootropic.

    2. Mood Enhancer

    While this may apply only to specific types of Nootropics, many people take them to try to enhance their mood and reduce anxiety.

    3. Increased Energy Levels

    Similar to the effects of caffeine, Nootropics are often taken by people who want to increase their mental energy and stay concentrated on a task.

    4. Reasoning & Problem Solving

    An even smaller subset of Nootropics is taken specifically to support reasoning, general problem-solving skills, and complex tasks.


    The Side Effects of Nootropics

    Side effects do not affect the majority of users, but they should be considered before use, and when in doubt people should consult their health professional about whether a Nootropic may be right for them. Underlying health conditions might also be a factor in whether side effects result from taking Nootropics.


    With the potential benefits Nootropics may offer, it is easy to see why they have become popular among those looking to increase cognitive performance in their everyday activities.

    Please note: The content of this blog is not written by a medical professional and must not be relied upon as medical advice. This blog does not make any guarantees or promises regarding the accuracy, reliability, or completeness of the information presented, and it is in no way a substitute for professional advice. A blog has ever-changing content and can include conversations and comments. Blogs on this website are not official statements on behalf of the company; this blog contains opinions and does not reflect the opinions of any organization. The information included in this blog is accurate and true to the best of the blogger's knowledge, but there may be omissions, errors, or mistakes, and readers who rely on any information in this blog do so at their own risk. The statements in this blog have not been evaluated by the Food and Drug Administration. Products are not intended to diagnose, treat, cure, or prevent any disease. If you have any concerns about your own health, you should always consult a physician or other health care professional. The information presented in this blog is for entertainment and/or informational purposes only and should not be taken as any kind of advice, whether medical, legal, tax, emotional, or otherwise.

    Too much structured knowledge hurts creativity, study shows

    Structure organizes human activities and helps us understand the world with less effort, but it can also be the killer of creativity, concludes a study from the University of Toronto's Rotman School of Management.

    While most management research has supported the idea that giving structure to information makes its complexity easier to cope with and boosts efficiency, the paper says that benefit comes as a double-edged sword.

    "A hierarchically organized information structure may also have a dark side," warns Yeun Joon Kim, a PhD student who co-authored the paper with Chen-Bo Zhong, an associate professor of organizational behaviour and human resource management at the Rotman School.

    The researchers showed in a series of experiments that participants displayed less creativity and cognitive flexibility when asked to complete tasks using categorized sets of information, compared to those asked to work with items that were not ordered in any special way. Those in the organized information group also spent less time on their tasks, suggesting reduced persistence, a key ingredient for creativity.

    The researchers ran three experiments. In two, study participants were presented with a group of nouns that were either organized into neat categories or not, and then told to make as many sentences as they could with them.

    The third experiment used LEGO® bricks. Participants were asked to make an alien out of a box of bricks organized by colour and shape or, in a scenario familiar to many parents, out of a box of unorganized bricks. Participants in the organized category were prohibited from dumping the bricks out onto a table.

    The findings may have application for leaders of multi-disciplinary teams, which tend to show inconsistent rates of innovation, perhaps because team members may continue to organize their ideas according to functional similarity, area of their expertise, or discipline.

    "We suggest people put their ideas randomly on a white board and then think about some of their connections," says Kim. Our tendency to categorize information rather than efficiency itself is what those working in creative industries need to be most on guard about, the researchers say.

    Source: University of Toronto, Rotman School of Management
