<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/study-identifies-creativity-and-resilience-as-positive-aspects-of-adhd-diagnosis/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Study identifies creativity and resilience as positive aspects of ADHD diagnosis</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 22nd 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Research published in <em><a href="https://bmjopen.bmj.com/content/13/10/e072052" target="_blank">BMJ Open</a></em> suggests that adults with attention deficit hyperactivity disorder often perceive specific personal strengths directly linked to their diagnosis. These positive attributes include high energy, creativity, and a unique capacity for resilience. The findings indicate that incorporating these perceived strengths into therapy could improve treatment outcomes and reduce stigma.</p>
<p>Attention deficit hyperactivity disorder is a neurodevelopmental condition. It affects how the brain processes information and regulates behavior. Medical professionals diagnose the condition based on symptoms like inattention, hyperactivity, and impulsivity. These symptoms stem from differences in brain development and chemistry.</p>
<p>Individuals with this diagnosis often have lower availability of certain neurotransmitters. Dopamine and norepinephrine are chemicals that allow brain cells to communicate. They play major roles in executive function. This includes planning, prioritizing, and controlling impulses. When these chemicals do not function typically, people struggle to sustain attention or manage their actions.</p>
<p>Historically, medical literature has viewed this condition through a deficit model. This perspective focuses almost exclusively on impairments and negative outcomes. It highlights challenges in education, employment, and relationships. This deficit-oriented view can contribute to public stigma. It may also lower self-esteem in those with the diagnosis.</p>
<p>A growing movement known as neurodiversity offers an alternative framework. This perspective views neurological differences as natural variations in the human genome rather than defects. Proponents argue that these differences can benefit society. They suggest that traits associated with the condition might have offered evolutionary advantages in the past.</p>
<p>Researchers at the University of Bergen in Norway sought to explore this positive perspective. They aimed to identify the specific benefits adults with the diagnosis experience in their daily lives. The research team included Emilie S. Nordby, Frode Guribye, Tine Nordgreen, and Astri J. Lundervold. They recognized a gap in existing literature, which heavily favors clinical samples of children and focuses on dysfunction.</p>
<p>The team recruited 50 adults who were already participating in an online intervention trial. These individuals were seeking help for their symptoms. The participants were mostly women who had received their diagnoses in adulthood. This demographic is significant because women are often diagnosed later in life than men.</p>
<p>The study employed a qualitative survey design. Participants responded to open-ended questions about the positive aspects of their condition. The researchers then used thematic analysis to interpret these written responses. This method involves identifying, analyzing, and reporting patterns or themes within the data.</p>
<p>The analysis revealed four primary themes regarding the positive experiences of living with the condition. The first theme identified was the dual impact of specific characteristics. Participants described how core symptoms could be both a hindrance and a help. The context determined whether a trait was positive or negative.</p>
<p>High energy levels served as a prime example of this duality. Many respondents reported that their energy allowed them to accomplish significant amounts of work. One participant noted, “I am active. I am often able to do a lot in a short amount of time, and then I get to experience more.” This endurance proved beneficial for physical labor, sports, and demanding projects.</p>
<p>Hyperfocus was another frequently mentioned double-edged trait. This state involves intense concentration on a specific task to the exclusion of everything else. Participants reported that this ability allowed them to complete educational courses and job assignments efficiently. However, they noted it was only a positive attribute when directed toward useful tasks.</p>
<p>The second theme centered on having an unconventional mind. Participants frequently cited creativity and the ability to think differently as major assets. They described themselves as solution-oriented and capable of seeing perspectives that others might miss. This out-of-the-box thinking was seen as an advantage in parenting and professional problem-solving.</p>
<p>Social nonconformity also emerged as part of this theme. Some participants felt their diagnosis allowed them to care less about societal norms. They described being straightforward and uninhibited in social situations. One woman stated, “I am pretty forward, and I am not afraid to take up space when I need a bit of attention.”</p>
<p>The third theme involved the pursuit of new experiences. Respondents described a strong drive for novelty and learning. They characterized themselves as curious, adventurous, and willing to take risks. This trait often led them to acquire knowledge across a broad range of topics.</p>
<p>This finding aligns with other emerging theories in the field. Some neuroscientists propose a “hypercuriosity” hypothesis. This theory suggests that the impulsivity seen in the condition might actually be a manifestation of an urgent need for information. In an evolutionary context, this drive to explore could have been essential for survival. In modern settings, it manifests as a desire to learn new things continuously.</p>
<p>The participants in the Norwegian study echoed this sentiment. They reported that their curiosity pushed them to seek new environments. One participant explained, “I enjoy trying new things, and if I do not get it right the first time, I will examine the possibility of trying a simpler method.” This persistence in learning was viewed as a distinct strength.</p>
<p>The fourth and final theme was resilience and growth. Participants described a process of personal development resulting from their struggles. Coping with the challenges of the condition fostered a sense of resilience. They felt better equipped to handle adversity because they had spent a lifetime managing difficulties.</p>
<p>This theme also encompassed increased empathy. Many participants felt that their own struggles helped them understand others better. This was particularly true for those working in education or healthcare. They reported an ability to connect with students or patients who faced similar challenges. One teacher noted, “As a teacher, ADHD helps me to understand students that have a learning disability.”</p>
<p>The process of receiving a diagnosis also contributed to this growth. For many, the diagnosis provided an explanation for lifelong difficulties. This understanding allowed them to practice self-acceptance. One participant shared that after the diagnosis, “I could with good reasons lower the expectations to myself and finally rest with a clear conscience.”</p>
<p>Despite these positive findings, the study has limitations. The sample consisted largely of high-functioning women. The experiences of men or those with more severe impairments might differ. The participants were also self-selected from a group already seeking treatment. This could influence their perceptions of the condition.</p>
<p>The researchers caution against romanticizing the disorder. The condition is associated with significant risks. These include higher rates of accidents, substance abuse, and relationship difficulties. Acknowledging strengths does not negate these serious challenges. The goal is to present a more balanced view of the individual.</p>
<p>This study provides a foundation for future research. Subsequent studies should aim to validate these qualitative findings with quantitative data. Researchers need to investigate how these strengths manifest in different contexts. Understanding the specific environments that allow these traits to flourish is essential.</p>
<p>The study, “<a href="https://bmjopen.bmj.com/content/13/10/e072052" target="_blank">Silver linings of ADHD: a thematic analysis of adults’ positive experiences with living with ADHD</a>,” was authored by Emilie S. Nordby, Frode Guribye, Tine Nordgreen, and Astri J. Lundervold.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/pro-inflammatory-diets-linked-to-accelerated-brain-aging-in-older-adults/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Pro-inflammatory diets linked to accelerated brain aging in older adults</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 22nd 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Recent research suggests that the food we eat may influence the biological aging of our brains. A study involving over 20,000 adults indicates that consuming a diet high in pro-inflammatory foods is associated with accelerated brain aging. This effect appears to be most pronounced in older adults. The findings were published in the <em><a href="https://doi.org/10.1007/s10654-025-01318-6" target="_blank">European Journal of Epidemiology</a></em>.</p>
<p>Chronic systemic inflammation is increasingly recognized as a contributing factor to various health issues, including neurodegenerative diseases. As people age, levels of inflammatory markers in the blood typically rise. Elevated levels of these markers often correlate with a higher risk of cognitive decline and conditions such as dementia. Scientists have established that diet is a primary way to regulate inflammation in the body.</p>
<p>Certain dietary patterns, such as the Western diet, are known to promote inflammation. These diets usually contain high amounts of red meat, processed foods, and high-fat dairy products. In contrast, diets rich in vegetables, fruits, and whole grains tend to lower inflammation. While previous studies have linked pro-inflammatory diets to memory problems and specific brain changes, the impact on overall brain aging remained less clear.</p>
<p>The researchers behind this new study aimed to fill this gap in knowledge. They sought to determine if a diet that promotes inflammation is linked to a comprehensive measure of brain health known as “brain age.” They also wanted to understand if this relationship varied based on a person’s chronological age or their genetic risk for dementia. Additionally, the team investigated whether systemic inflammation in the body acted as a bridge connecting diet to brain health.</p>
<p>“There is growing evidence to suggest that diet may play an important role in brain and cognitive health, and specifically that inflammation from an unhealthy diet could be one contributing mechanism. However, whether dietary inflammation is associated with a comprehensive measure of brain health has not been previously examined,” explained study author Michelle M. Dunk, a researcher at the Aging Research Center and Division of Neurogeriatrics at the Karolinska Institutet.</p>
<p>“One way to measure someone’s overall brain health is by measuring their ‘brain age’ from MRI scans. We can then calculate the ‘brain age gap’ by comparing a person’s brain age to their chronological age. When someone’s brain looks older than their actual age, it may signal an increased risk for cognitive decline or dementia.”</p>
<p>Data for this analysis came from the UK Biobank, a large-scale biomedical database. The study included 21,473 participants who were between the ages of 40 and 70. None of these individuals had neurological disorders at the beginning of the study. This exclusion helped ensure that the results reflected typical aging rather than the progression of pre-existing brain diseases.</p>
<p>To assess dietary habits, the researchers used a web-based questionnaire called the Oxford WebQ. Participants reported their food and drink consumption from the previous twenty-four hours. This assessment was administered up to five times over a period of several years. The repeated assessments allowed for a more accurate estimation of habitual dietary intake than a single questionnaire would provide.</p>
<p>From this dietary data, the investigators calculated a Dietary Inflammatory Index score for each participant. This index is based on the intake of thirty-one specific nutrients and dietary components. The researchers looked at nutrients that either increase or decrease inflammation levels. For instance, components like fiber, omega-3 fatty acids, and certain vitamins are considered anti-inflammatory. Conversely, saturated fats and carbohydrates are often classified as pro-inflammatory.</p>
<p>Based on these scores, participants were divided into four distinct groups. Group 1 consisted of individuals with the most anti-inflammatory diets. Group 4 included those with the most pro-inflammatory diets. This categorization allowed the researchers to compare brain health outcomes across different levels of dietary quality.</p>
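<p>To make the scoring-and-grouping step concrete, here is a minimal Python sketch of how a weighted nutrient score might be turned into quartile groups. The nutrient list and weights are hypothetical stand-ins, not the published index parameters.</p>
<pre><code>from statistics import quantiles

# Hypothetical weights: negative = anti-inflammatory,
# positive = pro-inflammatory (not the actual DII values).
WEIGHTS = {"fiber": -0.6, "omega3": -0.4, "saturated_fat": 0.4}

def dii_score(intake):
    """Weighted sum of a participant's daily nutrient intakes."""
    return sum(WEIGHTS[k] * v for k, v in intake.items() if k in WEIGHTS)

participants = [
    {"fiber": 30, "omega3": 2.0, "saturated_fat": 12},
    {"fiber": 22, "omega3": 1.1, "saturated_fat": 20},
    {"fiber": 15, "omega3": 0.6, "saturated_fat": 28},
    {"fiber": 10, "omega3": 0.3, "saturated_fat": 35},
]

scores = [dii_score(p) for p in participants]
q1, q2, q3 = quantiles(scores, n=4)  # quartile cut points

def group(score):
    """Group 1 = most anti-inflammatory, Group 4 = most pro-inflammatory."""
    return 1 + (score > q1) + (score > q2) + (score > q3)

print([group(s) for s in scores])
</code></pre>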
<p>Approximately nine years after the initial dietary assessments, participants underwent magnetic resonance imaging (MRI) scans of their brains. The researchers used these scans to estimate the biological age of each participant’s brain. They employed a machine learning model to analyze 1,079 different structural and functional measures from the MRI data. This advanced technology can detect subtle patterns of aging that might be missed by traditional analysis.</p>
<p>The team then calculated a metric called the “brain age gap.” This was done by subtracting the participant’s chronological age from their estimated brain age. A positive gap indicates that the brain appears older than expected for the person’s actual age. A negative gap suggests the brain appears younger and healthier. This metric serves as a proxy for general brain integrity.</p>
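<p>The gap itself is simple arithmetic, as the small sketch below illustrates (the ages are invented for the example).</p>
<pre><code>def brain_age_gap(estimated_brain_age, chronological_age):
    """Positive = brain looks older than expected; negative = younger."""
    return estimated_brain_age - chronological_age

# A 64-year-old whose MRI-based brain age estimate is 66.5 years:
print(brain_age_gap(66.5, 64.0))  # 2.5 (years)
</code></pre>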
<p>The study also utilized blood samples collected at the beginning of the research period. The investigators measured several biomarkers of systemic inflammation, including C-reactive protein and white blood cell counts. These markers were combined into a composite inflammation score. This score allowed the team to test if inflammation in the body explained the link between diet and brain changes.</p>
<p>The results revealed a significant association between dietary habits and brain aging. Individuals who consumed the most pro-inflammatory diets had a larger brain age gap compared to those who ate anti-inflammatory diets. Specifically, those in the most pro-inflammatory group had brains that appeared about half a year older on average than those in the healthiest diet group. This suggests that poor dietary quality may accelerate the biological clock of the brain.</p>
<p>This association was dependent on the age of the participants. The link between a pro-inflammatory diet and older brain age was much stronger in adults aged 60 and older. In this older demographic, a pro-inflammatory diet was associated with an advanced brain age of nearly a full year. This implies that older adults might be more vulnerable to the negative effects of a poor diet.</p>
<p>The researchers also found that systemic inflammation played a measurable role in this relationship. Statistical analysis showed that the composite inflammation score accounted for about 8 percent of the association between diet and brain age. This finding provides evidence that pro-inflammatory foods may harm the brain in part by increasing overall inflammation in the body. The remaining effect is likely due to other mechanisms not captured by the inflammation score.</p>
<p>Genetic risk factors were also considered in the analysis. The researchers looked at polygenic risk scores for Alzheimer’s disease and the presence of the APOE4 gene. The association between diet and brain age was generally consistent regardless of genetic risk. However, the link appeared somewhat stronger in individuals who were not carriers of the APOE4 gene. This suggests that diet remains a relevant factor for brain health across different genetic profiles.</p>
<p>These findings align with previous research showing that healthy diets support cognitive function. Diets like the Mediterranean or MIND diet emphasize the same anti-inflammatory foods identified in this study. These include plant-based foods rich in polyphenols and healthy fats. The current study adds to the literature by using a global measure of brain structure rather than focusing on specific regions like the hippocampus.</p>
<p>“We found that those eating more pro-inflammatory diets had a significantly larger brain age gap – meaning they had an older, less healthy brain than would be expected based on their chronological age,” Dunk told PsyPost. “Those consuming the most pro-inflammatory diets had an advanced brain age by half a year compared to those with the most anti-inflammatory diets. This association was stronger in adults 60 years of age or older, suggesting that older adults may benefit most from an anti-inflammatory diet.”</p>
<p>“We were also able to confirm that systemic inflammation partially accounted for this association, by using a composite score of several inflammatory markers in the blood. In other words, those consuming more pro-inflammatory diets tended to have higher levels of circulating inflammatory biomarkers, which were in turn associated with older brain age. This finding suggests that an anti-inflammatory diet could potentially support brain health in part by promoting lower levels of inflammation in the body.”</p>
<p>But there are some limitations to this study. The UK Biobank participants are generally healthier and wealthier than the average population. This selection bias might affect how well the results apply to the general public. The study is also observational in nature. It can show a connection between diet and brain age but cannot definitively prove that diet causes the aging.</p>
<p>Another limitation involves the dietary assessment method. Self-reported data is subject to memory errors and may not perfectly reflect actual intake. Additionally, the Dietary Inflammatory Index focuses on nutrients rather than whole foods. People consume complex combinations of foods, not isolated nutrients. However, the index is a validated tool that generally correlates well with healthy eating patterns.</p>
<p>“The Dietary Inflammatory Index is calculated based on a person’s consumption of a variety of macronutrients, micronutrients, whole foods, and spices,” Dunk noted. “Because this index isn’t primarily based on whole foods, it might not be immediately obvious what a pro- or anti-inflammatory diet actually looks like.”</p>
<p>“Some of the anti-inflammatory components include dietary fiber (found in whole plant foods), omega 3 fats, onion, garlic, ginger, turmeric, green and black tea, isoflavones (soy), polyphenols (especially high in berries), and many vitamins and minerals. On the other hand, foods high in saturated fat, trans fat, or cholesterol tend to be pro-inflammatory.”</p>
<p>“This scoring is generally consistent with healthy dietary patterns we often hear about – like the Mediterranean, MIND, or whole food plant-based diets. Each of these dietary patterns are high in healthy plant foods (fruits, vegetables, whole grains, legumes, nuts, seeds) and limit foods like red and processed meat, deep-fried and ultraprocessed foods, and other animal-based foods to varying degrees, which are higher in saturated fat and cholesterol.”</p>
<p>Future research is needed to confirm these findings in more diverse populations. It would be beneficial to investigate whether switching to an anti-inflammatory diet can reverse accelerated brain aging. Clinical trials could provide stronger evidence of a causal relationship. For now, the results suggest that choosing anti-inflammatory foods may be a practical strategy for maintaining brain health as we age.</p>
<p>The study, “<a href="https://doi.org/10.1007/s10654-025-01318-6" target="_blank">The association between a pro-inflammatory diet and brain age in middle-aged and older adults</a>,” was authored by Michelle M. Dunk, Huijie Huang, Jiao Wang, Abigail Dove, Sakura Sakakibara, Jie Guo, Adrián Carballo-Casla, David A. Bennett, and Weili Xu.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/evidence-suggests-sex-differences-in-the-brain-are-ancient-and-evolutionary/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Evidence suggests sex differences in the brain are ancient and evolutionary</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 21st 2025, 22:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Differences between men and women in intelligence and behaviour have been <a href="https://doi.org/10.1186/s13293-022-00448-w">proposed and disputed for decades</a>.</p>
<p>Now, a growing body of scientific evidence shows hundreds of genes act differently in the brains of biologically male or female humans. What this means isn’t yet clear, though some of the genes may be linked to sex-biased brain disorders such as Alzheimer’s and Parkinson’s diseases.</p>
<p>These sex differences between male and female brains are established early in development, so they may have a role in shaping brain development. And they are found not only in humans but also in other primates, implying they are ancient.</p>
<h2>Gene activity in male and female brains</h2>
<p>Decades of research have <a href="https://doi.org/10.1186/s13293-022-00448-w">confirmed</a> differences between men and women in brain structure, function and susceptibility to mental disorders.</p>
<p>What has been less clear is how much of this is due to genes and how much to environment.</p>
<p>We can measure the influence of genetics by looking directly at the activity of genes in the brains of men and women. Now that we have the <a href="https://doi.org/10.1126/science.abj6987">full DNA sequence of the human genome</a>, it is comparatively easy to detect activity of any or all of the roughly 20,000 genes it contains.</p>
<p>Genes are lengths of DNA, and to be expressed their sequence must be copied (“transcribed”) into messenger RNA molecules (mRNA), which are then translated into proteins – the molecules that actually do the work that underpins the structure and function of the body.</p>
<p>So by sequencing all of this RNA (called the “transcriptome”) and lining up the base sequences to the known genes, we can measure the activity of every gene in a particular tissue – even an individual cell.</p>
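<p>As a toy illustration of that counting step (real pipelines align millions of reads with dedicated software; the gene labels below are invented), tallying aligned reads per gene is conceptually just:</p>
<pre><code>from collections import Counter

# Each sequenced RNA read, after alignment, maps to a known gene.
aligned_reads = ["GENE_A", "GENE_B", "GENE_A", "GENE_C", "GENE_A"]

expression = Counter(aligned_reads)
print(expression)  # Counter({'GENE_A': 3, 'GENE_B': 1, 'GENE_C': 1})

# Comparing such per-gene tallies between groups of samples is,
# in essence, how sex-biased genes are identified.
</code></pre>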
<p>When scientists compared the transcriptomes in postmortem tissue samples from hundreds of men and women in 2017, they found surprisingly different patterns of gene activity. A third of our 20,000 genes were <a href="https://theconversation.com/not-just-about-sex-throughout-our-bodies-thousands-of-genes-act-differently-in-men-and-women-86613">expressed more in one sex than the other</a> in one or several tissues.</p>
<p>The strongest sex differences were in the testes and other reproductive tissues, but, surprisingly, most other tissues also showed sex biases. For instance, a subsequent paper showed very different RNA profiles in muscle samples from men and women, which correspond to <a href="https://doi.org/10.1016/j.xgen.2025.100915">sex differences in muscle physiology</a>.</p>
<p>A <a href="https://doi.org/10.1101/2025.06.30.661781">study</a> of brain transcriptomes published earlier this year revealed 610 genes more active in male brains, and 316 more active in female brains.</p>
<h2>What genes show sex bias in the brain?</h2>
<p>Genes on the sex chromosomes would be expected to show <a href="https://theconversation.com/differences-between-men-and-women-are-more-than-the-sum-of-their-genes-39490">different activity</a> between men (with an X chromosome and a Y chromosome) and women (with two X chromosomes). However, most (90%) sex-biased genes lie on ordinary chromosomes, of which both males and females have two copies (one from mum, one from dad).</p>
<p>This means some sex-specific signal must control their activity. Sex hormones such as testosterone and oestrogen are likely candidates, and, indeed, many sex-biased genes in the brain <a href="https://doi.org/10.1007/s10571-025-01536-2">respond to sex hormones</a>.</p>
<h2>How are sex differences established in the brain?</h2>
<p>Sex differences in brain gene activity appear early in the development of the foetus, long before puberty or even the formation of testes and ovaries.</p>
<p>Another 2025 <a href="https://doi.org/10.1016/j.xgen.2025.100890">study</a> examined 266 postmortem fetal brains and found more than 1,800 genes that were more active in males and 1,300 that were more active in females. These sets of sex-biased genes overlapped with those seen in adult brains.</p>
<p>This points to direct genetic effects from genes on the sex chromosomes, rather than hormone-driven differences.</p>
<h2>Do these differences mean male and female brains work differently?</h2>
<p>It would be remarkable if sex differences in the activity of so many genes were not reflected in some major differences in brain function between men and women. But we don’t know to what extent, or which functions.</p>
<p>Some <a href="https://doi.org/10.1186/s13293-023-00515-w">patterns are emerging</a>. Many female-biased genes have been found to encode neuron-associated processes, whereas male-biased genes are more often related to cellular components such as membranes and nuclear structures.</p>
<p>Many genes are sex-biased <a href="https://doi.org/10.1186/s13293-023-00515-w">only in particular sub-regions of the brain</a>, which suggests they have a sex-specific function only in those regions.</p>
<p>However, differences in RNA levels don’t always produce differences in proteins. <a href="https://theconversation.com/whats-the-secret-of-genetic-equality-between-the-sexes-new-platypus-chromosome-research-may-hold-the-key-235214">Cells can compensate</a> to maintain protein balance, meaning that not all RNA differences have functional outcomes. Sometimes, developmental processes differ between sexes but lead to the same end result.</p>
<h2>Brain health</h2>
<p>Of particular interest is the finding of a relationship between sex biases and sex differences in the <a href="https://doi.org/10.1007/s10571-025-01536-2">susceptibility to some brain disorders</a>.</p>
<p>Many genes implicated in Alzheimer’s disease are <a href="https://doi.org/10.1186/s13293-024-00622-2">female-biased</a>, perhaps accounting for the doubled incidence of this disease in women. Studies on rodents imply that expression of the male-only SRY gene in the brain <a href="http://theconversation.com/the-sex-gene-sry-and-parkinsons-disease-how-genes-act-differently-in-male-and-female-brains-121764">exacerbates Parkinson’s disease</a>.</p>
<h2>Evolution of sex differences in brain gene function</h2>
<p>These sex-biased gene expression patterns are by no means unique to humans. They have also been found in the brains of <a href="https://doi.org/10.1186/s13293-023-00538-3">rats and mice</a> as well as in <a href="https://doi.org/10.1016/j.xgen.2024.100589">monkeys</a>.</p>
<p>The suites of male- and female-biased genes in monkeys overlap significantly with those of humans, implying that sex biases were established in a common ancestor 70 million years ago.</p>
<p>This suggests that natural selection favoured gene actions that promoted slightly different behaviours in our male and female primate ancestors – or perhaps even further back, in the ancestor of all mammals, or even all vertebrates.</p>
<p>In fact, sex differences in the expression of genes in the developing brain look to be ubiquitous in animals. They have been observed even in the <a href="https://www.science.org/doi/epdf/10.1126/sciadv.adv9106">humble nematode worm</a>.</p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/hundreds-of-genes-act-differently-in-the-brains-of-men-and-women-266352">original article</a>.</em></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-research-reveals-the-cognitive-hurdles-created-by-our-number-systems/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New research reveals the cognitive hurdles created by our number systems</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 21st 2025, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Most of us have little trouble working out how many millilitres are in 2.4 litres of water (it’s 2,400). But the same can’t be said when we’re asked how many minutes are in 2.4 hours (it’s 144).</p>
<p>That’s because the Indo-Arabic numerals we often use to represent numbers are base-10, while the system we often use to measure time is base-60.</p>
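<p>A quick Python sketch shows why one conversion feels effortless and the other does not: the metric step only shifts the decimal point, while the time step needs genuine base-60 arithmetic.</p>
<pre><code># Base-10 units: multiplying by 1000 just shifts the digits.
litres = 2.4
print(litres * 1000)  # 2400.0 millilitres

# Base-60 units: the fractional part must be multiplied out.
hours = 2.4
print(hours * 60)     # 144.0 minutes
# i.e. 2.4 hours = 2 h + 0.4 * 60 min = 2 h 24 min, not "2 h 40 min"
</code></pre>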
<p>Expressing time in decimal notation leads to <a href="https://doi.org/10.1098/rstb.2024.0225">an interaction between these two bases</a>, which can have implications at both the cognitive and cultural level.</p>
<p>Such base interactions and their consequences are among the important topics covered in a <a href="https://royalsocietypublishing.org/toc/rstb/2025/380/1937">new issue of the <em>Philosophical Transactions of the Royal Society</em> journal</a>, which I co-edited with colleagues Andrea Bender (University of Bergen), Mary Walworth (French National Centre for Scientific Research) and Simon J. Greenhill (University of Auckland).</p>
<p>The themed issue brings together work from anthropology, linguistics, philosophy and psychology to examine how humans conceptualize numbers and the numeral systems we build around them.</p>
<h2>What are bases, and why do they matter?</h2>
<p>Despite using numeral bases every day, few of us have reflected on the nature of these cognitive tools. As I explain in <a href="https://doi.org/10.1098/rstb.2024.0209">my contribution to the issue</a>, bases are special numbers in the numeral systems we use.</p>
<p>Because our memories aren’t unlimited, we can’t represent each number with its own unique label. Instead, we use a small set of numerals to build larger ones, like “three hundred forty-two.”</p>
<p>That’s why most numeral systems are structured around a <a href="https://doi.org/10.1098/rstb.2025.0321">compositional anchor</a> — a special number with a name that serves as a building block to form names for other numbers. Bases are anchors that exploit powers of a special number to form complex numerical expressions.</p>
<p>The English language, for example, uses a decimal system, meaning it uses the powers of 10 to compose numerals. So we compose “three hundred and forty-two” using three times the second power of 10 (100), four times the first power of 10 (10) and two times the zeroth power of 10 (one).</p>
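<p>A minimal sketch of this positional logic, written in Python for concreteness, recovers the digits of a number as coefficients of descending powers of the base.</p>
<pre><code>def decompose(n, base=10):
    """Digits of n as coefficients of descending powers of the base."""
    digits = []
    while n:
        digits.append(n % base)
        n //= base
    return list(reversed(digits))

print(decompose(342))      # [3, 4, 2] -> 3*10**2 + 4*10**1 + 2*10**0
print(decompose(342, 20))  # [17, 2]   -> 17*20 + 2, a base-20 reading
</code></pre>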
<p>This base structure allows us to represent numbers of all sizes without overloading our cognitive resources.</p>
<h2>Languages affect how we count</h2>
<p>Despite the abstract nature of numbers, the degree to which numeral systems transparently reflect their bases has very concrete implications — and not just when we tell time. Languages with less transparent rules will take longer to learn, longer to process and can lead to more calculation and dictation errors.</p>
<p>Take French numerals, for example. While languages like French, English and Mandarin all share the same base of 10, most dialects of French have what could politely be called a <a href="https://blog.rosettastone.com/learn-french-numbers-1-100-with-these-french-counting-tips/">quirky way of representing numbers in the 70-99 range</a>.</p>
<p>Seventy is <em>soixante-dix</em> in French, meaning “six times 10 plus 10,” while 80 uses 20 as an anchor and becomes <em>quatre-vingts</em>, meaning “four twenties” (or “four twenty,” depending on the context). And 90 is <em>quatre-vingt-dix</em>, meaning “four twenty ten.”</p>
<p>French is far from alone in its numerical quirks. <a href="https://www.thegermanproject.com/german-lessons/numbers">In German</a>, numbers from 13 to 99 are expressed with the ones before the tens (21 is <em>einundzwanzig</em>, “one-and-twenty”), but numbers over 100 switch back to saying the largest unit first.</p>
<p>Even in English, the fact that “twelve” is said instead of “ten two” hides the decimal rules at play. Such irregularities extend far beyond these few languages.</p>
<h2>How bases shape learning and thought</h2>
<p>Base-related oddities are <a href="https://doi.org/10.1098/rstb.2024.0211">spread out across the globe</a> and have very real implications for <a href="https://doi.org/10.1098/rstb.2024.0221">how easily children learn what numbers are and how they interact with objects such as blocks</a>, and for <a href="https://doi.org/10.1177/1747021819881983">how efficiently adults manipulate notations</a>.</p>
<p>For example, <a href="https://doi.org/10.1080/03004430212127">one study found</a> that lack of base transparency slows down the acquisition of some numerical abilities in children, while another found similar negative effects on how quickly they <a href="https://doi.org/10.1111/j.1467-9280.1995.tb00305.x">learn how to count</a>.</p>
<p><a href="https://doi.org/10.1016/j.cognition.2020.104331">Another study</a> found that children from base-transparent languages were quicker to use large blocks worth 10 units to represent larger numbers (for example, expressing 32 using three large blocs and two small ones) than children with base-related irregularities.</p>
<p>While Mandarin’s perfectly transparent decimal structure can simplify learning, <a href="https://doi.org/10.1098/rstb.2024.0535">a new research method</a> suggests that children may find it easier to learn what numbers are if they are exposed to systems with compositional anchors that are smaller than 10.</p>
<p>In general, how we represent bases has very concrete cognitive implications, including <a href="https://doi.org/10.1098/rstb.2024.0221">how easily we can learn number systems</a> and <a href="https://doi.org/10.1098/rstb.2024.0217">which types of systems will tend to be used in which contexts</a>.</p>
<p>At a cultural level, base representation influences our ability to collaborate with scientists across disciplines and across cultures. This was starkly illustrated by the infamous Mars Climate Orbiter incident, when a mix-up between metric and imperial units <a href="https://everydayastronaut.com/mars-climate-orbiter">caused a $327 million spacecraft to crash into Mars in 1999</a>.</p>
<h2>Why understanding bases matters</h2>
<p>Numeracy — the ability to understand and use numbers — is <a href="https://www.nationalnumeracy.org.uk/what-numeracy/why-numeracy-important">a crucial part of our modern lives</a>. It has implications for our quality of life and for our ability to make informed decisions in domains like <a href="https://dx.doi.org/10.2139/ssrn.1561876">health and finances</a>.</p>
<p>For example, being more familiar with numbers will influence how easily we can choose between retirement plans, how we consider trade-offs between side-effects and benefits when choosing between medications or how well we understand how probabilities apply to our investments.</p>
<p>And yet many struggle to learn what numbers are, with millions suffering from <a href="https://www.edweek.org/teaching-learning/why-so-many-students-struggle-with-math-anxiety-and-how-to-help/2025/02">math anxiety</a>. Developing better methods for helping people learn how to manipulate numbers can therefore help millions of people improve their lives.</p>
<p>Research on the cognitive and cultural implications of bases collected in the <em>Philosophical Transactions of the Royal Society</em> journal can advance our understanding of how we think about numbers, an important step towards making numbers more accessible to everyone.</p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/how-number-systems-shape-our-thinking-and-what-it-means-for-learning-language-and-culture-268168">original article</a>.</em></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/lonely-children-have-an-increased-risk-of-dementia-and-cognitive-decline-in-adulthood-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Lonely children have an increased risk of dementia and cognitive decline in adulthood, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 21st 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>An analysis of the China Health and Retirement Longitudinal Study data found that individuals who reported experiencing loneliness as children had a 41% higher risk of dementia in adulthood. They also tended to experience faster age-related cognitive decline. The paper was published in <a href="http://www.doi.org/10.1001/jamanetworkopen.2025.31493"><em>JAMA Network Open</em></a>.</p>
<p>As people reach advanced age, their cognitive abilities start to decline. The decline is subtle at first but tends to accelerate with age. Processing speed and working memory are usually the most affected, while capacities based on experience and accumulated knowledge tend to be the least affected. This is called age-related cognitive decline; it is a normal part of aging and is not, by itself, indicative of any pathology.</p>
<p>In contrast, dementia is a pathological condition in which cognitive decline becomes fast and very severe. The most common form of dementia is Alzheimer’s disease, which involves progressive brain cell loss and disruptions in key neural networks. Unlike normal age-related cognitive decline, dementia affects core functions like language, reasoning, and memory stability. It results in extensive structural changes to the brain, including shrinkage of the hippocampus.</p>
<p>While there is no cure for dementia, research suggests that physical activity, cognitive engagement, and social interaction can help slow age-related decline and reduce dementia risk. As human lifespans increase, more and more people reach an age in which they experience cognitive decline and dementia. This has made research into factors that affect the risk of dementia and the pace of age-related cognitive decline a priority for many.</p>
<p>Study author Jinqi Wang and colleagues wanted to investigate whether childhood loneliness is associated with cognitive decline and dementia risk in adulthood, and whether adult loneliness mediates or modifies these associations. They defined childhood loneliness as self-reported frequent feelings of loneliness and the absence of close friendships before the age of 17.</p>
<p>The researchers analyzed data from the China Health and Retirement Longitudinal Study (CHARLS), a nationwide study of Chinese adults aged 45 years and older. The study recruited a total of 17,707 participants from 28 provinces in China in 2011 and collected follow-up data periodically. While data was collected through 2020, the researchers restricted their analysis to follow-ups through 2018 to avoid bias related to the COVID-19 pandemic. Information about childhood loneliness was collected through face-to-face interviews during the 2014 data collection.</p>
<p>Because some participants dropped out of the study or skipped some of the surveys and interviews, the authors based their analysis on the 13,592 participants with complete data on all required measures. Their average age at the start of the study was 58 years, and approximately 53% were women.</p>
<p>Results showed that, compared to participants who did not report being lonely in childhood, participants who reported being lonely as children tended to experience faster cognitive decline. They also had a 41% higher risk of dementia compared to their peers who were not lonely as children.</p>
<p>These associations remained even after study authors controlled for loneliness in adulthood. However, adult loneliness was a possible mediator of a small part (8.5%) of the association between childhood loneliness and cognitive decline. It was also a possible mediator of 17.2% of the association between childhood loneliness and dementia.</p>
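<p>The “proportion mediated” reported here is conventionally the indirect effect divided by the total effect. A minimal sketch with made-up effect sizes (not the paper’s actual estimates):</p>
<pre><code>def proportion_mediated(total_effect, direct_effect):
    """Share of the total effect carried through the mediator."""
    indirect_effect = total_effect - direct_effect
    return indirect_effect / total_effect

# Hypothetical numbers for illustration only:
print(round(proportion_mediated(0.40, 0.33), 3))  # 0.175, i.e. ~17.5%
</code></pre>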
<p>“These findings suggest that childhood loneliness may serve as an independent risk factor for later-life cognitive decline and dementia, highlighting the need for early interventions to mitigate its long-term implications for cognitive health throughout the life course,” study authors concluded.</p>
<p>The study contributes to the scientific understanding of the risk factors for dementia and accelerated cognitive decline in advanced age. However, it should be noted that childhood loneliness data was not collected in childhood, but was based on self-reports participants gave when they were already over 45 years of age. This leaves room for recall bias to have affected the results.</p>
<p>The paper, “<a href="http://www.doi.org/10.1001/jamanetworkopen.2025.31493">Childhood Loneliness and Cognitive Decline and Dementia Risk in Middle-Aged and Older Adults,</a>” was authored by Jinqi Wang, Danyang Jiao, Xiaoyu Zhao, Yixing Tian, Haibin Li, Xia Li, Chen Sheng, Lixin Tao, Hui Chen, Zhiyuan Wu, and Xiuhua Guo.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/neuroticism-is-associated-with-reduced-brain-engagement-in-social-settings/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Neuroticism is associated with reduced brain engagement in social settings</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 21st 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research published in <em><a href="https://doi.org/10.1016/j.brainres.2025.149905" target="_blank">Brain Research</a></em> has found that specific personality traits fundamentally alter how the human brain processes information when collaborating on a task. By measuring electrical brain activity, investigators discovered that traits such as conscientiousness and neuroticism drive distinct neural patterns during moments of shared attention. These findings suggest that the automatic mental coordination required for social interaction is not uniform across all people but is instead shaped by individual disposition.</p>
<p>Shared attention is the intuitive awareness that another person is focusing on the same object or goal that you are. This mechanism acts as a bedrock for human social coordination. It allows individuals to align their mental states with others to achieve common objectives.</p>
<p>Previous psychological research has established that humans engage in a process called co-representation. This means that when two people work on a task together, they automatically simulate their partner’s actions in their own minds. This happens even when the partner’s actions are not directly relevant to one’s own duties.</p>
<p>Most prior studies on this phenomenon treated all participants as relatively similar in their cognitive processing. The influence of individual personality differences on this automatic social mechanism remained largely unknown. This gap in knowledge prompted the current investigation.</p>
<p>The researchers involved in this work sought to determine if stable personality traits modulate the neural engagement associated with shared attention. The team included Yuzhan Hang, Wei Wu, and Xiaosong He from the University of Science and Technology of China. They were joined by Satoshi Shioiri from the Advanced Institute of So-Go-Chi Informatics at Tohoku University in Japan.</p>
<p>To measure these effects, the team recruited 50 university students to participate in a controlled laboratory experiment. The researchers divided the participants into pairs. They balanced the pairs by sex to ensure equal representation.</p>
<p>The participants completed a standard personality assessment known as the Big Five Inventory-2. This questionnaire evaluates five major domains of personality. These domains are Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism.</p>
<p>The assessment also breaks these broad domains down into smaller, more specific components called facets. For example, the domain of Conscientiousness includes the specific facet of Responsibility. The domain of Neuroticism includes the facet of Depression.</p>
<p>Following the personality assessment, the pairs entered a soundproof room designed to minimize distractions. They sat at desks placed back-to-back. A curtain divided the room so they could not see or hear each other directly.</p>
<p>Despite this physical separation, the participants were engaged in a shared cognitive environment. The researchers employed a test known as the joint Flanker task. This is a classic psychological paradigm adapted for social contexts.</p>
<p>In this task, participants viewed a screen displaying a row of five letters. Their goal was to identify the middle letter, known as the target. The surrounding letters served as distractors.</p>
<p>The experimenters assigned specific target letters to each participant. For one person, the targets might be H and K. For their partner, the targets might be S and C.</p>
<p>The crucial element of the experiment was that participants knew their partner’s target letters. The screen would sometimes show a target flanked by letters that were relevant to the partner’s task. These were termed incongruent trials.</p>
<p>In other trials, the target was flanked by neutral letters unrelated to either person. By comparing reaction times between these conditions, the researchers could measure the “joint Flanker effect.” This effect quantifies how much the partner’s potential task interfered with the participant’s own processing.</p>
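<p>In code, this effect is just a difference of mean reaction times between the two trial types; a minimal sketch with invented data:</p>
<pre><code>from statistics import mean

# Made-up reaction times (milliseconds) for one participant.
rt_incongruent = [512, 498, 530, 521]  # distractors = partner's targets
rt_neutral = [487, 476, 503, 495]      # distractors relevant to no one

joint_flanker_effect = mean(rt_incongruent) - mean(rt_neutral)
print(joint_flanker_effect)  # 25.0 ms: the partner's task rules
                             # slowed this participant's responses
</code></pre>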
<p>While the participants performed this task, they wore caps equipped with electrodes. These devices recorded electroencephalography (EEG) data. This allowed the team to track the timing of electrical activity in the brain with millisecond precision.</p>
<p>The researchers focused on two specific electrical signals, or event-related potentials. The first signal is called the N2 component. The N2 appears approximately 200 to 350 milliseconds after a stimulus is presented.</p>
<p>Neuroscientists associate the N2 component with conflict monitoring. It represents the brain’s automatic detection of a mismatch or a need for control. A stronger N2 generally signals that the brain is working harder to manage conflicting information.</p>
<p>The second signal analyzed was the P3 component. This wave occurs roughly 300 to 500 milliseconds after the stimulus. The P3 is linked to the allocation of attentional resources.</p>
<p>A larger P3 amplitude typically indicates that the brain is engaging more significant resources to process the task. It is often associated with inhibitory control, which is the ability to suppress an incorrect response. The researchers analyzed how these brain signals correlated with the participants’ personality scores.</p>
<p>The behavioral data confirmed that the participants were indeed engaging in shared attention. Reaction times were significantly slower when the distractors matched the partner’s target letters. This delay indicates that participants were mentally representing their partner’s task rules alongside their own.</p>
<p>The EEG data revealed that personality traits significantly predicted how the brain handled this social interference. The domain of Conscientiousness showed a robust connection to the P3 component. Individuals who scored high in Conscientiousness displayed increased P3 amplitudes.</p>
<p>This effect was most pronounced during trials that required inhibitory control. The researchers suggest this reflects a heightened state of task engagement. Conscientious individuals appear to allocate more neural resources to maintain focus on shared goals.</p>
<p>Further analysis looked deeper into the specific facets of personality. The facet of Responsibility was the primary driver of this effect. People who identified strongly with being reliable and dependable showed the strongest P3 responses.</p>
<p>This implies that for responsible individuals, a shared task triggers a neural response similar to a sense of duty. Their brains actively work harder to manage the social context and adhere to the rule set. This happens automatically and within a fraction of a second.</p>
<p>A different pattern emerged regarding the trait of Neuroticism. This domain was significantly associated with the N2 component. High scores in Neuroticism correlated with variations in the amplitude of this conflict-monitoring signal.</p>
<p>Specifically, the researchers found a negative relationship involving the Depression facet of Neuroticism. Individuals with higher scores on this facet exhibited attenuated, or reduced, N2 amplitudes. This reduction occurred particularly when they needed to withhold a response.</p>
<p>This finding might seem counterintuitive at first glance. One might expect anxious or neurotic individuals to be hyper-reactive. The researchers offer a nuanced interpretation based on theories of emotional regulation.</p>
<p>They propose that this reduced neural engagement may act as a defensive coping mechanism. Social situations with conflicting information can be mentally overstimulating for individuals high in neuroticism. The brain may instinctively withdraw resources from the conflict-monitoring process to prevent cognitive overload.</p>
<p>By dampening the N2 response, these individuals may be shielding themselves from the stress of the social interference. This suggests a form of neural disengagement. It allows them to function in the task without becoming overwhelmed by the presence of the partner’s conflicting signals.</p>
<p>The study also utilized a machine learning technique called a Random Forest model. This advanced statistical approach helped verify the strength of these relationships. It confirmed that Neuroticism was the most consistent predictor of N2 amplitude variations.</p>
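<p>For readers curious about the technique, here is a minimal sketch of a random forest ranking predictors of an EEG measure. The data are simulated and the variable names are hypothetical; this is not the authors’ actual pipeline.</p>
<pre><code>import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Simulated stand-in data: 50 participants, 15 Big Five facet scores
# as predictors, N2 amplitude as the target.
facet_scores = rng.normal(size=(50, 15))
n2_amplitude = (-2.0 + 0.8 * facet_scores[:, 0]
                + rng.normal(scale=0.5, size=50))

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(facet_scores, n2_amplitude)

# Feature importances rank which facets the forest relied on most,
# analogous to how the authors ranked personality predictors.
print(model.feature_importances_.round(2))
</code></pre>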
<p>The machine learning analysis also highlighted the importance of the “facets” over the broad personality domains. The specific sub-traits often predicted brain activity more accurately than the general categories. For instance, the Emotional Volatility facet also appeared as a strong predictor for neural responses.</p>
<p>This granularity supports the idea that personality is hierarchically structured. Broad labels like “Extraversion” or “Agreeableness” can mask specific neural drivers. Looking at the underlying facets provides a clearer picture of the biological reality.</p>
<p>The study highlights that social attention is not a single, universal process. It is a personalized experience rooted in neural architecture. A highly conscientious person and a highly neurotic person may sit in the same room doing the same task, but their brains solve the problem of coordination differently.</p>
<p>There are several limitations to this study that warrant mention. The sample consisted of 50 participants. While this is typical for complex EEG studies, it is relatively small for drawing broad population-level conclusions.</p>
<p>The study was also exploratory in nature. The researchers did not start with a rigid hypothesis but rather sought to uncover relationships in the data. This means the results should be replicated in independent samples to ensure their reliability.</p>
<p>The experimental design did not include a “separated” condition where participants performed the task alone for comparison. The researchers relied on the difference between trial types to infer social attention. Future work could benefit from including a solitary control condition to isolate social effects more precisely.</p>
<p>Another avenue for future research lies in neural synchronization. This involves measuring how the brainwaves of two partners align with each other over time. It is possible that people with similar personality profiles might exhibit greater synchronization.</p>
<p>The current findings provide a foundation for a field known as personality neuroscience. They move beyond simple behavioral observations. They offer a window into the millisecond-by-millisecond neural computations that define who we are in relation to others.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.brainres.2025.149905" target="_blank">Personality and social attention: Trait-driven differences in neural engagement</a>,” was authored by Yuzhan Hang, Wei Wu, Satoshi Shioiri, and Xiaosong He.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/maga-republicans-are-more-likely-to-justify-political-violence-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">MAGA Republicans are more likely to justify political violence, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 21st 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new survey analysis indicates that while adherents of the “Make America Great Again” movement are more likely to view political violence as justified, they are generally not more willing to engage in it themselves compared to other groups. The research suggests that widespread endorsement of such acts could nonetheless foster an environment where violence is more likely to occur. These findings were published in the journal <em><a href="https://doi.org/10.1186/s40621-025-00633-6" target="_blank" rel="noopener">Injury Epidemiology</a></em>.</p>
<p>Political tension in the United States has raised concerns among public health experts regarding the potential for conflict. The researchers view violence not just as a criminal matter but as a public health crisis that requires evidence-based prevention strategies. By identifying the specific characteristics and beliefs of those who support aggressive political action, they aim to develop interventions to reduce the risk of harm.</p>
<p>The team conducting this investigation includes Garen J. Wintemute and colleagues from the Violence Prevention Research Program at the University of California, Davis. They previously established the “Life in America Survey” to track these trends over time. This longitudinal project seeks to understand who supports political violence and why, particularly as the 2024 federal election approached.</p>
<p>The data for this analysis came from the third wave of a large, nationally representative survey conducted between May and June 2024. The researchers utilized the Ipsos KnowledgePanel, which recruits participants through address-based sampling to ensure they accurately reflect the American population. The final sample included 8,896 adults who completed the questionnaire online.</p>
<p>Participants self-identified their political affiliations. They indicated whether they considered themselves “MAGA Republicans” or supporters of the movement. The researchers then compared the views of these groups against other Republicans and those who were neither Republicans nor MAGA supporters. The study utilized statistical weighting to adjust for demographics such as age, gender, and race, ensuring the findings represented the broader United States population.</p>
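<p>To make the idea of statistical weighting concrete, here is a minimal post-stratification sketch in Python (illustrative only, and not Ipsos’s actual procedure, which adjusts on several variables at once): each respondent receives a weight equal to the population share of their demographic cell divided by that cell’s sample share.</p>
<pre><code># Minimal post-stratification sketch (illustrative; not Ipsos's actual
# procedure): weight respondents so sample shares match population shares
# for a single demographic variable. All shares below are invented.
import pandas as pd

sample = pd.DataFrame({"age_group": ["18-34", "18-34", "35-64", "65+"]})
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}

sample_share = sample["age_group"].value_counts(normalize=True)
sample["weight"] = sample["age_group"].map(
    lambda g: population_share[g] / sample_share[g]
)
print(sample)  # weighted tallies then use these weights instead of raw counts
</code></pre>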
<p>The study assessed respondents’ views on whether force was justified to achieve political goals. The results showed that MAGA Republicans were substantially more likely to endorse violence to effect social change. Approximately 56 percent of this group considered violence usually or always justified to advance at least one of 21 specific political objectives.</p>
<p>In contrast, only about 25 percent of non-MAGA non-Republicans held similar views. The specific objectives that garnered support included stopping illegal immigration and preserving an “American way of life.” The researchers observed a consistent pattern where MAGA Republicans supported violence for a greater number of specific causes than other groups.</p>
<p>The survey also asked about the prospect of civil conflict. MAGA Republicans were more likely to agree that civil war in the United States is inevitable in the coming years. They were also more likely to agree with the statement that the country needs a civil war to set things right. Specifically, about 10 percent of MAGA Republicans predicted a civil war soon, compared to roughly 5 percent of those outside the movement and party.</p>
<p>Despite this high level of abstract support for violence, the study found a disconnect regarding personal participation. MAGA Republicans were not statistically more likely than others to say they would personally shoot, injure, or threaten someone to advance a political goal. The willingness to commit these acts remained low across all surveyed groups.</p>
<p>However, there was a distinction regarding firearms. In scenarios where they felt political violence was justified, MAGA Republicans were more likely to predict they would be armed with a gun. They were also more likely to say they would carry that gun openly. This group reported higher rates of general firearm ownership and recent purchasing behavior compared to the other cohorts.</p>
<p>The investigation explored psychological and social traits associated with these views. The analysis revealed that MAGA Republicans were more likely to endorse authoritarian statements. For instance, they more frequently agreed that having a strong leader is more important than maintaining a democracy. They were also more likely to support suspending Congress to allow a leader to solve problems without political interference.</p>
<p>The researchers also found strong correlations between MAGA affiliation and specific biases. This group was more likely to score high on measures of racism, hostile sexism, and xenophobia. They also expressed higher levels of support for Christian nationalist beliefs and the QAnon movement.</p>
<p>Belief in conspiracies was another distinguishing factor. MAGA Republicans were more likely to agree that the government conceals its involvement in acts of terrorism on domestic soil. They also more frequently attributed the spread of viruses to deliberate, concealed efforts by organizations.</p>
<p>The researchers identified a small but distinct group of respondents who were not Republicans but still identified as MAGA supporters. This cohort tended to be younger and included more women than the Republican MAGA group. While small in number, these individuals often expressed levels of support for violence that exceeded those of their Republican counterparts.</p>
<p>These 2024 results align closely with the team’s <a href="https://www.psypost.org/pro-trump-maga-republicans-much-more-likely-to-endorse-delusional-and-pro-violence-statements-study-finds/" target="_blank" rel="noopener">previous findings from 2022</a>. That earlier study, published in <em>PLOS One</em>, also identified a specific subset of Republicans who endorsed political violence at higher rates than the general public. In the 2022 analysis, the researchers defined MAGA Republicans based on their voting history and denial of the 2020 election results.</p>
<p>The 2024 study allowed participants to self-identify with the movement, yet the patterns remained consistent. Both studies found that while endorsement of violent rhetoric was high, personal willingness to engage in violence was low. The 2022 data had similarly highlighted that MAGA Republicans held distinct views on race and democratic norms compared to other conservatives.</p>
<p>For example, the earlier study found that over half of MAGA Republicans believed in the “great replacement” theory. The new data reinforces that this group remains a distinct minority with views that diverge significantly from other Republicans and non-Republicans. The persistence of these trends over two years suggests that these attitudes are stable rather than transient.</p>
<p>The researchers acknowledge certain limitations inherent in survey-based research. Because the study relies on self-reporting, respondents might withhold their true willingness to commit violence due to social stigma. This could lead to an underestimation of the actual risk posed by these groups.</p>
<p>Additionally, the study captures a snapshot in time. The survey was fielded shortly before the conviction of Donald Trump on felony charges, though a sensitivity analysis suggested this did not skew the results. The researchers noted that cross-sectional data cannot prove that MAGA affiliation causes these beliefs, only that they are strongly linked.</p>
<p>Future research aims to understand the small but distinct group of non-Republicans who support the MAGA movement. The team also plans to investigate whether approval of extremist groups changes over time. A primary question remains whether those who justify violence but refuse to participate might be persuaded to discourage others from acting.</p>
<p>The researchers emphasize that while few individuals are willing to commit violence, the broader climate of justification is dangerous. It may enable the small fraction of people who are willing to act to feel supported. Prevention efforts may need to focus on dissuading the supporters, as they could influence the potential perpetrators.</p>
<p>The study, “<a href="https://doi.org/10.1186/s40621-025-00633-6" target="_blank" rel="noopener">The MAGA movement and political violence in 2024: findings from a nationally representative survey</a>,” was authored by Garen J. Wintemute, Bradley Velasquez, Sonia L. Robinson, Elizabeth A. Tomsich, Mona A. Wright and Aaron B. Shev.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/dark-personality-traits-are-associated-with-poorer-lie-detection-among-incarcerated-individuals/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Dark personality traits are associated with poorer lie detection among incarcerated individuals</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 21st 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Recent research published in <em><a href="https://doi.org/10.1002/acp.70132" target="_blank" rel="noopener">Applied Cognitive Psychology</a></em> provides new evidence regarding the relationship between personality traits and the ability to lie and detect lies. The study compares individuals from the general community with an incarcerated population to determine if a “deception-general” ability exists across different environments. The findings suggest that while effective liars in the community are also better at detecting lies, this association does not hold for those in prison.</p>
<p>Deception is a multifaceted behavior that involves both the production of false information and the detection of deceit in others. Most scientific inquiries into this subject have relied on passive tasks, such as asking participants to watch videos of strangers lying. This approach often fails to capture the dynamic, interpersonal nature of real-world deception. The authors of the new study sought to address this gap by using an interactive task that requires participants to both lie and judge the veracity of others in real time.</p>
<p>A primary motivation for this research was to investigate the “deception-general” ability hypothesis. This hypothesis posits that the cognitive and social skills required to craft a convincing lie are essentially the same skills needed to spot a lie. The researchers also aimed to explore how “dark” personality traits influence these abilities. There is a common assumption that manipulative individuals, or those with antisocial traits, possess a superior ability to detect manipulation in others.</p>
<p>“Deception is a fundamental aspect of human communication, yet it is mostly studied in controlled, artificial settings that fail to capture the complexity of real-world exchanges,” explained study authors Laura Visu-Petra, a professor and coordinator of <a href="https://riddlelab.ro/" target="_blank" rel="noopener">the RIDDLE Lab</a> at Babeș-Bolyai University, and Andreea Turi, who works in the rehabilitation department of a Romanian maximum security prison.</p>
<p>“We wanted to examine the dynamics of deception as it occurs in actual social interactions, where people lie, are being deceived, and constantly need to adjust their behaviors and judgments accordingly. Using an ecological scenario, we explored whether individual lie production and lie detection skills are related, revealing a so-called ‘general deception ability.'”</p>
<p>“We also wondered whether this ability would be enhanced in hostile environments such as prison settings, where one can have more opportunities to encounter and practice deception compared to everyday settings. Finally, we aimed to explore whether high individual levels of aversive traits, such as the Dark Factor, alexithymia, and aggression, would further fine-tune this ability.”</p>
<p>The research team recruited a total of 140 participants for the study. The sample consisted of 60 individuals from the general community and 80 incarcerated individuals from a maximum-security prison. The community participants were volunteers recruited from a workplace setting. The prison sample included individuals with varying criminal histories, including theft, drug-related offenses, and violent crimes.</p>
<p>Data collection occurred in three distinct phases. In the first phase, participants completed interviews to provide demographic data and background information. The second phase involved a battery of self-report questionnaires designed to measure personality traits. Both groups completed the Dark Factor inventory, which assesses traits such as callousness, deceitfulness, and narcissistic entitlement.</p>
<p>The prison group completed additional assessments that were not administered to the community group. These included the Self-Report Psychopathy Scale and the Aggression Questionnaire. They also completed the Toronto Alexithymia Scale. Alexithymia is a psychological construct characterized by a difficulty in identifying and describing one’s own emotions.</p>
<p>The third phase involved the Deceptive Interaction Task (DeceIT). Participants were organized into small groups of six. They first completed a survey to establish their genuine opinions on various controversial topics. During the game, participants took turns acting as “Senders” and “Receivers.”</p>
<p>Senders drew cards that instructed them to either tell the truth or lie about their opinion on a topic. They spoke for approximately 20 seconds to convince the group. Receivers then rated the veracity of the statement on a scale from “not at all likely” to “very likely” to be true. This design allowed the researchers to obtain objective measures of lie production ability and lie detection accuracy for every participant.</p>
<p>“We relied on an ingenious round-robin paradigm (DeceIT) developed by Wright and collaborators (2012), now applied for the first time to a prison context,” Visu-Petra and Turi told PsyPost. “Firstly, inmates or age- and education-matched community participants privately expressed their real opinions about multiple controversial topics, such as ‘Smoking should be banned from all public places.’”</p>
<p>“Next, placed in a small group of previously unacquainted people, each person took turns between a Sender role (telling the truth or lying to the group about their real opinion) and a Judge role (assessing if other Senders were telling lies or truths). An individual’s proficiency score in producing credible lies or in detecting deception was computed after several iterations.”</p>
<p>The researchers analyzed the data using Signal Detection Theory. This statistical framework allows for the separation of actual detection accuracy from response bias. Response bias refers to a general tendency to categorize statements as true or false regardless of their actual content.</p>
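<p>To illustrate how Signal Detection Theory separates accuracy from bias, the short Python sketch below computes the standard sensitivity index d′ and the criterion c from a judge’s hit and false-alarm counts. The numbers are invented, and this is the generic SDT computation rather than the authors’ exact analysis.</p>
<pre><code># Generic Signal Detection Theory sketch (invented counts, not study data).
# Treating "lie" as the signal: a hit is a lie correctly judged a lie,
# a false alarm is a truth wrongly judged a lie.
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    # Log-linear correction keeps z finite when a rate would be 0 or 1
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)             # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # positive c = truth bias
    return d_prime, criterion

# Hypothetical judge who heard 12 lies and 12 truths across the game
d, c = sdt_measures(hits=7, misses=5, false_alarms=4, correct_rejections=8)
print(f"d-prime = {d:.2f}, criterion = {c:.2f}")
</code></pre>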
<p>In the community sample, the data supported the existence of a deception-general ability. The analysis showed a correlation between the ability to produce lies that were hard to detect and the ability to accurately detect lies told by others. This suggests that among the general public, the skills utilized for lying and lie detection are linked.</p>
<p>The results from the prison sample presented a different pattern. There was no significant correlation between lie production and lie detection abilities among incarcerated participants. Being a skilled liar in prison did not translate to being a skilled lie detector. This finding indicates that the high-stakes environment of a prison may disrupt the relationship between these two skills.</p>
<p>“The ‘general deception ability’ was confirmed in the general population, although the association between the two abilities was not particularly strong, similar to initial findings by Wright and collaborators (2012),” the researchers explained. “However, this association appears to be highly context-dependent, since we found no evidence for it in the prison sample. Moreover, inmates were lower in their deception detection accuracy.”</p>
<p>The researchers also examined the influence of the Dark Factor of personality. The Dark Factor represents a core disposition toward minimizing the welfare of others for personal gain. In the prison sample, higher scores on the Dark Factor were associated with lower accuracy in detecting lies. Specific themes such as Callousness, Deceitfulness, and Narcissistic Entitlement were negatively correlated with detection performance.</p>
<p>The additional measures administered to the prison sample provided further context. The analysis revealed that lie detection accuracy was negatively correlated with psychopathy and aggression. Inmates who scored higher on measures of interpersonal manipulation and physical aggression were less accurate at distinguishing truths from lies.</p>
<p>Alexithymia also emerged as a significant factor in the prison findings. Participants who struggled to identify or describe their own feelings performed worse on the lie detection task. This supports the idea that emotional awareness is a component of detecting deception. Without the ability to process their own emotional reactions, individuals may miss the “gut feelings” or subtle affective signals that often accompany the detection of a lie.</p>
<p>In terms of lie production, alexithymia also appeared to be a hindrance. Prisoners who had difficulty describing their feelings were less successful at producing convincing lies. This suggests that crafting a credible deception requires a degree of emotional intelligence and expressiveness.</p>
<p>“Aversive personality traits were found to play a role in shaping deception abilities within the prison context, but not in the general community,” Visu-Petra and Turi said. “Surprisingly, the effect was not in the expected direction. Higher levels of dark factor themes (especially deceitfulness), aggression, and alexithymia (difficulty identifying and describing emotions) were actually negatively related to deception detection proficiency.”</p>
<p>“These traits may affect the emotional awareness and social sensitivity needed to read subtle cues. In high-stress environments like prisons, hostile tendencies may be amplified, significantly reducing attention and processing capacity, or simply revealing a less optimal decoding of social cues that interfere with deception detection.”</p>
<p>Both the community and prison groups exhibited a “truth bias.” This means that participants were generally more likely to judge a statement as true than as false. This finding contradicts some previous research suggesting that prisoners might possess a “lie bias” due to the low-trust environment of a correctional facility. The authors propose that truth bias may serve an adaptive social function even in prison, facilitating basic cooperation and reducing cognitive load.</p>
<p>Since data collection overlapped with the COVID-19 pandemic, the researchers also assessed the impact of wearing face masks. Statistical analysis showed that wearing a mask did not significantly impair lie detection accuracy. However, it was associated with a stronger truth bias. The lack of full facial cues may have led participants to default to assuming honesty.</p>
<p>Sociodemographic factors played a minor role in the results. Age showed a weak association with detection skills, with younger prisoners performing slightly better. Gender did not predict performance in either group. Among prisoners, those with a history of recidivism were slightly better at detecting lies, lending some support to the idea that prolonged exposure to criminal environments might sharpen these skills.</p>
<p>“Overall, the findings highlight that lie detection and production are influenced by the complex interplay between individual traits, environmental, and contextual factors,” Visu-Petra and Turi explained. “This research not only contributes to the theoretical understanding of deception but also has practical implications for improving interrogation strategies, rehabilitation programs, and interpersonal dynamics in forensic and correctional settings. For instance, psychological interventions aimed at reducing aggression and hostile attribution biases should incorporate mistaken inferences of inmates who perceive they are being lied to when it’s not the case, a recurring problem in incarcerated populations.”</p>
<p>The researchers noted some limitations to the study. The findings are correlational, which prevents the determination of cause and effect. It is unclear whether dark traits cause poor lie detection or if environmental factors influence both. The reliance on self-report measures for personality traits also introduces the possibility of response bias, particularly regarding socially undesirable traits.</p>
<p>Looking ahead, “we aim to incorporate qualitative methods to better understand prisoners’ attitudes toward deception and to identify the actual cues they rely on when detecting lies,” Visu-Petra and Turi said. “Ultimately, we hope to bring to light the intricate mechanisms underlying deception production and detection, explore how environmental and individual factors interact, and generate insights that could inform interventions to improve social and emotional functioning, as well as ethical practices in forensic and correctional settings.”</p>
<p>The study, “<a href="https://doi.org/10.1002/acp.70132" target="_blank" rel="noopener">Behind Bars and Lies: Dark Personality Traits, Lying, and Detecting Deception in the Prison Population Versus the General Community</a>,” was authored by Andreea Turi, Mircea Zloteanu, Daria Mihaela Solescu, Mădălina-Raluca Rebeleș, and Laura Visu-Petra.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/groundbreaking-new-research-challenges-20-year-old-theory-on-dopamine-and-obesity/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Groundbreaking new research challenges 20-year-old theory on dopamine and obesity</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 21st 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in <em><a href="https://doi.org/10.1038/s41380-025-02960-y" target="_blank" rel="noopener">Molecular Psychiatry</a></em> suggests a new way to understand the relationship between body fat and the brain chemical dopamine. The findings indicate that individuals with higher adiposity may not have fewer dopamine receptors, as a long-standing theory proposed, but rather higher baseline levels of dopamine in a key reward region of the brain. This work could reframe scientific understanding of the neurobiology associated with body weight regulation.</p>
<p>Dopamine is a brain chemical that helps regulate motivation, reward, and learning. For decades, researchers have studied whether people with obesity have differences in dopamine signaling, particularly in the striatum, a key part of the brain’s reward system.</p>
<p>A landmark 2001 study using positron emission tomography, or PET, found that individuals with obesity appeared to have lower availability of dopamine D2 receptors in the brain. This was widely interpreted as evidence that people with obesity have a blunted response to rewards.</p>
<p>This concept, often called the “reward deficiency” hypothesis, suggested that people with obesity might have a dampened dopamine system, similar in some ways to individuals with substance use disorders. This could theoretically drive them to overeat to compensate for a weaker reward signal.</p>
<p>However, the scientific literature has been filled with conflicting reports. Some human studies supported the original finding of a negative association, while others found a positive link, and still others reported no association at all. These discrepancies may be a result of variations in experimental design, such as whether participants were fasted or fed before brain scans, and the use of different chemical tracers for imaging.</p>
<p>To address this ambiguity, a team of researchers at the National Institutes of Health, led by senior author Kevin D. Hall, designed a highly controlled study. Hall, a leading researcher in nutrition and metabolism, has been involved in many prominent studies on diet and weight regulation. He and his collaborators designed the experiment to eliminate many of the confounding variables that may have affected earlier research.</p>
<p>Their primary goal was to measure dopamine receptor availability in the same individuals under standardized conditions using two different imaging tools. This approach was intended to help disentangle the factors that may have contributed to previous inconsistent findings.</p>
<p>The research team recruited 54 adults spanning a wide range of body mass indexes, from 20 to 44. Participants were admitted to an inpatient clinical center for several days, a step that permitted precise control over their environment and diet. This setting let the scientists ensure all participants consumed a standardized, weight-stabilizing diet for several days leading up to their brain scans, removing diet as a potential confounding variable.</p>
<p>On separate days, each participant underwent two PET scans. These scans use radioactive tracers that bind to specific targets in the brain, allowing scientists to visualize and quantify those targets. The team used two different tracers that both bind to dopamine type-2 receptors, but they do so with different properties.</p>
<p>One tracer, [11C]raclopride, has a relatively low affinity for dopamine receptors, meaning it is more easily displaced by the brain’s own naturally occurring dopamine. Its signal is sensitive not only to the number of available receptors but also to the amount of ambient dopamine competing for those same binding sites. The other tracer, [18F]fallypride, binds more tightly, making its signal less influenced by ambient dopamine levels and more reflective of the sheer number of available receptors.</p>
<p>When using [11C]raclopride, the researchers observed a negative linear relationship between body mass index and receptor binding potential. In other words, as a person’s body fat increased, the amount of the tracer that could bind in the brain’s striatum decreased. This result was consistent with the original studies that gave rise to the reward deficiency hypothesis.</p>
<p>In the same participants, however, the scans using the higher-affinity [18F]fallypride tracer showed no significant relationship between body mass index and receptor binding potential. The quantity of available receptors did not appear to change systematically with body fat. The correlation coefficients from the two different scans were statistically different from one another.</p>
<p>The difference between the two scans provides a key insight into the underlying neurochemistry. Because the number of receptors seemed stable, as suggested by the [18F]fallypride results, the researchers inferred that the reduced binding of [11C]raclopride in individuals with higher body fat was likely caused by greater competition from their own dopamine. This suggests that adiposity is associated with higher baseline, or tonic, levels of dopamine.</p>
<p>To explore this further, the team created a statistical index to estimate this dopamine tone by comparing the results from the two tracers for each individual. This index showed a positive association with body mass index, lending support to their interpretation. The relationship was particularly evident in the dorsal striatum, a part of the reward circuit involved in habit formation.</p>
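<p>The summary above does not reproduce the authors’ exact formula, but the two-tracer logic can be sketched as follows: if [18F]fallypride binding tracks receptor number while [11C]raclopride binding additionally reflects competition from ambient dopamine, then a per-person contrast between the two signals serves as a proxy for dopamine tone. The Python snippet below uses a simple standardized difference on synthetic data as a hypothetical stand-in for that index.</p>
<pre><code># Hedged sketch of the two-tracer logic on synthetic data; the simple
# standardized difference below is a hypothetical stand-in, not the
# study's actual index.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 54  # matches the study's sample size
bmi = rng.uniform(20, 44, size=n)

# Fallypride ~ receptor number (flat across BMI in this toy model);
# raclopride ~ receptors minus dopamine competition (declines with BMI)
bp_fallypride = 20 + rng.normal(scale=2, size=n)
bp_raclopride = 3.0 - 0.03 * (bmi - 30) + rng.normal(scale=0.2, size=n)

def zscore(x):
    return (x - x.mean()) / x.std()

# Tone index: how far raclopride binding falls below what the
# ambient-insensitive fallypride signal would predict
tone_index = zscore(bp_fallypride) - zscore(bp_raclopride)

r, p = stats.pearsonr(bmi, tone_index)
print(f"tone index vs BMI: r = {r:.2f}, p = {p:.3g}")  # positive by construction
</code></pre>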
<p>These results suggest that people with more body fat tend to have greater dopamine tone in brain regions associated with reward. That finding could help explain some behaviors observed in obesity, such as increased motivation for food or other rewards.</p>
<p>Past theories often emphasized a dopamine deficit model, in which lower receptor availability made individuals less sensitive to rewards. The current study supports a different view. Higher dopamine tone might actually make reward-related stimuli more salient or harder to resist.</p>
<p>However, the study’s design is cross-sectional, which means it captures a single moment in time and cannot establish causality. It is not possible to determine whether increased adiposity leads to heightened dopamine tone or if elevated dopamine tone might contribute to weight gain over time. Additionally, the measurement of dopamine tone was an inference derived from the PET scan data, not a direct quantification of the chemical itself.</p>
<p>Future research could investigate the mechanisms behind this association and explore whether changes in diet or weight can alter dopamine tone over time. The findings open new avenues for understanding how the brain’s reward system interacts with metabolism and body weight regulation, potentially moving the field beyond the simple model of receptor deficiency.</p>
<p>While the study advances understanding of the brain’s role in obesity, it arrives at a time of concern for scientific independence. The study’s senior author, Kevin D. Hall, has taken early retirement from his position at the National Institutes of Health.</p>
<p>He stated his departure was driven by multiple instances where he felt his work was being censored by federal officials. Hall described an incident <a href="https://www.nytimes.com/2025/04/16/well/kevin-hall-nutrition-retirement-nih-censorship-rfk-maha.html" target="_blank" rel="noopener">in which he was barred from speaking freely</a> with reporters about a study whose results might have been seen as contradicting Health and Human Services Secretary Robert F. Kennedy Jr.’s stance on food addiction.</p>
<p>“We experienced what amounts to censorship and controlling of the reporting of our science,” Hall said in an interview. He also noted that routine approvals for his work were being elevated to political appointees, and he expressed concern that this oversight might eventually interfere with the design and execution of his studies. His departure has been called a significant setback for nutrition research by outside health experts.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41380-025-02960-y" target="_blank" rel="noopener">Striatal dopamine tone is positively associated with adiposity in humans as determined by PET using dual dopamine type-2 receptor antagonist tracers</a>,” was authored by Valerie L. Darcey, Juen Guo, Meible Chi, Stephanie T. Chung, Amber B. Courville, Isabelle Gallagher, Peter Herscovitch, Rebecca Howard, Melissa La Noire, Lauren Milley, Alex Schick, Michael Stagliano, Sara Turner, Nicholas Urbanski, Shanna Yang, Eunha Yim, Nan Zhai, Megan S. Zhou, and Kevin D. Hall.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>