<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/shyness-linked-to-spontaneous-activity-in-the-brains-cerebellum/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Shyness linked to spontaneous activity in the brain’s cerebellum</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 6th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study provides new evidence on the neural basis of shyness, suggesting a link between this personality trait and spontaneous activity in the cerebellum. The research indicates that the strength of this relationship is partly explained by an individual’s sensitivity to potential social threats. The findings were published in the journal <em><a href="https://doi.org/10.1016/j.paid.2025.113454" target="_blank">Personality and Individual Differences</a></em>.</p>
<p>Previous research has explored connections between shyness and brain regions involved in emotion and social processing, such as the prefrontal cortex and the amygdala. However, findings have been inconsistent, leaving the specific neural architecture of shyness unclear.</p>
<p>One prominent model suggests shyness emerges from a conflict between the motivation to approach social situations and the motivation to avoid them. To investigate this, researchers often use the concepts of the Behavioral Inhibition System (BIS) and the Behavioral Activation System (BAS). </p>
<p>The BIS is associated with avoidance motivation, making individuals more sensitive to potential punishment or negative outcomes, while the BAS is tied to approach motivation and sensitivity to rewards. The present study aimed to connect these motivational systems to the spontaneous, or resting-state, brain activity associated with shyness.</p>
<p>“Shyness is a common personality trait, but its neural basis has remained elusive. Most existing research has focused on the prefrontal cortex and amygdala, while the role of the cerebellum—traditionally viewed as a ‘motor’ region—has been largely overlooked,” said study author Hong Li, a psychology professor at South China Normal University.</p>
<p>“Yet recent evidence shows that the cerebellum also contributes to emotion and social processing. We wanted to understand whether the cerebellum plays a meaningful role in shyness and how motivational systems—especially the Behavioral Inhibition System (BIS), which governs our sensitivity to threat—might link brain activity to shy behavior. This question bridges an important gap between biological mechanisms and everyday emotional experience.”</p>
<p>The researchers recruited 42 healthy university students. Participants completed questionnaires to measure their levels of trait shyness. They also filled out surveys to assess the sensitivity of their Behavioral Inhibition System and Behavioral Activation System. For example, a high BIS score might reflect agreement with a statement like, “If I think something unpleasant is going to happen, I usually get pretty ‘worked up.’”</p>
<p>Each participant also underwent a resting-state functional magnetic resonance imaging (fMRI) scan. This technique measures brain activity while a person is at rest and not performing any specific task, allowing scientists to observe the brain’s baseline or spontaneous neural patterns. The researchers then analyzed the fMRI data using a method called Regional Homogeneity, or ReHo. This technique measures the degree of synchronized activity among neighboring points in the brain, essentially gauging the local functional harmony within a specific area.</p>
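<p>For readers curious about the mechanics, ReHo is typically computed as Kendall’s coefficient of concordance (W) over the time series of a small voxel and its immediate neighbors. The sketch below is a minimal illustration of that idea, not the authors’ actual pipeline (which would use dedicated neuroimaging software, larger time series, tie handling, and preprocessing); the 27-voxel neighborhood and time-series length are assumptions for the example.</p>

```python
import numpy as np

def regional_homogeneity(timeseries):
    """Kendall's coefficient of concordance (W) across neighboring voxels.

    timeseries: array of shape (K, T) -- K neighboring voxels (e.g., a
    3x3x3 cube = 27 voxels), each with T fMRI time points (distinct values
    assumed; real pipelines also correct for ties).
    Returns W in [0, 1]; higher values mean more locally synchronized activity.
    """
    K, T = timeseries.shape
    # Rank each voxel's time series independently (1..T)
    ranks = timeseries.argsort(axis=1).argsort(axis=1) + 1
    # Sum the ranks across voxels at each time point
    R = ranks.sum(axis=0)
    # Squared deviation of the rank sums from their mean
    S = ((R - R.mean()) ** 2).sum()
    # Kendall's W formula (no-ties case)
    return 12.0 * S / (K ** 2 * (T ** 3 - T))
```

<p>A value near 1 indicates that neighboring voxels rise and fall together; the study found lower values in the right posterior cerebellum among participants reporting more shyness.</p>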
<p>The analysis first looked for direct correlations between shyness scores and ReHo values across the entire brain. The results pointed to a significant association in one particular area: the right posterior lobe of the cerebellum. Specifically, individuals who reported higher levels of shyness tended to have lower ReHo values in this region. This suggests that greater shyness is associated with less synchronized local neural activity in this part of the cerebellum when the brain is at rest.</p>
<p>No other brain regions showed a significant relationship with shyness in this analysis.</p>
<p>“We initially expected the prefrontal cortex to play a stronger role, given previous findings,” Li told PsyPost. “Instead, the cerebellum showed a clear and specific association with shyness. This was surprising and exciting—it suggests that the cerebellum contributes not only to coordination and timing, but also to the fine-tuning of emotional and social responses.”</p>
<p>The researchers also examined the relationships between the personality measures. They found that shyness scores were strongly and positively correlated with scores on the Behavioral Inhibition System. This aligns with the idea that shy individuals tend to be more sensitive to potential threats and social punishments. In contrast, there was no significant correlation between shyness and the Behavioral Activation System, which relates to reward-seeking.</p>
<p>With these connections established, the team performed a mediation analysis to see if the BIS or BAS could explain the link between cerebellar activity and shyness. This statistical method examines whether one factor helps explain the relationship between two others.</p>
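<p>The basic logic of such a mediation analysis can be sketched with two regressions: one predicting the mediator from the predictor (the “a” path), and one predicting the outcome from both (the “b” and direct “c′” paths), with the indirect effect estimated as a×b. This is an illustrative Baron–Kenny-style decomposition only; the variable roles (ReHo as predictor, BIS as mediator, shyness as outcome) mirror the study’s model, but published analyses typically also report bootstrapped confidence intervals, which are omitted here.</p>

```python
import numpy as np

def ols_slope(y, X):
    """Least-squares coefficients for y ~ intercept + columns of X."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

def simple_mediation(x, m, y):
    """Return (indirect effect a*b, direct effect c') for x -> m -> y."""
    a = ols_slope(m, x[:, None])[1]                          # x -> mediator
    b, c_prime = ols_slope(y, np.column_stack([m, x]))[1:3]  # m, x -> y
    return a * b, c_prime
```

<p>A nonzero indirect effect alongside a shrunken direct effect is the pattern described in the study: BIS sensitivity partially carries the association between cerebellar activity and shyness.</p>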
<p>The analysis revealed that the Behavioral Inhibition System did indeed play a mediating role. The data suggest that lower synchronized activity in the right posterior cerebellum is associated with a more sensitive behavioral inhibition system, which in turn is linked to higher levels of shyness. The BIS appears to function as a partial bridge connecting the neural pattern to the personality trait.</p>
<p>The Behavioral Activation System, on the other hand, did not show any significant mediating effect. This result provides evidence that shyness may be more strongly driven by avoidance and inhibition motivations than by a lack of approach or reward-seeking motivations. The findings refine the motivational conflict model of shyness, pointing to the primary influence of the brain’s threat-detection system.</p>
<p>“Our results show that people who are more shy tend to have lower spontaneous neural activity in a specific part of the cerebellum (the right posterior lobe),” Li explained. “This relationship is partly explained by higher activity in the Behavioral Inhibition System, which makes people more cautious or anxious in social situations.”</p>
<p>“In simpler terms, shyness may not just come from ‘overthinking’ or lack of confidence—it might also reflect how certain brain systems regulate our sensitivity to potential social threat. This understanding can help us view shyness not as a flaw, but as a meaningful difference in how the brain balances safety and connection.”</p>
<p>The study does have some limitations to consider. The sample size was relatively modest and consisted only of university students, which may limit how broadly the findings can be applied to the general population. The study’s cross-sectional design identifies associations between brain activity and personality, but it cannot establish a direct causal relationship. It remains unclear whether the brain patterns contribute to shyness or if experiences related to shyness shape the brain over time.</p>
<p>“It’s important not to interpret these findings as showing that shyness is ‘caused’ by a single brain region,” Li noted. “The cerebellum does not make someone shy by itself. Rather, shyness arises from complex interactions among brain systems, personality, and experience. Our data are correlational, so we can’t infer direct causality—but they point to a promising direction for future longitudinal and experimental research.”</p>
<p>“While the effect sizes in our mediation model are moderate—indicating a partial but meaningful role for the BIS in linking cerebellar activity to shyness—readers should view them as foundational rather than definitive, given our exploratory approach and sample size. Practically, this suggests that targeting the BIS through therapies could have tangible benefits for reducing shyness, though the effects might vary across individuals; it’s not a ‘cure-all’ but a stepping stone toward personalized interventions that could improve social functioning in everyday contexts like work or relationships.”</p>
<p>The researchers also noted that resting-state fMRI captures only one aspect of brain function. Incorporating task-based fMRI, where participants engage in social tasks during the scan, could provide a more complete picture of the neural processes at play. </p>
<p>“We are currently planning to explore how training or modulation of the cerebellum and BIS-related circuits might reduce excessive social inhibition,” Li explained. “For example, neurofeedback and real-time fMRI could be used to help individuals gain more control over their behavioral inhibition responses. We also aim to examine different subtypes of shyness—such as ‘positive shyness’ and ‘fearful shyness’—to see whether they involve distinct neural patterns.”</p>
<p>“I hope this study encourages people to think about shyness with greater compassion. Being shy does not mean being socially deficient—it often reflects a heightened sensitivity to social cues and a desire to interact carefully and meaningfully. Understanding the brain basis of shyness helps us appreciate it as a form of emotional intelligence, rather than simply a barrier to overcome.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.paid.2025.113454" target="_blank">Associations between trait shyness and cerebellar spontaneous neural activity are mediated by behavioral inhibition</a>,” was authored by Liang Li, Yujie Zhang, Benjamin Becker, and Hong Li.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-pinpoint-genetic-markers-that-signal-higher-alzheimers-risk/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists pinpoint genetic markers that signal higher Alzheimer’s risk</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 6th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study has uncovered evidence suggesting that a person’s inherited predisposition for higher levels of the tau protein in their blood is associated with an increased likelihood of developing Alzheimer’s disease or its precursor stage. The findings, which also point to potential differences in risk based on sex and age, were published in the journal <em><a href="https://doi.org/10.1212/WNL.0000000000213904" target="_blank">Neurology</a></em>.</p>
<p>Alzheimer’s disease is a progressive brain disorder that gradually impairs memory and thinking skills. At the molecular level, it is characterized by the accumulation of two key proteins in the brain: amyloid-beta, which forms plaques between nerve cells, and tau, which forms tangles inside them. While tau protein normally helps stabilize the internal skeleton of brain cells, in Alzheimer’s disease it becomes abnormal and aggregates, disrupting cell function and contributing to neurodegeneration.</p>
<p>Because elevated tau levels in the blood can reflect ongoing damage to brain cells, they are considered an important biomarker for the disease. In the new study, led by geneticist <a href="https://scholar.google.gr/citations?user=gW5UcrsAAAAJ&hl=en" target="_blank">Niki Mourtzi</a> and neurology professor <a href="https://scholar.google.com/citations?user=UdGc8RUAAAAJ&hl=en" target="_blank">Nikolaos Scarmeas</a> of the National and Kapodistrian University of Athens Medical School, researchers sought to move beyond measuring current tau levels and instead investigate the underlying genetic factors. </p>
<p>“Early detection of Alzheimer’s disease remains challenging, as most biomarkers require invasive procedures or expensive imaging. We aimed to fill this gap by investigating whether a polygenic risk score for plasma tau, a minimally invasive biomarker, could identify individuals at higher risk for developing Alzheimer’s disease or amnestic mild cognitive impairment,” the researchers told PsyPost.</p>
<p>A polygenic risk score is a tool that estimates an individual’s inherited susceptibility to a specific condition. This single numerical value is calculated by combining the small effects of numerous genetic variants from across a person’s entire genome.</p>
<p>“An important advantage of a polygenic risk score is that it captures inherited genetic variation, allowing us to predict disease risk from birth, decades before amyloid and tau start to accumulate in the brain,” Mourtzi and Scarmeas explained. “Unlike prior studies that focused on cognitive scores, our study evaluated a clinically meaningful outcome over time, providing a more direct link between genetic risk and disease development.”</p>
<p>The investigation was conducted in two main phases, using data from two distinct populations. The first part of the study involved <a href="https://doi.org/10.1159/000362723" target="_blank">the Hellenic Longitudinal Investigation of Aging and Diet (HELIAD)</a>, a community-based study in Greece. The researchers analyzed data from 618 participants, who were 65 years or older and did not have Alzheimer’s or amnestic mild cognitive impairment at the beginning of the study. Amnestic mild cognitive impairment is a condition involving memory loss that is often a precursor to Alzheimer’s disease.</p>
<p>For each participant, the team calculated a polygenic risk score for tau based on 21 genetic variations located near the gene that provides the instructions for making the tau protein. The participants were followed for an average of about three years. During this period, 73 individuals were diagnosed with either Alzheimer’s disease or amnestic mild cognitive impairment.</p>
<p>The analysis provided evidence of an association between the genetic score and disease risk. The results showed that for every one standard deviation increase in the polygenic risk score, there was an associated 29% higher risk of developing one of the cognitive conditions. This relationship appeared to be independent of other known risk factors, including age, sex, education, and the presence of the APOE e4 gene, which is the most well-established genetic risk factor for Alzheimer’s.</p>
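<p>Because the reported effect is per standard deviation, it compounds multiplicatively for people further from the mean, assuming the proportional-hazards model such survival analyses typically use. This tiny sketch shows that arithmetic with the rounded 1.29 figure (an assumption for illustration, not the study’s exact estimate):</p>

```python
def risk_increase(hr_per_sd, n_sd):
    """Percent increase in hazard for a score n_sd standard deviations
    higher, assuming a multiplicative (proportional-hazards) per-SD effect."""
    return (hr_per_sd ** n_sd - 1) * 100

# With a hazard ratio of 1.29 per SD:
# risk_increase(1.29, 1)  ->  ~29% higher risk
# risk_increase(1.29, 2)  ->  ~66% higher risk, not 58%
```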
<p>“Our study is among the first to link a polygenic risk score for plasma tau directly to clinical outcomes rather than cognitive scores,” Mourtzi and Scarmeas said.</p>
<p>When the researchers examined specific subgroups, they observed that the association was not uniform. The link between a higher genetic score and disease risk was stronger in women, who showed a 45% increase in risk for every standard deviation increase in their score. The association also tended to be more pronounced in younger participants (those below the group’s median age of 73), who had an 87% higher risk. In contrast, the associations were not statistically significant for men or for the older participants in the cohort.</p>
<p>“People with a higher genetic predisposition to elevated plasma tau levels face an increased risk of Alzheimer’s disease or its prodromal stage,” Mourtzi and Scarmeas told PsyPost. “In the HELIAD study, those with higher genetic risk had about a 28.5% greater chance of developing Alzheimer’s disease or amnestic mild cognitive impairment. The effect was stronger in women and younger individuals, suggesting that both sex and age influence how genetic risk translates into disease. Early identification of those at higher genetic risk could enable earlier interventions, lifestyle modifications, or monitoring, potentially improving outcomes.”</p>
<p>“We were somewhat surprised by the pronounced sex- and age-specific effects. This may be influenced by sex-specific genetic mechanisms: for example, X-linked genes such as USP11 are more highly expressed in female brains and can promote tau accumulation, while other X chromosome loci like CHST7 may facilitate tau fibril formation and propagation. We also found that genetic risk was more relevant in younger participants, suggesting that inherited tau-related risk is more influential earlier in life before lifestyle, comorbidities, or other environmental factors become dominant.”</p>
<p>To see if these findings were robust, the researchers sought to replicate them in a much larger and more diverse group of people from the UK Biobank. This second part of the analysis included over 142,000 individuals aged 60 and older who were free of dementia at the start of the study. These participants were followed for an average of nearly 13 years, during which 2,737 developed Alzheimer’s disease.</p>
<p>In this large cohort, a higher polygenic risk score for tau was also associated with an increased risk of an Alzheimer’s diagnosis, which supports the initial findings. The effect size was smaller, with a one standard deviation increase in the score corresponding to about a 5% increase in risk. </p>
<p>The subgroup analyses by sex and age did not produce significant results in this larger sample. However, when the researchers created a smaller UK Biobank subsample that was statistically matched to the Greek cohort based on age, sex, and other characteristics, the results were more aligned. In this matched group, a higher score was linked to a 50% increased risk of developing Alzheimer’s.</p>
<p>“Although the individual effect of the tau PRS is modest, it remained consistent across two large, independent cohorts, reinforcing its potential utility,” Mourtzi and Scarmeas said. “When combined with established risk factors such as APOE genotype, age, sex, or genetic risk for other Alzheimer’s-related biomarkers (e.g., amyloid, hippocampal atrophy, white matter hyperintensities), [the score] can help identify people who may be at higher risk for Alzheimer’s disease, potentially years or even decades before symptoms appear.”</p>
<p>It is important to note that polygenic risk scores are predictive tools, not diagnostic certainties. They are based on common genetic variants and do not account for the influence of rare genes, lifestyle choices, or environmental factors, all of which play a part in the development of complex diseases like Alzheimer’s. The score was also developed and tested in populations of European ancestry, meaning its predictive power might not be the same in individuals from other backgrounds.</p>
<p>“It represents only one factor, as other variables like lifestyle, environment, and chance also play a significant role,” the researchers noted. “A high polygenic risk score does not guarantee a person will develop Alzheimer’s disease, and a low polygenic risk score does not exclude the possibility of developing it. However, polygenic risk scores can be seen as an important tool to identify individuals at higher risk and take early preventive actions, such as lifestyle modifications, monitoring, or participation in clinical studies aimed at reducing risk.”</p>
<p>Looking ahead, the research team suggests that this polygenic risk score for tau could be combined with other genetic scores, such as those for amyloid buildup or brain atrophy, to create a more comprehensive risk assessment model. Such a multifaceted approach could improve the ability to stratify individuals by their overall genetic risk, helping to target preventive strategies and guide enrollment in clinical trials for new therapies.</p>
<p>“We aim to integrate tau-related polygenic risk scores with additional genetic and imaging biomarkers to develop comprehensive, multifactorial models for Alzheimer’s disease risk prediction,” Mourtzi and Scarmeas explained. “We have already computed polygenic risk scores for other relevant endophenotypes, including <a href="https://doi.org/10.1002/alz.12980" target="_blank">amyloid deposition</a>, <a href="https://doi.org/10.3390/geriatrics10010014" target="_blank">hippocampal atrophy</a>, and <a href="https://doi.org/10.3390/cimb46010060" target="_blank">white matter hyperintensities</a>, and intend to combine these scores into a single composite measure that captures overall genetic and neuroimaging risk. This integrative approach has the potential to enable early, personalized interventions and to refine risk stratification strategies in both research and clinical settings.”</p>
<p>The study, “<a href="https://doi.org/10.1212/WNL.0000000000213904" target="_blank">Longitudinal Association of a Polygenic Risk Score for Plasma T-Tau With Incident Alzheimer Dementia and Mild Cognitive Impairment</a>,” was authored by Niki Mourtzi, Sokratis Charisis, Eva Ntanasi, Alexandros Hatzimanolis, Alfredo Ramirez, Stefanos N. Sampatakakis, Mary Yannakoulia, Mary H. Kosmidis, Efthimios Dardiotis, George Hadjigeorgiou, Paraskevi Sakka, Eirini Mamalaki, Christopher Papandreou, Marios K. Georgakis, and Nikolaos Scarmeas.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-particular-taste-may-directly-signal-the-brain-to-wake-up/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A particular taste may directly signal the brain to wake up</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 5th 2025, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study reveals that the astringent sensation from flavanols, compounds found in foods like cocoa and berries, can directly stimulate brain activity and enhance memory in mice. The research, published in <em><a href="https://doi.org/10.1016/j.crfs.2025.101195" target="_blank">Current Research in Food Science</a></em>, suggests this sensory signal functions as a mild stressor that triggers a state of heightened alertness and improved cognitive function without the compounds needing to be absorbed into the bloodstream.</p>
<p>Flavanols are a group of natural compounds abundant in plant-based foods such as cocoa, red wine, tea, and various berries. Past research has associated their consumption with several health benefits, including improved memory and cognitive skills. A significant puzzle, however, has been their poor bioavailability, which means only a very small fraction of ingested flavanols actually enters the circulatory system. This left a gap in understanding how these compounds could exert such noticeable effects on the brain.</p>
<p>To investigate this question, a research team led by Yasuyuki Fujii and Professor Naomi Osakabe from the Shibaura Institute of Technology in Japan proposed a different mechanism. They hypothesized that the physical sensation of astringency, the dry, puckering feeling flavanols cause in the mouth, might itself be the trigger. </p>
<p>“We hypothesized that this taste serves as a stimulus, transmitting signals directly to the central nervous system,” explained Fujii, suggesting that sensory nerves could be activating the brain and producing physiological responses.</p>
<p>The researchers conducted a series of experiments on mice to test this idea. In the first part of the study, they orally administered a flavanol solution to one group of mice, while a control group received only distilled water. They then observed the animals’ spontaneous behavior in an open arena. The mice that received flavanols traveled a greater distance, spent more time exploring the center of the arena, and exhibited more behaviors associated with wakefulness, like grooming and rearing, compared to the control group.</p>
<p>To assess the impact on memory, the team used a novel object recognition test. Mice were first allowed to familiarize themselves with two identical objects in an enclosure. Later, one object was replaced with a new, unfamiliar one. The researchers found that mice given flavanols spent substantially more time exploring the new object. This preference for novelty is a standard indicator of recognition memory, suggesting that the flavanol administration had enhanced the mice’s ability to learn and remember.</p>
<p>The team then looked for physiological signs of a stress response, which can be linked to heightened alertness. They collected urine from the mice over a 24-hour period and measured the levels of catecholamines, a class of hormones that includes adrenaline and noradrenaline. The results showed that mice given a higher dose of flavanols had significantly elevated levels of these hormones in their urine. This indicated an activation of the sympathetic nervous system, the network responsible for the body’s “fight-or-flight” response.</p>
<p>Probing deeper into the brain, the scientists examined the hypothalamus, a region central to regulating the body’s stress response. Using a technique that visualizes gene activity in tissue slices, they looked for markers of neural activation. Thirty minutes after the mice received flavanols, there was a significant increase in the activity of a gene that produces corticotropin-releasing hormone, a key initiator of the body’s stress cascade. This finding provided direct evidence that flavanol intake was stimulating a stress-response pathway in the brain.</p>
<p>The most direct evidence for the study’s hypothesis came from a technique called mass spectrometry imaging, which allowed the researchers to create a map of specific chemicals within the brain. </p>
<p>Immediately after the mice consumed the flavanols, the images showed a sharp increase in the neurotransmitter noradrenaline within a small but important brainstem region called the locus coeruleus. This area acts as a primary source of noradrenaline for much of the brain and plays a major role in regulating arousal, attention, and memory. Levels of related chemicals, including dopamine, were also elevated in other brain areas.</p>
<p>In a final experiment, the researchers analyzed gene expression for the enzymes responsible for producing these neurotransmitters. They found that immediately after flavanol administration, the genetic instructions for building noradrenaline and dopamine synthesis machinery were more active in the locus coeruleus. This suggests the brain was not only releasing these chemicals but also ramping up its capacity to produce more of them in response to the astringent stimulus.</p>
<p>Taken together, the results paint a picture of how astringency can affect the brain. The sensation appears to act as a mild, beneficial stressor, similar to physical exercise. This sensory input activates the locus coeruleus, which then bathes the brain in noradrenaline, increasing alertness, sharpening attention, and improving memory consolidation. </p>
<p>“Stress responses elicited by flavanols in this study are similar to those elicited by physical exercise,” remarked Fujii. He added that moderate intake of these compounds, despite their low absorption, “can improve the health and quality of life.”</p>
<p>This study was conducted in mice, and further research is needed to confirm if the same mechanisms apply to humans. The precise way in which the digestive tract senses astringency and transmits that signal to the brain also remains an area for future investigation. </p>
<p>The researchers suggest that specific sensory receptors may be involved in detecting the chemical properties of flavanols, initiating the entire neural cascade. Understanding these sensory-to-brain pathways could open new avenues for developing foods designed to support cognitive health through their sensory properties.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.crfs.2025.101195" target="_blank">Astringent flavanol fires the locus-noradrenergic system, regulating neurobehavior and autonomic nerves</a>,” was authored by Yasuyuki Fujii, Shu Taira, Keisuke Shinoda, Yuki Yamato, Kazuki Sakata, Orie Muta, Yuta Osada, Ashiyu Ono, Toshiya Matsushita, Mizuki Azumi, Hitomi Shikano, Keiko Abe, Vittorio Calabrese, and Naomi Osakabe.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/covid-19-exposure-during-pregnancy-may-increase-childs-autism-risk/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">COVID-19 exposure during pregnancy may increase child’s autism risk</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 5th 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Children whose mothers had COVID-19 during pregnancy appear to have an increased likelihood of being diagnosed with developmental conditions by age three, including speech delays and autism. This new research, published in <em><a href="https://journals.lww.com/greenjournal/abstract/9900/neurodevelopmental_outcomes_of_3_year_old_children.1392.aspx" target="_blank">Obstetrics & Gynecology</a></em>, suggests that maternal COVID-19 infection may influence fetal brain development.</p>
<p>The study provides evidence that exposure to the SARS-CoV-2 virus in the womb may be associated with neurodevelopmental differences in early childhood. This association appears more pronounced in male children and when the infection occurred in the third trimester of pregnancy.</p>
<p>COVID-19 is a respiratory illness caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The pandemic that began in early 2020 raised many questions about the virus’s impact on various aspects of health, including pregnancy and child development. </p>
<p>Previous research on other maternal infections during pregnancy has indicated a potential link to various neurodevelopmental conditions in children. For instance, studies have shown that immune system activation in a pregnant individual can disrupt the developing brain of the fetus and affect offspring behavior in animal models.</p>
<p>Researchers at Mass General Brigham conducted this study to examine whether SARS-CoV-2 infection during pregnancy could be associated with similar outcomes. They had previously observed an elevated risk of neurodevelopmental diagnoses at 12 and 18 months in children exposed to maternal SARS-CoV-2 infection during pregnancy. The current study aimed to determine if these potential effects persisted into early childhood, specifically looking at diagnoses by age three.</p>
<p>The researchers analyzed data from 18,124 live births that occurred within the Mass General Brigham health system between March 2020 and May 2021. This period was selected because it featured universal SARS-CoV-2 testing in labor and delivery units and widespread screening for COVID-19 symptoms during pregnancy, which helped ensure reliable identification of both positive and negative cases. The team linked data from mothers and their children, examining maternal medical history, vaccination status, and sociodemographic information.</p>
<p>The main factor of interest was a positive SARS-CoV-2 PCR test result during pregnancy. The primary outcome the researchers looked for was at least one neurodevelopmental diagnosis within the first three years after birth, identified through specific diagnostic codes from medical records. These codes covered a range of conditions, including disorders of speech and language, motor function, and autism spectrum disorder. They also considered potential influencing factors such as maternal age, race, ethnicity, insurance type, and whether the birth was preterm.</p>
<p>Among the 18,124 live births included in the study, 861 children were exposed to maternal SARS-CoV-2 infection during pregnancy. The researchers found that 140 of these 861 children (16.3%) received a neurodevelopmental diagnosis by age three. </p>
<p>In comparison, among the 17,263 children whose mothers did not have SARS-CoV-2 infection during pregnancy, 1,680 (9.7%) received such a diagnosis. After accounting for other factors that could influence neurodevelopment, maternal SARS-CoV-2 infection during pregnancy was associated with 29% higher odds of a child receiving a neurodevelopmental diagnosis by age three.</p>
<p>“These findings highlight that COVID-19, like many other infections in pregnancy, may pose risks not only to the mother, but to fetal brain development,” said senior author Andrea Edlow, MD MSc, a Maternal-Fetal Medicine specialist in the Department of Obstetrics and Gynecology at Mass General Brigham.</p>
<p>The study also investigated specific patterns within these findings. The association between maternal SARS-CoV-2 infection and neurodevelopmental diagnoses was found to be more pronounced when the infection occurred during the third trimester of pregnancy. Children exposed during the third trimester had a significantly increased risk of a neurodevelopmental diagnosis compared to the unexposed group. However, exposure during the first or second trimesters did not show a statistically significant difference in risk from the unexposed group.</p>
<p>Additionally, the researchers observed a difference in risk between male and female offspring. Third-trimester maternal SARS-CoV-2 infection was significantly associated with an increased risk of neurodevelopmental diagnosis in male children. </p>
<p>The magnitude of risk in female offspring with third-trimester exposure was smaller and did not reach statistical significance in this study. The most frequently identified neurodevelopmental diagnoses included disorders of speech and language, developmental disorder of motor function, autistic disorder, and other specified or unspecified disorders of psychological development.</p>
<p>While reducing risk is important, co-senior author Roy Perlis of the Mass General Brigham Department of Psychiatry noted that the “overall risk of adverse neurodevelopmental outcomes in exposed children likely remains low.”</p>
<p>While these findings suggest an association, there are some aspects to consider. The study relied on medical record diagnoses, which may not capture all neurodevelopmental conditions and could potentially lead to some misclassification. However, such misclassification would likely lead to results that underestimate the true effect. Children who received diagnoses outside the Mass General Brigham health system would not have been included in the dataset. Also, asymptomatic SARS-CoV-2 infections during pregnancy might not have been consistently detected, which could also lead to an underestimation of the effects.</p>
<p>Future research could involve continued follow-up of these children to assess the long-term persistence and clinical impact of these early neurodevelopmental observations. Further studies may also explore the underlying biological mechanisms in greater detail.</p>
<p>First author and Maternal-Fetal Medicine specialist Lydia Shook added: “Parental awareness of the potential for adverse child neurodevelopmental outcomes after COVID-19 in pregnancy is key. By understanding the risks, parents can appropriately advocate for their children to have proper evaluation and support.”</p>
<p>The study, “<a href="https://journals.lww.com/greenjournal/abstract/9900/neurodevelopmental_outcomes_of_3_year_old_children.1392.aspx" target="_blank">Neurodevelopmental Outcomes of 3-Year-Old Children Exposed to Maternal Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) Infection in Utero</a>,” was authored by Lydia L. Shook, Victor Castro, Laura Ibanez-Pintor, Roy H. Perlis, and Andrea G. Edlow.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/life-purpose-linked-to-28-lower-risk-of-cognitive-decline/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Life purpose linked to 28% lower risk of cognitive decline</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 5th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>An analysis of the Health and Retirement Study data found that individuals who had a stronger sense of purpose in life at the start of the study were less likely to develop cognitive impairment during the follow-up period, which was up to 15 years. The paper was published in the <a href="https://doi.org/10.1016/j.jagp.2025.05.009"><em>American Journal of Geriatric Psychiatry</em></a>.</p>
<p>As people age, they may experience a gradual decline in mental abilities such as memory, attention, and processing speed. This decline is a natural part of aging, but the pace and extent of this change vary greatly among individuals. Studies have identified many factors that may accelerate cognitive decline, including stress, sedentary behavior, poor diet, and limited social engagement.</p>
<p>This issue is becoming more important today because people are living longer than ever before. As life expectancy rises, the proportion of older adults in the population grows, leading to more people experiencing age-related cognitive changes. Consequently, the topic of preventing or delaying cognitive decline is attracting significant interest from researchers.</p>
<p>Study author Nicholas C. Howard and his colleagues wanted to explore the association between purpose in life and the risk of developing mild cognitive impairment or dementia. Purpose in life is defined as a person’s tendency to derive meaning from and make sense of life experiences.</p>
<p>Previous studies have already reported that purpose in life is associated with a lower risk of Alzheimer’s disease (a type of dementia) and mild cognitive impairment. However, most of those studies focused on individuals aged 70 years or older or were conducted on smaller groups of participants. The authors of this new study wanted to verify this finding in a large, diverse, US population-based cohort.</p>
<p>They analyzed data from the Health and Retirement Study, focusing on individuals assessed between 2006 and 2020. The Health and Retirement Study is sponsored by the National Institute on Aging and is conducted by the University of Michigan.</p>
<p>These researchers analyzed data from 13,765 individuals who were at least 45 years old and had normal cognitive performance at the start of the study. They used data on participants’ purpose in life taken at baseline, whether they developed cognitive impairment during the follow-up period, APOE genotyping, and various other demographic and psychological characteristics.</p>
<p>APOE genotyping is a genetic test that identifies which version of the apolipoprotein E (APOE) gene a person carries. This was used to assess the risk of Alzheimer’s disease, as it is known that having the APOE E4 variant of this gene is associated with an increased risk, particularly if a person inherits this variant from both parents (i.e., if they are homozygous for this gene variant).</p>
<p>Results showed that 13% of participants developed cognitive impairment during the follow-up period, which lasted up to 15 years (the median was 8 years). Participants with a stronger purpose in life at the start of the study were less likely to develop cognitive impairment compared to their peers with a weaker purpose in life. </p>
<p>After taking into account sex, baseline age, depression, education level, and race/ethnicity, participants with the strongest sense of purpose in life still had an approximately 28% lower risk. The association remained significant even after accounting for whether participants carried the APOE E4 gene variant.</p>
<p>“Higher PiL [purpose in life] was associated with approximately 28% lower risk for developing cognitive impairment and a later onset of cognitive impairment across the studied ethnic/racial groups, even among those with genetic risk for dementia. These findings indicate that fostering a sense of life purpose has the potential to reduce cognitive impairment and dementia risk,” the study authors concluded.</p>
<p>The study contributes to the scientific understanding of factors affecting cognitive decline. However, it should be noted that the design of this study does not allow for definitive causal inferences. While it is possible that a stronger purpose in life protects against cognitive decline, it is also possible that early, undetected brain changes lead to a reduced sense of purpose (reverse causality), or that another unmeasured factor leads to both a stronger purpose in life and protection against cognitive decline.</p>
<p>The paper “<a href="https://doi.org/10.1016/j.jagp.2025.05.009">Life Purpose Lowers Risk for Cognitive Impairment in a United States Population-Based Cohort</a>” was authored by Nicholas C. Howard, Ekaterina S. Gerasimov, Thomas S. Wingo, and Aliza P. Wingo.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/disgust-sensitivity-is-linked-to-a-sexual-double-standard-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Disgust sensitivity is linked to a sexual double standard, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 5th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study provides evidence that negative attitudes toward sexually expressive people may apply to adults of all ages, rather than being a bias specifically aimed at older adults. The research also suggests that an individual’s sensitivity to disgust can influence these judgments differently depending on whether they are evaluating a man or a woman. The findings were published in <em><a href="https://doi.org/10.1080/00224499.2025.2565806" target="_blank">The Journal of Sex Research</a></em>.</p>
<p>The motivation for this research stems from the rapidly aging population in the United States and the need to better understand the sexual and relationship needs of older adults. Previous research has shown that negative attitudes about the sexuality of older people, a phenomenon known as sexual ageism, can be a barrier to their well-being. However, the existing body of research on this topic has produced mixed results.</p>
<p>Some studies indicate that older adults are often stereotyped as asexual or are viewed negatively when they do express sexuality. Other studies suggest that people hold neutral or even positive views. The researchers behind this new work noted that much of the prior research lacked important comparison groups. </p>
<p>Without comparing judgments of older adults to judgments of younger adults, or judgments of sexual behavior to non-sexual behavior, it is difficult to determine if negative reactions are due to a person’s age or simply due to their sexual expression. This study was designed to disentangle these possibilities.</p>
<p>“Older adults frequently report that others treat them as asexual or dismiss their sexuality, and we wanted to design a study that would test whether older adults faced stigma for their sexual expression, but also whether this sexuality-based stigma was present for younger adults as well,” said study author Gabriella Rose Petruzzello, a PhD student at the University of New Brunswick and member of <a href="https://www.sexmeetsrelationships.com/" target="_blank">the Sex Meets Relationships Research Lab</a>.</p>
<p>“Simultaneously, a bunch of research has shown that people who have higher levels of the emotion of disgust tend to report more homophobia, transphobia, and more negative attitudes towards individuals who violate sexual norms. We live in a sex-saturated world, but one where a sizable portion of individuals, knowingly or unknowingly, continue to stigmatize certain forms of sexual expression. This research sought to identify this sexual stigma and determine whether an individual’s likelihood of experiencing disgust was one factor that could predict this stigmatization.”</p>
<p>The research consisted of two separate experiments. In the first study, 303 participants were recruited online and randomly assigned to read one of four different informational flyers. The flyers were designed to introduce a new neighbor, a woman named Elizabeth, who was either 25 or 65 years old. The content of the flyer was also varied; it described Elizabeth as having either a vibrant “romantic life” or a vibrant “sex life” with her husband.</p>
<p>After reading one of the four versions of the flyer, participants rated Elizabeth on several scales. These scales were used to assess their general interpersonal evaluations of her, such as viewing her as good or bad, and their perceptions of her lifestyle, such as viewing it as safe or risky. Participants also completed a questionnaire to measure their own level of disgust sensitivity, a personality trait related to how easily a person feels disgust in response to various situations.</p>
<p>The results of the first study did not show strong evidence for ageism. The 65-year-old woman was not evaluated more negatively or as being riskier than the 25-year-old woman. A clear pattern did emerge regarding sexual expression. The woman described as having a vibrant sex life was rated more negatively and as riskier than the woman described as having a vibrant romantic life, and this was true for both the younger and older targets.</p>
<p>The researchers also found a connection with the participants’ own disgust sensitivity. Individuals who were more easily disgusted tended to judge the sexually open women more harshly, viewing them more negatively and as being riskier. This relationship was not present when they evaluated the women described in romantic terms. This suggests that for women, being openly sexual invites more negative scrutiny from people who are high in disgust sensitivity.</p>
<p>The second study followed a nearly identical design to investigate whether these patterns would hold true for male targets. A new group of 375 participants read one of four flyers introducing a man who was either 25 or 65 years old and was described as having either a vibrant romantic or sex life. Participants then provided the same types of ratings as in the first study.</p>
<p>Consistent with the first study, the results provided evidence for a general sexual stigma. Men described as being sexual were rated more negatively and as riskier than men described as being romantic, regardless of their age. The findings on age were slightly different. The younger men tended to be perceived slightly more negatively and as riskier than the older men, a finding that runs contrary to the concept of ageism against older adults.</p>
<p>A notable difference emerged in the role of disgust sensitivity. When evaluating the male targets, participants higher in disgust sensitivity tended to rate the sexually open men more positively. This is the opposite of the pattern observed in the first study, where higher disgust sensitivity was linked to more negative evaluations of sexually open women. Disgust sensitivity was not found to be related to how risky the men were perceived to be.</p>
<p>“The key takeaways are 1) sexual stigma appears to transcend age and gender related boundaries with both younger and older men and women being rated as more negative and as riskier for their sexual expression compared to their romantic expression,” Petruzzello told PsyPost. “We were surprised by just how present this stigma appeared to be and how this stigma was present across age and gender groups. 2) Disgust sensitivity was not universally related to negative perceptions of sexuality. Individuals higher in disgust sensitivity appear to penalize sexually open women but actually reward sexually open men. This would point to disgust as a factor reinforcing sexual double standard beliefs, whereby women are penalized for their sexuality and men are rewarded.”</p>
<p>“Collectively, this study offers evidence that sexual stigma is alive and well and that individual difference variables like disgust can perpetuate harmful sexual stigmas against some groups (like women) more so than other groups (like men).”</p>
<p>The researchers acknowledge some limitations of their work. The use of an experimental flyer might not perfectly reflect real-world social interactions, as such a direct disclosure about one’s sex life to a new neighbor could be seen as a violation of social norms. The negative reactions might be partly due to this oversharing, rather than simply the sexual content itself.</p>
<p>Additionally, the flyers in the study only featured White individuals. This means the findings may not be generalizable to people of other racial or ethnic backgrounds, where cultural norms and stereotypes around sexuality and aging might be different. Future research could explore how factors like race and sexual orientation intersect with age and sexual expression to shape people’s perceptions. Future studies could also include male and female targets in a single experiment to more directly compare judgments and confirm the patterns observed.</p>
<p>“Our lab wants to continue understanding how feelings of disgust contribute to sexual stigma and can also undermine individuals’ general and sexual well-being,” Petruzzello said. “We’re excited that we were able to add to the body of research showing that disgust sensitivity is related to harmful beliefs about certain groups, which has important implications for future efforts to mitigate the effects of these biases.”</p>
<p>The study, “<a href="https://doi.org/10.1080/00224499.2025.2565806" target="_blank">Sexual Ageism or Sexual Stigma? Sexual Double Standards and Disgust Sensitivity in Judgments of Sexual and Romantic Behavior</a>,” was authored by Gabriella Petruzzello, Lucia F. O’Sullivan, and Randall A. Renstrom.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-review-questions-the-evidence-for-common-depression-treatments/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New review questions the evidence for common depression treatments</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 5th 2025, 13:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new review of depression treatments suggests that the scientific evidence for many common strategies used when a first antidepressant fails is not as strong as widely believed. The findings, which reexamine the influential Sequenced Treatment Alternatives to Relieve Depression (STAR*D) trial, indicate that the benefits observed in that study may stem more from factors like patient expectations than the specific pharmacological action of the medications. The analysis was published in the <em><a href="https://journals.lww.com/psychopharmacology/abstract/2025/11000/what_if_star_d_had_been_placebo_controlled__a.18.aspx" target="_blank" rel="noopener">Journal of Clinical Psychopharmacology</a></em>.</p>
<p>The new review was conducted by a team of researchers led by Kevin P. Kennedy at the Corporal Michael J. Crescenz VA Medical Center. Their work was prompted by the widespread influence of <a href="https://www.sciencedirect.com/science/article/abs/pii/S0197245603001120">the STAR*D trial</a>.</p>
<p>Before STAR*D, most clinical trials for antidepressants studied medications in carefully selected patient groups. These trials often excluded individuals with other medical conditions, co-occurring psychiatric disorders, or chronic depression, meaning the results were not always applicable to the more complex patients seen in everyday clinical practice.</p>
<p>Published in the early 2000s, STAR*D was a large and ambitious study designed to fill this knowledge gap. As a pragmatic trial, it was conducted in real-world primary care and psychiatric clinics and enrolled a diverse group of over 4,000 patients, making it the largest study of its kind.</p>
<p>All participants began treatment with the antidepressant citalopram. Those who did not achieve remission after this first step could proceed through a sequence of up to three additional treatment levels, where they were offered different strategies, such as switching to another antidepressant or augmenting their current medication with a second one.</p>
<p>The findings from STAR*D have shaped depression treatment for years. The study reported that while only about a third of patients recovered after the first treatment, sequential treatment steps offered continued hope. The most widely cited conclusion was that by trying up to four different strategies, a cumulative remission rate of nearly 70% could be achieved among patients who remained in the study.</p>
<p>However, the original STAR*D study was open-label, meaning both patients and their doctors knew which medication was being prescribed, and there was no placebo group for comparison. This design makes it difficult to separate the true effect of a drug from other factors like a patient’s expectations or the natural course of the illness.</p>
<p>Since STAR*D’s publication, many of its treatment steps have been tested in double-blind, placebo-controlled randomized trials, which are considered a more rigorous way to measure effectiveness. Kennedy and his colleagues set out to compare the results from STAR*D with this newer body of evidence.</p>
<p>The researchers performed a detailed review of the scientific literature, searching for meta-analyses and high-quality randomized controlled trials that investigated the specific treatment strategies used in the different stages, or “levels,” of the STAR*D study.</p>
<p>They then systematically compared the findings from these blinded, controlled studies to the outcomes reported in the original STAR*D trial for each corresponding strategy. These strategies included increasing the dose of an initial antidepressant, switching to a different one, and augmenting an antidepressant with a second medication.</p>
<p>The first strategy examined was the practice of increasing an antidepressant dose if a patient does not respond to the initial starting dose. In STAR*D, patients who did not achieve remission on the antidepressant citalopram had their dose systematically increased.</p>
<p>The review by Kennedy and colleagues found that this common practice is not well supported by subsequent controlled trials. Multiple meta-analyses that pooled data from thousands of patients provided evidence that increasing the dose of a selective serotonin reuptake inhibitor (SSRI), the class of drug that includes citalopram, offers no significant benefit over simply continuing the original, standard dose.</p>
<p>These analyses also suggested that higher doses tend to be associated with a greater likelihood of side effects, potentially making the treatment less tolerable for patients without providing additional antidepressant effects. While a couple of analyses identified a very modest benefit at certain higher doses, the overall body of evidence points toward a flat dose-response relationship for SSRIs, meaning that once a standard therapeutic dose is reached, higher doses do not appear to provide a clinically meaningful improvement.</p>
<p>The next strategy evaluated was switching to a different antidepressant after the first one proved ineffective. This was a core component of Levels 2 and 3 in the STAR*D trial.</p>
<p>The review found a similar lack of supporting evidence from blinded trials for this approach. A meta-analysis of studies in which patients were randomly assigned to either switch to a new antidepressant or continue their original one found no advantage for the switching strategy. In these controlled settings, patients who switched medications did not experience greater symptom reduction than those who stayed on their initial medication.</p>
<p>The researchers also examined the strategy of augmentation, which involves adding a second medication to the first antidepressant. In STAR*D’s Level 2, patients could have their citalopram augmented with either bupropion or buspirone. For buspirone, the review found consistent evidence from blinded trials that it performs no better than a placebo when added to an SSRI. This finding stands in contrast to STAR*D, where buspirone augmentation was associated with remission rates nearly identical to bupropion augmentation.</p>
<p>The evidence for bupropion augmentation was more complex but generally did not replicate STAR*D’s positive results. A comprehensive meta-analysis found that when all trials were considered, adding bupropion was not superior to antidepressant monotherapy. While a small subset of trials involving patients who had previously not responded to treatment showed a marginal benefit, these studies had limitations. The larger, higher-quality trials failed to show a clear advantage for the combination treatment.</p>
<p>The review then moved to the augmentation strategies used in STAR*D’s Level 3, which were reserved for patients who had not responded to two previous treatment attempts. These strategies involved adding either T3 thyroid hormone or lithium. For T3, the available evidence from controlled trials is limited, but existing meta-analyses do not suggest that it outperforms a placebo. Studies looking at both T3 augmentation and co-prescribing it with an antidepressant from the start have not found a significant benefit in remission or response rates.</p>
<p>Lithium augmentation, on the other hand, appeared to be one of the few STAR*D strategies with some support from controlled trials. Meta-analyses of placebo-controlled studies have consistently found that adding lithium to an antidepressant is an effective strategy for treatment-resistant depression. However, the researchers noted an important limitation. The evidence base is surprisingly small, and very few of these trials have specifically studied lithium in combination with the modern SSRIs that are most commonly prescribed today.</p>
<p>Finally, the researchers looked at the Level 4 strategy of combining the antidepressants venlafaxine and mirtazapine for highly treatment-resistant patients. A large meta-analysis provides evidence of a benefit for this type of combination therapy compared to monotherapy. This finding seems to support the strategy used in STAR*D.</p>
<p>Yet, the review authors point to significant limitations within that meta-analysis. They note that the positive result appears to be heavily influenced by many small studies, while the five largest and highest-quality trials on the topic were all negative. This suggests the possibility of publication bias, where smaller studies with positive results are more likely to be published than larger studies with negative results. After accounting for this potential bias, the benefit of the combination was reduced to a level that may not be clinically meaningful.</p>
<p>The authors of the review acknowledge several limitations in their own analysis. The patients included in randomized controlled trials are often different from the “real-world” patients in STAR*D, who had more co-occurring medical and psychiatric conditions. It is possible that treatments that fail in controlled trials could still have an effect in a more diverse population.</p>
<p>Additionally, the specific treatment protocols in the controlled trials did not always perfectly match the steps taken in STAR*D, and the review itself was not a formal systematic one, meaning some relevant studies may have been missed.</p>
<p>The findings from this review have several important implications. They suggest that many treatment guidelines, which were shaped by STAR*D, may be based on strategies whose effectiveness is not confirmed by blinded, placebo-controlled evidence. The discrepancy between STAR*D’s outcomes and the results of controlled trials highlights the powerful role of non-pharmacological factors in treating depression. These factors, such as patient expectancy and the therapeutic relationship, may account for much of the improvement seen in open-label settings.</p>
<p>Future research should focus on conducting high-quality, blinded trials for second- and third-step depression treatments to provide clinicians and patients with clearer guidance. The review also suggests that findings from pragmatic trials should be interpreted with caution until they are validated by more rigorous studies.</p>
<p>The study, “<a href="https://doi.org/10.1097/jcp.0000000000002025" target="_blank" rel="noopener">What if STAR*D Had Been Placebo-Controlled? A Critical Reexamination of a Foundational Study in Depression Treatment</a>,” was authored by Kevin P. Kennedy, Jonathan P. Heldt, and David W. Oslin.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/in-shock-discovery-scientists-link-mothers-childhood-trauma-to-specific-molecules-in-her-breast-milk/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">In shock discovery, scientists link mother’s childhood trauma to specific molecules in her breast milk</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 5th 2025, 10:30</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1038/s41398-025-03491-4" target="_blank">Translational Psychiatry</a></em> reports that mothers with a history of adverse childhood experiences tend to have a distinct molecular profile in their breast milk. These differences in specific microRNAs and fatty acids were also associated with aspects of their infants’ temperament in the first year of life.</p>
<p>A growing body of evidence suggests that the health consequences of early life stress can be transmitted across generations. Adverse childhood experiences, such as abuse, neglect, or household dysfunction, can have lasting effects on an individual’s mental and physical health. Research also indicates that children of parents who were exposed to such adversity are at a higher risk for developing their own behavioral and metabolic issues. </p>
<p>Scientists are working to identify the biological pathways through which these effects might be passed down. Breast milk, a complex fluid rich in bioactive compounds that influence infant development, presents a plausible route for this transmission.</p>
<p>“Our main motivation was to examine the relevance of breast milk to the emerging concept of ‘transgenerational trauma’. Our previous work identified a role for sperm epigenetics in potential biological transmission of psychiatric disease susceptibility through the patriline (fathers),” said study author <a href="http://www.jawaidlab.com" target="_blank">Ali Jawaid</a>, principal investigator at the Translational Neuropsychiatry Research Group (TREND Lab) at the Polish Center for Technology Development.</p>
<p>“Breast milk introduces an additional pathway that is relevant for matrilineal (mothers) influences. We wanted to test whether epigenetic signatures of adverse childhood experiences in mothers could be detected in their breast milk, and whether they are associated with early behavioral measures in their infants. This is, indeed, what the study showed.”</p>
<p>For their study, the researchers followed a prospective cohort of 103 mother-child pairs from Wroclaw, Poland. The participants were assessed at birth and again at 5 and 12 months. At the 5-month visit, mothers provided breast milk samples and completed questionnaires about their infant’s temperament.</p>
<p>At the 12-month visit, mothers completed a questionnaire to assess their own history of adverse childhood experiences before the age of 12. This timing was chosen to prevent any stress from the questionnaire from influencing the composition of the milk samples.</p>
<p>The research team analyzed the breast milk for two types of molecules: microRNAs, which are small molecules that help regulate which genes are turned on or off, and fatty acids, which are fundamental components of fats. The mothers were categorized into a “high adversity” group if they had experienced two or more traumatic events in childhood, and a “low adversity” group if they had experienced zero or one. The scientists then compared the molecular composition of the milk between these two groups.</p>
<p>The analysis revealed distinct differences in the milk’s microRNA content. Milk from mothers in the high adversity group showed elevated levels of three specific microRNAs, identified as miR-142-3p, miR-142-5p, and miR-223-3p. </p>
<p>Further analysis indicated a positive correlation, suggesting that as a mother’s number of adverse childhood experiences increased, the levels of these three microRNAs in her breast milk also tended to increase. These associations were present even when accounting for symptoms of postpartum depression, which did not differ between the groups.</p>
<p>“We were surprised that the alterations of microRNAs in milk were not mediated or confounded by postpartum depression,” Jawaid told PsyPost. “One might expect that mothers with more adverse childhood experiences would also have higher postpartum depression, and that this could explain the effects observed in milk. However, this was not the case in our cohort.”</p>
<p>The researchers also identified differences in the fatty acid composition of the breast milk. Specifically, mothers in the high adversity group had lower concentrations of medium-chain fatty acids in their milk compared to mothers in the low adversity group. This finding held true even after the researchers statistically controlled for other factors that could influence fatty acid levels, such as the mother’s dietary fat intake and body mass index.</p>
<p>“Signatures of childhood traumatic experiences can persist biologically for a long time and can be detectable even in body fluids such as breast milk,” Jawaid said. “A next step will be to examine whether enriching experiences or therapy before or during pregnancy can modify these signals.”</p>
<p>The researchers then explored potential links between these molecular signatures in the milk and the infants’ temperament, which was assessed through maternal reports. The results suggest a connection. For example, higher expression of miR-142-5p in breast milk was associated with infants showing more high-intensity pleasure at 12 months. At the same time, lower expression of this microRNA was linked to infants showing more distress when faced with limitations.</p>
<p>Similarly, the levels of medium-chain fatty acids in the milk were associated with certain infant behaviors. Higher concentrations of these fatty acids were correlated with greater “falling reactivity” at 5 months of age, a measure of an infant’s reaction to loss of support or balance. Other types of fatty acids in the milk were also linked to temperamental traits such as activity level and soothability at 12 months.</p>
<p>“Epigenetic signatures in milk were associated with different early temperaments in newborns,” Jawaid told PsyPost. “However, this <em>should NOT</em> be interpreted as breastfeeding being harmful. Breast milk is protective in many ways. We need more work to clarify whether these epigenetic signals in the milk that are impacted by mothers’ childhood adversity are just biomarkers or transmit risk or adaptability and resilience to the next generation.” </p>
<p>The authors note some limitations of their work. The findings are based on correlations, which means the study identifies associations but cannot prove that the changes in breast milk directly cause the differences in infant temperament. The study was also conducted with a specific group of participants from an urban Polish population, so the results may not apply to all populations.</p>
<p>“The study involved 103 mother-child dyads from a highly educated urban Polish cohort, so the implications should be interpreted with nuance,” Jawaid noted. “Still, the findings show that maternal adverse childhood experiences are associated with measurable epigenetic alterations in human breast milk, and that these signatures relate to early infant behavioral profiles.”</p>
<p>The researchers also emphasize that these findings should not be interpreted to discourage breastfeeding, as breast milk provides numerous established benefits for infant health. The study instead points to early life trauma as a public health issue with long-lasting biological consequences, highlighting the need to develop strategies that might mitigate the transmission of risk across generations.</p>
<p>“This study should not be misused to blame mothers or to argue for formula feeding,” Jawaid explained. “Breast milk provides many protective factors, and we cannot say — at this point — that altered epigenetic factors in milk lead to psychiatric disease risk. Importantly, our previous work has also identified trauma-related epigenetic changes in sperm. The biological and psychosocial contributions of both mothers and fathers matter. Trauma is the problem, not mothers or fathers.”</p>
<p>For future research, the scientists are planning studies with animal models to better understand the potential causal links between milk components and offspring outcomes. They are also continuing to follow the children from this study as they grow older to see how these early associations relate to later health and behavior. </p>
<p>“Our long-term goal is to identify biomarkers and mechanisms of intergenerational transmission of psychiatric disease risk in humans,” Jawaid said. “We are following this cohort longitudinally, and are studying parallel cohorts in Bosnia, Pakistan, and Rwanda. Ultimately, we hope to develop biomarkers, guidelines and mitigation strategies to prevent the transmission of psychiatric risk across generations.”</p>
<p>The study, “<a href="https://doi.org/10.1038/s41398-025-03491-4" target="_blank">Differential microRNAs and metabolites in the breast milk of mothers with adverse childhood experiences</a>,” was authored by Weronika Tomaszewska, Anna Apanasewicz, Magdalena Gomółka, Maja Matyas, Patrycja Rojek, Marek Szołtysik, Magdalena Babiszewska-Aksamit, Bartlomiej Gielniewski, Bartosz Wojtas, Anna Ziomkiewicz and Ali Jawaid.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/neuroscientists-just-discovered-a-hidden-drainage-system-in-the-human-brain/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Neuroscientists just discovered a hidden drainage system in the human brain</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 5th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A team of researchers has identified a previously unrecognized hub for waste clearance in the human brain, located along a major artery supplying the brain’s protective outer layer. This discovery, published in <em><a href="https://doi.org/10.1016/j.isci.2025.113693" target="_blank">iScience</a></em>, provides an updated map of the brain’s lymphatic drainage system and offers a new framework for understanding how the brain maintains its health.</p>
<p>The brain is encased in a set of protective membranes called the meninges. For a long time, these membranes were thought to isolate the brain from the rest of the body’s immune system. More recent research has overturned this idea by identifying a network of lymphatic vessels within the meninges that drain waste-filled fluid away from the brain. Understanding this drainage system is essential for developing new approaches to neurological and psychiatric conditions.</p>
<p>A research team led by Onder Albayram, an associate professor at the Medical University of South Carolina, has been working to map this intricate network in living humans. Their work aims to establish a clear picture of how these structures function in a healthy state. This knowledge can serve as a baseline to identify changes that occur in conditions such as Alzheimer’s disease, brain injury, or as a consequence of aging.</p>
<p>To investigate these drainage pathways, the researchers conducted a study on five healthy participants. The team used a dynamic contrast-enhanced magnetic resonance imaging (MRI) technique, originally developed through a partnership with NASA to study fluid shifts in astronauts. Each participant received an injection of a contrast agent, and their brains were scanned at several intervals over a six-hour period, allowing the scientists to track the movement of fluid.</p>
<p>The researchers focused on the middle meningeal artery, a significant vessel in the dura mater, which is the outermost layer of the meninges. They measured the signal intensity of the contrast agent both inside the artery and in the tissue immediately surrounding it. The signal inside the artery peaked quickly and then faded, which is the expected pattern for blood flow as the agent is cleared from circulation.</p>
<p>In the tissue surrounding the artery, however, a different pattern emerged. The signal from the contrast agent increased slowly, reaching its peak 90 minutes after injection before gradually declining. This delayed clearance suggests a much slower fluid movement, inconsistent with the rapid dynamics of blood circulation. Albayram explained the observation, stating, “We saw a flow pattern that didn’t behave like blood moving through an artery; it was slower, more like drainage, showing that this vessel is part of the brain’s cleanup system.”</p>
<p>To verify that this slow-moving fluid was flowing through genuine lymphatic vessels, the team needed to examine the anatomy of the region at a microscopic level. They obtained postmortem human dural tissue and used two different advanced imaging techniques to map its cellular architecture. One method, immunofluorescence confocal microscopy, uses fluorescent tags to light up specific proteins. The other, Hyperion Imaging Mass Cytometry, uses metal tags to achieve highly detailed, multi-layered images of different cell types simultaneously.</p>
<figure aria-describedby="caption-attachment-229532" class="wp-caption aligncenter"><img fetchpriority="high" decoding="async" src="https://www.psypost.org/wp-content/uploads/2025/11/fx1_lrg.jpg" alt="" width="996" height="996" class="size-full wp-image-229532" srcset="https://www.psypost.org/wp-content/uploads/2025/11/fx1_lrg.jpg 996w, https://www.psypost.org/wp-content/uploads/2025/11/fx1_lrg-300x300.jpg 300w, https://www.psypost.org/wp-content/uploads/2025/11/fx1_lrg-768x768.jpg 768w, https://www.psypost.org/wp-content/uploads/2025/11/fx1_lrg-75x75.jpg 75w, https://www.psypost.org/wp-content/uploads/2025/11/fx1_lrg-350x350.jpg 350w, https://www.psypost.org/wp-content/uploads/2025/11/fx1_lrg-750x750.jpg 750w" sizes="(max-width: 996px) 100vw, 996px"><figcaption class="wp-caption-text">[Graphical abstract via Cell Press]</figcaption></figure>
<p>These high-resolution analyses confirmed the presence of a dense and organized network of lymphatic vessels in the dura surrounding the middle meningeal artery. The vessels were identified by the presence of key proteins that act as markers for lymphatic cells. This anatomical evidence provided a physical basis for the slow drainage patterns observed in the MRI scans, linking the functional imaging data directly to cellular structures.</p>
<p>The researchers also noted that the structure and organization of these lymphatic vessels varied across different layers of the dura. This complexity suggests a sophisticated and compartmentalized system for fluid management. The findings expand the known map of the brain’s lymphatic drainage system, adding a key ventral, or bottom-facing, pathway to the previously documented dorsal pathways along the top of the brain.</p>
<p>It is important to note certain limitations of the research. The MRI portion of the study involved a small group of five individuals, and the histological analysis was conducted on tissue from a single donor. The imaging data provide indirect evidence of fluid movement based on the behavior of a contrast agent, rather than a direct measurement of flow. Future studies with larger and more diverse groups of participants will be needed to confirm and expand upon these findings.</p>
<p>The work establishes a new reference point for understanding the brain’s normal function. By charting how the healthy brain clears waste, researchers can better identify when and how this process becomes impaired. Albayram’s team is already applying these insights to study patients with neurodegenerative diseases, hoping to find new ways to diagnose and treat these conditions.</p>
<p>“A major challenge in brain research is that we still don’t fully understand how a healthy brain functions and ages,” said Albayram. “Once we understand what ‘normal’ looks like, we can recognize early signs of disease and design better treatments.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.isci.2025.113693" target="_blank">Meningeal lymphatic architecture and drainage dynamics surrounding the human middle meningeal artery</a>,” was authored by Mehmet Albayram, Sutton B. Richmond, Kaan Yagmurlu, Ibrahim S. Tuna, Eda Karakaya, Hiranmayi Ravichandran, Fatih Tufan, Emal Lesha, Melike Mut, Filiz Bunyak, Yashar S. Kalani, Adviye Ergul, Rachael D. Seidler, and Onder Albayram.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><s><small><a href='https://blogtrottr.com/unsubscribe/565/DY9DKf'>unsubscribe from this feed</a></small></s></p>