<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/bright-light-therapy-linked-to-mood-improvements-and-brain-connectivity-changes/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Bright light therapy linked to mood improvements and brain connectivity changes</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 24th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new neuroimaging study from China has found that an eight-week course of bright light therapy helped reduce depressive symptoms in individuals with subthreshold depression. The treatment also altered dynamic functional connectivity in several brain regions associated with mood regulation. The study was published in the <a href="https://www.sciencedirect.com/science/article/abs/pii/S0165032725000473"><em>Journal of Affective Disorders</em></a>.</p>
<p>Subthreshold depression refers to the presence of depressive symptoms that are clinically relevant but do not meet the full diagnostic criteria for major depressive disorder. Individuals with subthreshold depression may experience persistent sadness, fatigue, sleep disturbances, or concentration problems, but with fewer symptoms or a shorter duration than required for a formal diagnosis.</p>
<p>Despite being “subthreshold,” the condition can impair daily functioning and reduce quality of life. It is also linked to an increased risk of developing major depression in the future. Subthreshold depression is common—especially among adolescents, older adults, and individuals with chronic illnesses—and it often goes undiagnosed and untreated because the symptoms are perceived as mild or situational. However, research shows that even mild depressive symptoms can negatively affect social relationships, job performance, and physical health.</p>
<p>Study author Guixian Tang and his colleagues aimed to examine the effects of bright light therapy on both depressive symptoms and brain function in individuals with subthreshold depression. Bright light therapy involves exposure to artificial light that mimics natural sunlight and is commonly used to regulate circadian rhythms and improve mood. While it is best known as a treatment for seasonal affective disorder, it has also shown benefits for non-seasonal depression, sleep disorders, and circadian rhythm disruptions. The authors hypothesized that it might also be effective for subthreshold depression.</p>
<p>The study involved 95 students from Jinan University in China. Participants were between 18 and 28 years old, ethnically Han Chinese, and right-handed. They were included in the study if they had a mild, nonseasonal depressive symptom profile, with scores meeting specific thresholds on two established depression measures.</p>
<p>Participants were randomly assigned to one of two groups. One group received bright light therapy: daily 30-minute exposure to a 5000 lux light box before noon for eight weeks. The placebo group used an identical-looking device that emitted only a dim light (less than 5 lux).</p>
<p>Before and after the treatment period, participants completed standardized assessments of depression (the Hamilton Depression Rating Scale and the Center for Epidemiologic Studies Depression Scale) and anxiety (the Hamilton Anxiety Scale). They also underwent resting-state functional magnetic resonance imaging (rs-fMRI) scans, with a focus on dynamic functional connectivity in the cingulate cortex—a brain region involved in emotional regulation.</p>
<p>The results showed that participants who received bright light therapy had significantly greater reductions in depressive symptoms than those in the placebo group. Neuroimaging data revealed that bright light therapy led to increased dynamic functional connectivity between:</p>
<ul>
<li>The right supracallosal anterior cingulate cortex and the right temporal pole,</li>
<li>The left middle cingulate cortex and the right insula,</li>
<li>The left supracallosal anterior cingulate cortex and the pons.</li>
</ul>
<p>Conversely, the therapy decreased dynamic connectivity between the right supracallosal anterior cingulate cortex and the right middle frontal gyrus.</p>
<p>Importantly, increases in connectivity between the right supracallosal anterior cingulate cortex and the right temporal pole were positively associated with reductions in depressive symptoms, suggesting a link between brain network changes and mood improvement.</p>
<p>“BLT [bright light therapy] alleviates depressive symptoms and changes the CC dFC [cingulate cortex dynamic functional connectivity] variability in StD [subthreshold depression], and pre-treatment dFC variability of the CC could be used as a biomarker for improved BLT treatment in StD. Furthermore, dFC changes with specific neurotransmitter systems after BLT may underline the antidepressant mechanisms of BLT,” the study authors concluded.</p>
<p>The study sheds light on the effects of bright light therapy on subthreshold depression. However, it was conducted on a small sample of young university students, so results in larger samples, or in samples that include older adults, might differ.</p>
<p>The paper “<a href="https://doi.org/10.1016/j.jad.2025.01.035">Effects of bright light therapy on cingulate cortex dynamic functional connectivity and neurotransmitter activity in young adults with subthreshold depression</a>” was authored by Guixian Tang, Pan Chen, Guanmao Chen, Zibin Yang, Wenhao Ma, Hong Yan, Ting Su, Yuan Zhang, Shu Zhang, Zhangzhang Qi, Wenjie Fang, Lijun Jiang, Qian Tao, and Ying Wang.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/people-with-dark-triad-traits-gain-others-trust-through-facial-attractiveness/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">People with Dark Triad traits gain others’ trust through facial attractiveness</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 24th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Despite being associated with manipulation and self-interest, individuals who score high on the so-called Dark Triad personality traits may have a surprising social advantage: people tend to trust them more, based solely on their facial appearance. A new study published in <em><a href="https://www.sciencedirect.com/science/article/abs/pii/S019188692500176X" target="_blank" rel="noopener">Personality and Individual Differences</a></em> suggests that in short-term interactions, individuals high in these traits are perceived as more trustworthy and are more likely to be trusted with resources, even by complete strangers.</p>
<p>The Dark Triad refers to a group of three personality traits—narcissism, Machiavellianism, and psychopathy—that share a common core of manipulation, callousness, and self-centeredness. Narcissism is characterized by an inflated sense of self-importance and a strong desire for admiration. Machiavellianism involves a strategic, manipulative approach to interpersonal relationships, often with a focus on self-interest and deception. Psychopathy includes traits such as impulsivity, a lack of empathy or remorse, and antisocial behavior.</p>
<p>While much previous research has focused on the harmful consequences of these traits—such as dishonesty, aggression, or emotional manipulation—some studies have suggested that individuals high in Dark Triad traits may also possess social advantages, particularly in competitive or high-stakes environments. These individuals often excel at impression management, exhibit high levels of confidence, and can be highly persuasive. This has led some researchers to explore whether these traits could be adaptive in certain social situations, helping individuals gain influence or trust, at least temporarily.</p>
<p>“Our research was inspired by numerous stories and real-life figures who seem to possess an uncanny ability to gain others’ trust despite having questionable intentions,” said Qi Wu, an associate professor in the Department of Psychology, School of Educational Science at Hunan Normal University and the corresponding author of the study.</p>
<p>“Contemporary examples, such as the current U.S. President Donald Trump, illustrate how individuals with certain personality characteristics can effectively navigate social dynamics to their advantage. This paradox between negative personality traits and social success in competitive environments prompted us to investigate the underlying mechanisms scientifically.”</p>
<p>The researchers conducted four separate studies to investigate whether individuals with high levels of Dark Triad traits are perceived as more trustworthy based solely on their facial appearance—and whether that perception translates into actual trusting behavior.</p>
<p>In the first study, the researchers aimed to test whether people perceive individuals with high Dark Triad traits as more trustworthy based on facial appearance alone. To begin, they recruited 577 adult Chinese volunteers, who completed a standardized personality assessment measuring the Dark Triad traits. Based on these scores, the researchers selected 40 individuals—20 with the highest and 20 with the lowest Dark Triad scores (balanced across gender)—to serve as facial stimuli. Each individual had a neutral facial photo taken under controlled conditions, with no makeup, accessories, or facial expressions. These images were edited to remove background elements and converted to grayscale to ensure consistency.</p>
<p>The study’s main sample consisted of 156 adult Chinese participants, who completed a trustworthiness rating task online. Each participant viewed the 40 facial images—only once—and rated how trustworthy each person appeared on a scale from 1 (not at all trustworthy) to 7 (extremely trustworthy). The images were presented in a random order, and participants were unaware of the individuals’ actual personality scores.</p>
<p>The results showed that participants rated individuals with high Dark Triad traits as more trustworthy than those with low traits. This was surprising, given that Dark Triad traits are generally linked to manipulation and deceit. The finding suggested that people might be misled by facial cues in short-term interactions, forming inaccurate judgments about a person’s character.</p>
<p>While Study 1 measured trustworthiness perceptions, the second study investigated whether those perceptions translated into actual behavior. The researchers used a classic trust game to simulate short-term cooperation. In this game, participants were given a hypothetical amount of money (100 CNY) and had to decide how much to invest in a partner, based solely on that partner’s facial photograph. Whatever amount they chose to invest would be tripled, and then the partner (in reality, hypothetical) could return any portion of it—or keep it all. The amount participants chose to invest served as a measure of their trust.</p>
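The payoff structure of this trust game can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the function name and the partner's return fraction are hypothetical (partners in the study were simulated and returned nothing observable to participants).

```python
def trust_game_payoffs(endowment=100, invested=40, return_fraction=0.5):
    """Payoffs for one round of the trust game described above.

    The investor sends `invested` CNY out of a 100 CNY endowment;
    the amount is tripled in transit, and the partner may return some
    fraction of the tripled sum (the fraction here is hypothetical).
    """
    tripled = invested * 3
    returned = tripled * return_fraction
    investor_total = (endowment - invested) + returned
    partner_total = tripled - returned
    return investor_total, partner_total

# Investing 40 of 100 CNY: 40 is tripled to 120; a partner returning half
# leaves the investor with 60 + 60 = 120 and the partner with 60.
print(trust_game_payoffs(100, 40, 0.5))  # → (120.0, 60.0)
```

The amount invested is what the researchers measured as trust; the payoff arithmetic simply makes clear why investing is risky (the partner could keep everything) yet potentially profitable.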
<p>The study involved 149 adult Chinese participants. As in Study 1, each participant viewed the same 40 faces (20 high Dark Triad, 20 low Dark Triad) and made an investment decision for each one, with faces presented in random order. Participants received no feedback after each round and were told that one interaction would be randomly selected to determine their final payout, increasing the stakes of each decision.</p>
<p>The results mirrored those of Study 1. Participants invested more money in individuals with high Dark Triad traits, despite knowing nothing about them other than their face. This suggested that the influence of facial features extended beyond perception and into actual trusting behavior.</p>
<p>In the third study, the researchers sought to understand why faces associated with high Dark Triad traits were perceived as more trustworthy. They focused on three possible mediating traits: facial attractiveness, perceived dominance, and perceived extraversion. The sample included 204 adult Chinese participants, none of whom had taken part in the previous studies.</p>
<p>Participants completed two tasks. First, they repeated the trustworthiness rating task from Study 1. Then, in a separate session, they rated the same 40 facial images on three dimensions—attractiveness, dominance, and extraversion—on a 7-point scale. Each face was rated once for each trait, with the order randomized. The researchers then used mediation analysis to test whether these perceived traits explained the relationship between the targets’ actual Dark Triad scores and how trustworthy their faces were judged to be.</p>
<p>They found that people with high Dark Triad traits were perceived as more attractive and more dominant, but not more extraverted. However, only perceived attractiveness mediated the link between Dark Triad traits and trustworthiness ratings. In other words, people were more likely to see high Dark Triad individuals as trustworthy largely because they also saw them as more attractive. Even after accounting for attractiveness, dominance, and extraversion, the facial features of high Dark Triad individuals still had a direct effect on how trustworthy they were perceived to be.</p>
<p>The final study combined the approaches of Studies 2 and 3 to see whether facial attractiveness, dominance, or extraversion mediated the relationship between Dark Triad traits and trust behavior in the trust game. This study involved 158 adult Chinese participants, who completed both the trust game and the facial evaluation task.</p>
<p>First, participants played the trust game, just as in Study 2. Then, they rated the same 40 faces on the three dimensions of attractiveness, dominance, and extraversion. The researchers performed another mediation analysis to see whether these perceived traits explained the investment behavior observed in the game.</p>
<p>Again, the results supported the role of attractiveness. Participants invested more in individuals with high Dark Triad traits, and this was partly due to higher ratings of facial attractiveness. Dominance and extraversion did not significantly affect investment behavior. Importantly, the researchers found that even when controlling for these traits, high Dark Triad individuals still received more trust, suggesting a direct influence of facial cues related to the Dark Triad.</p>
<p>“What surprised us most was that perceived dominance and extraversion did not mediate the relationship between Dark Triad traits and trust, while facial attractiveness did play a mediating role,” Wu told PsyPost. “Even more intriguing was our discovery of a direct pathway from Dark Triad facial features to trust judgments, independent of these other perceived traits. This suggests that Dark Triad personality characteristics and trustworthy-appearing facial features may have co-evolved as complementary strategies for social manipulation and resource acquisition.”</p>
<p>Taken together, the four studies suggest that individuals with high levels of socially exploitative personality traits may possess a distinct social advantage — particularly in short-term settings where trust must be established quickly. Their facial features are not only perceived as more attractive but also elicit greater levels of trust, which could help explain their success in gaining resources or influence, despite their tendency toward manipulative behavior.</p>
<p>The researchers interpret these findings through an evolutionary lens. In short-term cooperative interactions, such as one-time exchanges, individuals who appear trustworthy and attractive can benefit quickly before any deceit or selfishness is discovered. This could mean that individuals with Dark Triad traits have evolved — or learned — to present themselves in ways that maximize short-term gain. In contrast, in long-term relationships, where trust is built over time and deception is harder to maintain, the disadvantages of these traits may become more apparent.</p>
<p>“The most important takeaway is that people can be deceived by facial appearance alone, even in brief encounters,” Wu explained. “Our research reveals that individuals with high Dark Triad traits are perceived as more trustworthy based solely on their facial features, and this perception translates into actual trusting behaviors. This suggests that our intuitive judgments about trustworthiness from faces may not always be accurate, particularly in short-term interactions where these individuals may have evolutionary advantages.”</p>
<p>But as with all research, there are limitations. “We conducted this research mainly within a Chinese cultural context,” Wu noted. “If this trust effect is indeed a result of Dark Triad evolutionary adaptation, it should be observable across broader cross-cultural settings to confirm its universality. Additionally, we used static facial photographs, whereas real-world interactions involve dynamic expressions, gestures, and behaviors that might produce different effects. Future research should examine these phenomena in more naturalistic settings with dynamic stimuli.”</p>
<p>“We aim to extend this research cross-culturally to test the universality of these effects and explore how they manifest in long-term versus short-term cooperative contexts. We’re particularly interested in investigating the specific facial features that contribute to this phenomenon and examining how each component of the Dark Triad individually influences trust perception and behavior.”</p>
<p>Despite these caveats, the study reveals how certain personality traits — even those commonly viewed as antisocial — can be socially advantageous under the right conditions.</p>
<p>“This research highlights the sophisticated nature of human social perception and the potential evolutionary arms race between deception and detection mechanisms,” Wu said. “While these findings may seem concerning, understanding these dynamics can help people make more informed judgments about trust and cooperation in both personal and professional contexts.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.paid.2025.113214" target="_blank" rel="noopener">Trust in Darkness: Individuals with high dark triad traits gain others’ trust through facial attractiveness and other associated facial features</a>,” was authored by Siwei Zhang, Qi Wu, Jia Liu, Kejian Peng, Yu Liang, and Huiying Li.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/people-who-eat-more-ultra-processed-foods-show-more-early-signs-of-parkinsons-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">People who eat more ultra-processed foods show more early signs of Parkinson’s, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 24th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study has found that people who consume high amounts of ultra-processed foods are more likely to exhibit early symptoms associated with Parkinson’s disease than those who consume less. The findings, published in <em><a href="https://www.neurology.org/doi/10.1212/WNL.0000000000213562" target="_blank">Neurology</a></em>, suggest that dietary habits may be linked to nonmotor symptoms that often occur years before Parkinson’s disease is diagnosed. However, the researchers emphasized that their study does not prove cause and effect.</p>
<p>Parkinson’s disease is a neurodegenerative disorder that progresses slowly, often beginning with subtle, nonmotor symptoms such as sleep disturbances, constipation, and mood changes before movement-related symptoms appear. This early phase, known as the prodromal phase, can last for over a decade. Since early interventions could potentially slow or delay the onset of Parkinson’s disease, researchers have been increasingly interested in identifying risk factors that arise long before diagnosis. Diet is one such candidate. </p>
<p>Previous research has shown that high consumption of ultra-processed foods is associated with an increased risk of dementia, prompting scientists to explore whether similar patterns exist for Parkinson’s disease.</p>
<p>“We previously showed that people with poor diet quality had a high future risk of Parkinson’s disease, suggesting the important role of diet in the development of the condition. Recently, ultra-processed food, which are strongly associated with poor diet quality, have been shown to be associated with dementia, another major neurodegenerative disease. However the relation between ultra-processed food and Parkinson’s disease remained unknown,” said study author <a href="https://sph.fudan.edu.cn/employee/158" target="_blank">Xiang Gao</a> of the Institute of Nutrition at Fudan University.</p>
<p>To investigate this, the researchers analyzed data from two large, long-running studies: the Nurses’ Health Study and the Health Professionals Follow-Up Study. These studies have tracked health and lifestyle factors in tens of thousands of U.S. adults over several decades. For the current research, the team focused on a sample of 42,853 participants (about 59% women, with an average age of 48 years). They excluded individuals who had already been diagnosed with Parkinson’s disease, were over the age of 85, or had missing or implausible dietary data.</p>
<p>Dietary habits were assessed using food frequency questionnaires collected every two to four years. Participants reported how often they ate various foods, which were then categorized using the NOVA classification system. This system groups foods into categories based on how much processing they undergo, with ultra-processed foods including items like sugary cereals, packaged snacks, sweetened beverages, and processed meats. The researchers calculated average long-term intake of ultra-processed foods and examined how it related to a set of seven early indicators of Parkinson’s disease.</p>
<p>These early symptoms, or prodromal features, included probable REM sleep behavior disorder (where people physically act out their dreams), constipation, reduced sense of smell, color vision impairment, daytime sleepiness, unexplained body pain, and depressive symptoms. Each symptom counted as one point, and the researchers compared participants with no symptoms to those with one, two, or three or more.</p>
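The scoring scheme described above amounts to a simple sum over binary features. A minimal sketch, with feature names and example data that are hypothetical (the study's actual assessments were more involved):

```python
# The seven prodromal features, each scored as present (1) or absent (0).
PRODROMAL_FEATURES = [
    "probable_rbd",             # probable REM sleep behavior disorder
    "constipation",
    "hyposmia",                 # reduced sense of smell
    "color_vision_impairment",
    "daytime_sleepiness",
    "body_pain",
    "depressive_symptoms",
]

def prodromal_count(participant):
    """Number of prodromal features present (0-7) for one participant."""
    return sum(int(bool(participant.get(f, False))) for f in PRODROMAL_FEATURES)

def prodromal_group(count):
    """Participants were compared across groups of 0, 1, 2, or 3+ features."""
    return "3+" if count >= 3 else str(count)

# A hypothetical participant with constipation, body pain, and depression:
p = {"constipation": 1, "body_pain": 1, "depressive_symptoms": 1}
print(prodromal_count(p), prodromal_group(prodromal_count(p)))  # → 3 3+
```

Grouping counts this way is what lets the researchers compare, for example, the odds of showing three or more early signs across levels of ultra-processed food intake.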
<p>The results showed a clear pattern: people who consumed the most ultra-processed foods were significantly more likely to show multiple prodromal symptoms. Compared to those who ate the least, individuals in the highest fifth of ultra-processed food consumption were about 2.5 times more likely to exhibit three or more early signs of Parkinson’s disease. This association remained even after adjusting for factors like age, physical activity, smoking, caffeine, alcohol intake, overall diet quality, and total calorie intake.</p>
<p>When the researchers looked at individual symptoms, they found that high ultra-processed food consumption was linked to greater odds of having sleep behavior disorder, constipation, body pain, and depressive symptoms. However, there was less consistent evidence for an association with reduced sense of smell, color vision issues, or daytime sleepiness.</p>
<p>To better understand whether these associations could be driven by specific types of foods, the team also analyzed subcategories of ultra-processed foods. They found that packaged sweet snacks, processed meats, artificially sweetened drinks, dairy-based desserts, and sauces or condiments were particularly associated with the presence of prodromal symptoms.</p>
<p>Importantly, the study was designed to reduce the possibility that people already experiencing early symptoms had changed their diets in response. To do this, the researchers looked only at dietary data collected at least six years before the earliest assessment of prodromal symptoms. They also conducted additional analyses using dietary information from as far back as 1986 and found similar associations.</p>
<p>“Healthy dietary patterns, high in fruits and vegetables and low in ultra-processed foods, could be beneficial against risk of Parkinson’s disease, an incurable neurodegenerative disease,” Gao told PsyPost.</p>
<p>But the researchers cautioned that their findings cannot prove that ultra-processed foods directly contribute to the development of Parkinson’s disease. Because this was an observational study, it’s possible that other unmeasured factors—such as environmental exposures or genetic risk—could explain the association.</p>
<p>There are also limitations in how diet and symptoms were measured. Diet was self-reported, which can introduce errors or biases. The categorization of foods as “ultra-processed” can be imprecise, especially for items that contain both processed and unprocessed ingredients. In addition, not all participants completed assessments for every prodromal feature, and the sample was mostly composed of White health professionals, which may limit how widely the results apply.</p>
<p>Nevertheless, the authors suggest that their findings are consistent with other research linking poor diet quality to neurological decline. They also highlight several possible biological mechanisms that could explain the relationship between ultra-processed foods and brain health. These foods often contain additives, artificial sweeteners, and packaging-related chemicals that have been shown in laboratory studies to increase inflammation, oxidative stress, and damage to dopamine-producing neurons—the same type of brain cells that are lost in Parkinson’s disease.</p>
<p>Another potential explanation involves the gut. Ultra-processed foods are typically low in dietary fiber and can disrupt the gut microbiome, which plays a role in inflammation and may be involved in Parkinson’s-related processes. Some researchers believe that Parkinson’s disease might begin in the gut and travel to the brain via the vagus nerve, a hypothesis supported by emerging evidence.</p>
<p>While more research is needed, especially interventional studies that test whether reducing ultra-processed food intake can prevent or delay Parkinson’s disease, the current findings provide one more reason to be mindful of diet. In particular, focusing on whole foods—such as fruits, vegetables, legumes, and unprocessed grains—might not only benefit general health but also reduce the risk of neurological decline later in life.</p>
<p>The study, “<a href="https://doi.org/10.1212/WNL.0000000000213562" target="_blank">Long-Term Consumption of Ultraprocessed Foods and Prodromal Features of Parkinson Disease</a>,” was authored by Peilu Wang, Xiao Chen, Muzi Na, Mario H. Flores-Torres, Kjetil Bjornevik, Xuehong Zhang, Xiqun Chen, Neha Khandpur, Sinara Laurini Rossato, Fang Fang Zhang, Alberto Ascherio, and Xiang Gao.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/mindfulness-may-be-a-window-into-brain-health-in-early-alzheimers-risk/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Mindfulness may be a window into brain health in early Alzheimer’s risk</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 23rd 2025, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in the journal <a href="https://link.springer.com/article/10.1007/s12671-024-02500-9"><em>Mindfulness </em></a>has found that older adults with early signs of cognitive impairment show reduced levels of dispositional mindfulness, and that this decline may be linked to changes in brain connectivity. Specifically, the researchers identified weaker functional connections in the ventromedial prefrontal cortex—a brain region involved in emotional regulation—as a predictor of lower mindfulness in individuals with mild cognitive impairment.</p>
<p>The study was conducted to better understand how mindfulness traits change in the early stages of cognitive decline, before a diagnosis of dementia is made. The researchers focused on two groups: individuals with subjective cognitive decline, who report memory issues despite normal test performance, and those with mild cognitive impairment, who show measurable deficits but still maintain independence in daily life. Both groups are at increased risk of developing Alzheimer’s disease, and identifying early behavioral or neurological changes could aid in prevention and intervention efforts.</p>
<p>Mindfulness, often cultivated through practices like meditation, is also considered a stable personality trait. People naturally vary in their ability to remain present and aware of their thoughts and feelings. This capacity, known as dispositional mindfulness, has been linked to improved attention, emotional regulation, and brain health in prior studies involving healthy individuals. However, little research has explored how this trait may change in people experiencing early cognitive decline, or how it relates to brain function and structure.</p>
<p>To investigate these questions, the research team recruited 79 older adults—48 with subjective cognitive decline and 31 with mild cognitive impairment—from the Czech Brain Aging Study. All participants underwent cognitive assessments, brain imaging, and a behavioral test of mindfulness called the Breath Counting Task. In this task, participants were instructed to count their breaths in cycles of nine over an 18-minute period, with accuracy measured through physiological data. Unlike questionnaires, this task offers a direct behavioral measure of mindfulness.</p>
<p>The researchers also conducted advanced brain imaging, including structural and functional magnetic resonance imaging. They focused on specific brain regions previously linked to mindfulness and cognitive decline: the hippocampus (important for memory), the posterior cingulate cortex (involved in mind-wandering), the rostral anterior cingulate cortex (linked to self-awareness), and the ventromedial prefrontal cortex (associated with emotional appraisal and decision-making). Connectivity between brain regions was examined using resting-state functional imaging, while brain structure was assessed through volumetric and diffusion-weighted scans.</p>
<p>The results showed that participants with mild cognitive impairment had significantly lower mindfulness scores on the Breath Counting Task than those with subjective cognitive decline. This group difference remained statistically significant even after adjusting for age and other factors. Interestingly, differences in cognitive performance on traditional memory and attention tests did not predict mindfulness scores in either group. This suggests that the ability to maintain mindful attention may reflect distinct mental processes not captured by standard cognitive tests.</p>
<p>One of the most important findings was that greater functional connectivity in the ventromedial prefrontal cortex predicted higher mindfulness scores in individuals with mild cognitive impairment. This relationship was not found in those with subjective cognitive decline, indicating a possible compensatory mechanism in more impaired individuals. In other words, people in the early stages of measurable cognitive loss may need to rely more heavily on this brain region to sustain mindful awareness.</p>
<p>No significant associations were found between mindfulness scores and brain structure in any of the examined regions, including the hippocampus and posterior cingulate cortex. This was somewhat surprising given prior research linking mindfulness training to changes in gray matter volume in these areas. The authors suggest that structural brain differences may be less sensitive to subtle variations in dispositional mindfulness, or that longer-term meditation practice may be required to influence brain anatomy.</p>
<p>The study’s findings underscore the potential of mindfulness as both a behavioral marker and a modifiable factor in cognitive aging. Because mindfulness has been linked to reduced stress and improved emotional coping, higher levels of dispositional mindfulness may help buffer against the effects of cognitive decline. Moreover, the fact that individuals with mild impairment showed lower mindfulness but still maintained the ability to engage in the breath-counting task suggests that mindfulness interventions could be feasible and beneficial for this population.</p>
<p>One of the strengths of the study is its use of a behavioral measure of mindfulness rather than relying solely on self-reported questionnaires, which can be biased or less reliable in older adults. The Breath Counting Task provides a quantifiable index of attentional stability, which may be especially relevant in assessing early cognitive changes. Additionally, the integration of multiple brain imaging methods allowed the researchers to examine both functional and structural correlates of mindfulness.</p>
<p>However, the study was cross-sectional, meaning it cannot determine whether lower mindfulness causes cognitive decline, results from it, or simply reflects parallel changes. Longitudinal studies are needed to understand how mindfulness and brain function co-evolve over time in aging individuals. Another limitation was the lack of biomarker data to confirm whether participants with mild impairment were on a trajectory toward Alzheimer’s disease. Including such data in future research could help clarify whether the observed changes are specific to this condition.</p>
<p>The study, “<a href="https://doi.org/10.1007/s12671-024-02500-9">Present Mind in the Ageing Brain: Neural Associations of Dispositional Mindfulness in Cognitive Decline</a>,” was authored by Rastislav Šumec, Pavel Filip, Martin Vyhnálek, Stanislav Katina, Dusana Dorjee, Jakub Hort, and Kateřina Sheardová.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/judgments-of-breast-attractiveness-show-surprising-consistency-across-gender-race-and-orientation/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Judgments of breast attractiveness show surprising consistency across gender, race, and orientation</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 23rd 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>People from different demographic backgrounds tend to agree on what they consider attractive when it comes to breast aesthetics, according to a new study published in the <a href="https://doi.org/10.1016/j.bjps.2024.01.002"><em>Journal of Plastic, Reconstructive &amp; Aesthetic Surgery</em></a>. While small differences were observed based on gender, sexual orientation, and race, the patterns of preference were strongly correlated across all groups, suggesting some consistency in aesthetic judgments.</p>
<p>The study was designed to explore how different demographic groups perceive breast attractiveness. In recent years, there has been growing awareness that aesthetic ideals may not be universally held, and that factors like race, gender identity, and sexual orientation can shape beauty standards.</p>
<p>In plastic and reconstructive surgery, understanding these nuances has become increasingly important, especially when guiding patients through aesthetic decisions. While earlier work has proposed objective formulas for the “ideal” breast, such as certain volume ratios or nipple positions, this study asked whether preferences really are shared equally across groups.</p>
<p>To answer this question, the research team assembled a set of 25 pre-surgical photographs of patients who had presented for breast surgery at a plastic surgery clinic. These photographs, taken from a front-facing view and cropped to focus only on the torso area, included a diverse range of breast sizes, shapes, skin tones, and nipple-areola configurations. Patients had not undergone prior breast procedures, and all had given consent for their images to be used in de-identified research. About half of the patients were White or Caucasian, with others identifying as Asian, Black or African American, or Hispanic.</p>
<p>The researchers then created an online survey using the Qualtrics platform, distributing the images to a demographically representative sample of the United States population. Participants were asked to rate each pair of breasts on a five-point Likert scale, ranging from “least attractive” to “most attractive.” In addition to these ratings, respondents provided demographic information, including their sex, gender identity, race, and sexual orientation. In total, 1,021 people completed the survey, with a nearly even split between men and women.</p>
<p>The findings revealed several patterns. On average, male participants rated breasts as more attractive than female participants did, giving an average rating of 2.8 compared to 2.5 on the five-point scale. Although men gave higher scores overall, their preferences were still strongly correlated with those of women. In other words, both sexes tended to agree on which breasts were more or less attractive, even if men were more generous with their scores.</p>
<p>A similar pattern appeared when analyzing responses based on sexual orientation. People who reported being attracted to women gave higher ratings across the board than those attracted only to men. Those who were attracted to both men and women gave ratings comparable to respondents attracted only to women and significantly higher than those only attracted to men. This suggests that attraction to women may increase the perceived attractiveness of female breasts, regardless of the rater’s own gender.</p>
<p>Racial differences in ratings were more complex. White or Caucasian respondents gave higher average ratings than Asian respondents, with scores of 2.7 and 2.2 respectively. However, the difference between White and Black or African American respondents was not statistically significant. Importantly, across all racial groups, the relative rankings of breast attractiveness were highly correlated. That means that even when one group tended to give lower average ratings, they still agreed with others on which images were more or less attractive.</p>
<p>The study also explored whether people rated breasts from their own racial group more favorably than others. There was no evidence for this kind of in-group bias. For example, White respondents did not give higher ratings to breasts from White patients, nor did Black or Asian respondents favor patients from their own racial group. Three of the five highest-rated breasts belonged to White or Caucasian patients, while one came from a Black or African American patient and one from an Asian patient. This finding suggests that skin tone alone did not drive differences in ratings, although the researchers noted that respondents could still visually perceive differences in skin color in the photographs.</p>
<p>Although the study provides new insight into how people perceive breast aesthetics, it does have several limitations. The sample of photographed patients was relatively small and not fully representative of the racial and ethnic diversity found across the United States. All images came from individuals who visited the same surgeon, and the final selection was made by that surgeon, potentially introducing bias into the photo set.</p>
<p>Another important consideration is that survey participants were not asked why they rated breasts the way they did. Future studies could explore the reasoning behind individual ratings, perhaps by asking participants to comment on specific features like nipple position, breast symmetry, or fullness of the upper and lower pole. New technologies, such as artificial intelligence-generated imagery, might also help future research isolate variables more effectively by creating standardized images that differ in only one feature at a time.</p>
<p>Despite these caveats, the findings suggest that personal identity plays a role in shaping aesthetic preferences. While demographic groups may show different average preferences, the high correlations between them indicate broad agreement on the relative appeal of different breast appearances.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.bjps.2024.01.002">Differential preferences in breast aesthetics by self-identified demographics assessed on a national survey</a>,” was authored by Carter J. Boyd, Jonathan M. Bekisz, Kshipra Hemal, Thomas J. Sorenson, and Nolan S. Karp.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/neuroscientists-challenge-dopamine-detox-trend-with-evidence-from-avoidance-learning/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Neuroscientists challenge “dopamine detox” trend with evidence from avoidance learning</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 23rd 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://www.cell.com/current-biology/fulltext/S0960-9822(25)00435-X" target="_blank" rel="noopener">Current Biology</a></em> sheds light on how the brain learns to avoid harmful situations, revealing that dopamine—commonly associated with pleasure and reward—also plays a flexible and complex role in helping us sidestep danger. Researchers at Northwestern University found that two subregions of the brain’s reward center respond differently to negative experiences, and these responses shift over time as learning progresses. The results suggest that dopamine isn’t just about seeking rewards—it also helps shape our behavior in response to unpleasant experiences, with implications for understanding anxiety, depression, and obsessive-compulsive disorder.</p>
<p>The study was designed to investigate how dopamine contributes to learning from negative experiences, particularly when it comes to avoiding them. While previous research has shown that dopamine can respond to threats or discomfort, it has been unclear how these signals evolve over time and whether they differ by brain region. The research team wanted to understand how the brain adapts when outcomes are predictable and controllable, and how this learning might go awry in psychiatric conditions that involve excessive avoidance behaviors.</p>
<p>To explore these questions, the researchers conducted experiments with mice using a behavioral task designed to measure avoidance learning. Mice were placed in a two-chamber apparatus and given a five-second warning—consisting of a tone and a light—before a mild footshock would be delivered. If the mouse moved to the other chamber during the warning, the shock was avoided. If it stayed put, the shock occurred but stopped as soon as the mouse moved. This setup allowed the team to measure both avoidance and escape behaviors over several days of training.</p>
<p>The researchers recorded dopamine activity in two specific parts of the brain’s reward system: the core and the ventromedial shell of the nucleus accumbens. Using advanced fiber photometry techniques and genetically encoded dopamine sensors, they tracked how dopamine levels changed in response to the warning cue, the shock, and the mouse’s movement across chambers. This allowed them to monitor how learning unfolded over time and how different brain regions contributed to this process.</p>
<p>The results showed that the two brain regions processed aversive learning in distinct ways. In the ventromedial shell, dopamine levels initially surged in response to the shock itself. As the mice began to associate the warning cue with the impending shock, dopamine activity shifted to respond to the cue. But as the animals became more proficient at avoiding the shock, the dopamine response in this region faded. This suggests that the ventromedial shell plays a role in early learning and in identifying when something unpleasant is about to happen.</p>
<p>In contrast, the core of the nucleus accumbens showed a different pattern. Dopamine levels in this area decreased in response to both the warning cue and the shock. As the mice improved at avoiding the shock, the drop in dopamine in response to the cue became stronger. This suggests that the core is involved in refining avoidance behaviors as the animal becomes more skilled. The researchers found that dopamine signals in the core were especially tied to the animal’s actions, suggesting a role in guiding learned movement patterns during avoidance.</p>
<p>These patterns also shifted depending on the controllability of the outcome. After the mice had mastered avoiding the shock, the researchers changed the task so that the shock occurred regardless of the animal’s behavior. Under these conditions, dopamine responses reverted to earlier patterns, indicating that the brain’s learning signals are sensitive to whether a threat can be avoided. This flexibility may be important for helping animals adjust their behavior when the environment changes.</p>
<p>Importantly, the findings help explain why some people may struggle to accurately assess threats or may engage in excessive avoidance, as seen in anxiety and obsessive-compulsive disorders. Alterations in dopamine signaling could lead to exaggerated perceptions of danger, making it harder to adapt when situations change or when risks are no longer present. Understanding these processes could eventually inform treatments for these conditions.</p>
<p>The study also challenges popular ideas about dopamine, including the trend known as the “dopamine detox,” which suggests that avoiding pleasurable activities can reset the brain’s reward system. According to the researchers, this view oversimplifies dopamine’s role. “Dopamine is not all good or all bad,” said Gabriela Lopez, the study’s first author. “It rewards us for good things but also helps us tune into cues that signal trouble, learn from consequences and continuously adapt our learning strategies in unstable environments.”</p>
<p>Talia Lerner, the study’s senior author, emphasized that dopamine’s flexibility is key. “These responses are not only different in their sign — where in one area, dopamine goes up for something bad and, in the other area, it goes down for something bad — but we also saw that one is important for early learning while the other one is important for later-stage learning,” she explained.</p>
<p>While the findings are promising, the researchers note that their work was conducted in mice and may not fully translate to humans without further study. In addition, although the researchers monitored a range of behaviors and brain responses, the precise molecular mechanisms behind these patterns are still being investigated. Future research could explore how dopamine responses differ across individuals, how they are altered in psychiatric conditions, and whether interventions targeting specific brain circuits could reduce excessive avoidance behaviors.</p>
<p>The team also plans to explore how dopamine responses are shaped by experiences such as chronic stress, drug withdrawal, or persistent pain—conditions that involve altered learning and avoidance. By understanding how dopamine shapes behavior in the face of negative outcomes, scientists hope to better address mental health problems that disrupt people’s ability to function in everyday life.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.cub.2025.04.006" target="_blank" rel="noopener">Region-specific nucleus accumbens dopamine signals encode distinct aspects of avoidance learning</a>,” was authored by Gabriela C. Lopez, Louis D. Van Camp, Ryan F. Kovaleski, Michael D. Schaid, Venus N. Sherathiya, Julia M. Cox, and Talia N. Lerner.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/non-right-handedness-is-more-common-across-multiple-mental-health-conditions/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Non-right-handedness is more common across multiple mental health conditions</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 23rd 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>People with mental or neurodevelopmental disorders are more likely to show non-right-handedness than those without these conditions, according to a second-order meta-analysis published in <a href="https://doi.org/10.1037/bul0000471"><em>Psychological Bulletin</em></a>.</p>
<p>Handedness—the tendency to prefer one hand over the other—is rooted in our biology and brain organization. Most people are right-handed, but left- and mixed-handed individuals comprise a significant minority. Handedness is shaped by both genetic and environmental factors, and emerges early in development, even prenatally. It is also associated with brain lateralization, especially in regions involved in language. Because some mental and neurodevelopmental disorders are also linked to altered brain asymmetry, researchers have suspected that <a href="https://www.psypost.org/meta-analysis-suggests-psychopathy-may-be-an-adaptation-rather-than-a-mental-disorder/">handedness might reflect</a> underlying neurocognitive differences relevant to these conditions.</p>
<p>Julian Packheiser and colleagues examined this possibility more systematically by conducting a second-order meta-analysis: essentially, a meta-analysis of meta-analyses.</p>
<p>The researchers first identified 10 relevant meta-analyses that examined conditions such as ADHD, autism, depression, dyslexia, dyscalculia, intellectual disability, PTSD, pedophilia, stuttering, and schizophrenia. Each meta-analysis included studies comparing individuals diagnosed with a given condition to healthy control groups, and provided data on whether participants were right-handed, left-handed, or mixed-handed.</p>
<p>The research team updated each of these existing meta-analyses by searching for and incorporating newly published studies, adding 33 additional datasets to the original 369, for a total of 402 datasets spanning over 202,000 individuals.</p>
<p>The final dataset encompassed a broad range of mental and neurodevelopmental disorders, and included detailed information about participant age, sex ratio, handedness classification methods, and geographical location of studies. Data were extracted and reanalyzed using a consistent statistical pipeline. Only studies with both clinical and control groups, clear handedness reporting, and no handedness-based participant selection were included. Key moderator variables were also coded, such as whether the disorder was neurodevelopmental, whether it involved language-related symptoms, and the typical age of onset.</p>
<p>Across all studies, individuals with mental or neurodevelopmental conditions were more likely to show atypical hand preferences, meaning they were either left-handed or mixed-handed, than healthy controls. The overall odds ratio indicated that people with these conditions were about 1.5 times more likely to be non-right-handed.</p>
<p>When looking specifically at left-handedness and mixed-handedness, both were significantly more common in clinical groups, with mixed-handedness showing the strongest association. However, these trends varied substantially depending on the disorder. Schizophrenia, autism spectrum disorder, and intellectual disability showed the most pronounced associations with atypical handedness, while conditions like depression and dyscalculia showed no significant differences compared to controls.</p>
<p>When the researchers explored potential moderating factors, several patterns emerged. Disorders that are classified as neurodevelopmental—such as ADHD, autism, dyslexia, dyscalculia, intellectual disability, and stuttering—showed significantly higher rates of non-right-handedness compared to non-neurodevelopmental disorders.</p>
<p>Additionally, conditions associated with language difficulties were more strongly linked to atypical handedness, supporting the idea that disruptions in brain asymmetry may affect both language and motor function. Among the non-neurodevelopmental conditions, those with an earlier average age of onset, such as schizophrenia and PTSD, also showed elevated rates of non-right-handedness. These patterns suggest that early brain development may play a key role in shaping both handedness and vulnerability to certain psychiatric conditions.</p>
<p>Taken together, the results reveal that handedness differences are not uniformly present across all mental health conditions, but instead appear to cluster in disorders with strong neurodevelopmental components or those that impact language processing and early brain development.</p>
<p>One limitation is that analyses relied solely on categorical measures of hand preference (i.e., right, left, or mixed), due to limited availability of continuous handedness measures. This may have constrained the sensitivity of some analyses.</p>
<p>The research, “<a href="https://doi.org/10.1037/bul0000471">Handedness in Mental and Neurodevelopmental Disorders: A Systematic Review and Second-Order Meta-Analysis</a>,” was authored by Julian Packheiser, Jette Borawski, Gesa Berretz, Sarah Alina Merklein, Marietta Papadatou-Pastou, and Sebastian Ocklenburg.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/withdrawal-symptoms-are-common-after-stopping-antidepressants/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Withdrawal symptoms are common after stopping antidepressants</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 23rd 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://www.nature.com/articles/s41380-024-02782-4" target="_blank">Molecular Psychiatry</a></em> estimates that about 43% of people who stop taking antidepressants experience withdrawal symptoms. These symptoms typically appear within two weeks of stopping the medication and can range from mild to severe. The findings suggest that withdrawal effects are widespread and highlight the need for better guidance on how to discontinue antidepressants safely.</p>
<p>Antidepressants are some of the most widely prescribed medications for mental health conditions such as depression and anxiety. While they can be effective, many people remain on them for extended periods. In the United States, nearly half of antidepressant users have been taking them for over five years. In the United Kingdom, the majority have used them for more than two years. Despite this long-term use, withdrawal symptoms that can follow discontinuation are not fully understood. Previous reports have varied widely in estimating how common these symptoms are—ranging from as low as 3% to as high as 77%.</p>
<p>To address this uncertainty, researchers led by Mi-Mi Zhang and colleagues at Peking University conducted the most comprehensive analysis to date on antidepressant withdrawal symptoms. Their goal was to estimate how often these symptoms occur, what they typically look like, and what factors increase the risk of experiencing them. They also aimed to provide clearer evidence to guide clinicians and patients when it comes to tapering off these medications.</p>
<p>The researchers systematically reviewed six major medical databases and included 35 studies in their final analysis. These studies included randomized controlled trials, as well as observational and cross-sectional research. In total, the analysis covered data from tens of thousands of individuals who had stopped taking antidepressants. Some studies tracked patients in clinical settings, while others relied on self-reports from online surveys. Across these different types of studies, the researchers consistently found that withdrawal symptoms were common.</p>
<p>The pooled incidence of antidepressant withdrawal symptoms across all studies was 42.9%. Among randomized controlled trials, the rate was slightly higher at 44.4%. Symptoms typically emerged within two weeks after stopping the medication and were generally measured for less than four weeks. However, some long-term users reported that symptoms lasted for months or even years. For example, one online survey found that “brain zaps”—a sudden electrical shock-like sensation in the head—could persist for decades.</p>
<p>The severity of symptoms also varied. Most people experienced mild to moderate symptoms, especially after short-term use of 8 to 12 weeks. However, a significant minority reported severe or even very severe symptoms, especially those who had been on antidepressants for a longer time. In one study, nearly half of the participants who discontinued venlafaxine after 8 weeks experienced moderate or severe symptoms.</p>
<p>The types of withdrawal symptoms also depended somewhat on the class of antidepressant. For selective serotonin reuptake inhibitors (SSRIs), the most commonly reported symptoms included dizziness, increased dreaming or nightmares, irritability, and anxiety. For serotonin-norepinephrine reuptake inhibitors (SNRIs), neurological symptoms like dizziness were more common. Tricyclic antidepressants had the highest estimated withdrawal rate (around 60%), but were less frequently studied.</p>
<p>Although the study found that tapering the dose over time was associated with a lower incidence of withdrawal symptoms compared to abruptly stopping the medication (34.5% vs. 42.5%), the difference was not statistically significant. This may be due to the relatively short tapering periods used in most studies—typically just 2 to 4 weeks. Some experts argue that much longer tapering schedules may be needed, especially for people who have been on antidepressants for years.</p>
<p>Several risk factors were associated with an increased likelihood of experiencing withdrawal symptoms. These included being female, younger age, experiencing early side effects during treatment, higher doses, longer treatment durations, and abrupt cessation. There is also some evidence that genetic differences, such as variation in a serotonin receptor gene, may play a role.</p>
<p>Interestingly, the researchers found that psychological factors—such as expecting to feel worse after stopping the medication—were not the main drivers of withdrawal symptoms. In randomized controlled trials, the group that actually stopped taking the medication had much higher rates of symptoms than those who continued treatment, suggesting that the symptoms were not just caused by expectations.</p>
<p>Despite the robust findings, the study does have limitations. Most of the included studies followed patients for only a few weeks after discontinuation, which may underestimate the true duration and severity of symptoms. There was also substantial variation in how withdrawal symptoms were measured and defined, which may affect the accuracy of the estimates. The authors also noted the possibility of publication bias, meaning studies with higher rates of withdrawal symptoms might have been more likely to be published.</p>
<p>Another key limitation is that most of the studies focused on people who had taken antidepressants for relatively short periods—often less than three months. This is not representative of typical long-term users, who may be more prone to withdrawal symptoms and whose experiences may differ in important ways. The researchers emphasize the need for future studies that track withdrawal in long-term users over extended follow-up periods.</p>
<p>Despite these limitations, the study provides strong evidence that withdrawal symptoms are a common experience for people discontinuing antidepressants. The findings challenge the notion that stopping these medications is usually straightforward and stress the importance of patient education and individualized discontinuation plans.</p>
<p>The authors recommend that clinicians inform patients of the possibility of withdrawal symptoms when starting an antidepressant and monitor them closely during and after discontinuation. They also call for better research into long-term use and more refined tapering strategies, which could help reduce the risks associated with stopping antidepressants.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41380-024-02782-4" target="_blank">Incidence and risk factors of antidepressant withdrawal symptoms: a meta-analysis and systematic review</a>,” was authored by Mi-Mi Zhang, Xuan Tan, Yong-Bo Zheng, Na Zeng, Zhe Li, Mark Abie Horowitz, Xue-Zhu Feng, Ke Wang, Zi-Yi Li, Wei-Li Zhu, Xinyu Zhou, Peng Xie, Xiujun Zhang, Yumei Wang, Jie Shi, Yan-Ping Bao, Lin Lu, and Su-Xia Li.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>