<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/genetic-link-found-between-suicide-risk-and-brain-structure-in-large-scale-study/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Genetic link found between suicide risk and brain structure in large-scale study</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 13th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>People with a higher genetic predisposition to attempting suicide tend to show differences in brain structure, according to a new study published in the journal <em><a href="https://doi.org/10.1002/hbm.70220" target="_blank">Human Brain Mapping</a></em>. The researchers found that specific genetic markers associated with suicide attempt risk also overlap with those related to brain volume, particularly in subcortical regions involved in emotion, reward, and cognitive control.</p>
<p>The findings point to a small but statistically meaningful genetic correlation between suicide attempts and total brain volume, and suggest that shared genetic influences may be expressed in distinct ways across development. While previous studies have independently tied suicidal behavior and brain structure to genetic factors, this new research indicates that they may be more intertwined than previously understood.</p>
<p>Suicide attempt is one of the strongest predictors of suicide death and remains a pressing global health concern. Although environmental stressors, psychiatric conditions, and trauma history contribute to risk, there is growing recognition that suicide also has biological underpinnings. Large-scale genetic studies have identified heritable components of suicidal behavior, and structural brain changes have been reported in individuals with a history of suicide attempts.</p>
<p>However, researchers have yet to fully determine whether these biological features share a common genetic basis. If suicide risk and variations in brain morphology stem from overlapping genetic pathways, identifying those regions and gene sets could help reveal new targets for intervention or prediction. The new study aimed to clarify the degree to which the genetic architecture of suicide attempt overlaps with regional brain volume in both adults and adolescents.</p>
<p>“My colleagues and I were specifically interested in determining whether genetic risk for suicide behaviors could be reflected in neurodevelopmental differences early in life. Suicide behavior is very difficult to predict, and understanding risk factors for suicidal behavior early in development prior to the emergence of these behaviors could be one avenue towards prevention,” said study author <a href="https://rwjms.rutgers.edu/people/jill-rabinowitz" target="_blank">Jill A. Rabinowitz</a>, an assistant professor at Robert Wood Johnson Medical School at Rutgers University.</p>
<p>The research team used data from two of the largest genome-wide association studies available: one on suicide attempts, including nearly one million individuals, and one on brain imaging, which included structural MRI data from approximately 75,000 participants. The suicide attempt dataset included both people who had made nonfatal attempts and those who had died by suicide. The brain imaging data included measurements of total brain volume and nine subcortical regions, such as the caudate, putamen, amygdala, hippocampus, and thalamus.</p>
<p>To examine shared genetic factors, the researchers first applied a statistical technique known as linkage disequilibrium score regression, which estimates genome-wide genetic correlations between traits. Then, to look for specific areas of the genome influencing both suicide attempts and brain volumes, they used GWAS-pairwise analysis, which examines smaller segments of the genome to detect local genetic overlap. These segments were then mapped to genes using functional annotation tools.</p>
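<p>For readers who want a concrete sense of the statistics, the sketch below illustrates the core idea behind cross-trait linkage disequilibrium score regression: the average product of two traits’ GWAS z-scores at a variant rises with that variant’s LD score, so the slope of a regression of those products on LD scores reflects shared genetic signal. All data and numbers here are simulated for illustration; real analyses use dedicated software such as ldsc, with millions of variants and jackknife standard errors.</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; line-height:16px; background-color:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto; text-align:left;">
# Minimal, simulated sketch of cross-trait LD score regression:
# the expected product of two traits' GWAS z-scores at a variant
# grows linearly with that variant's LD score, so the fitted slope
# reflects the genetic covariance between the traits.
import numpy as np

rng = np.random.default_rng(0)
n_snps = 50_000
ld_scores = rng.gamma(shape=2.0, scale=30.0, size=n_snps)  # per-variant LD scores

# Simulate z-score products whose mean rises with LD score (shared signal)
true_slope = 2e-4  # hypothetical; encodes the genetic covariance
zz = 1.0 + true_slope * ld_scores + rng.normal(0, 1.5, n_snps)

# Least squares of z1*z2 on LD score; the intercept absorbs sample
# overlap and confounding, as in the LDSC framework
X = np.column_stack([np.ones(n_snps), ld_scores])
intercept, slope = np.linalg.lstsq(X, zz, rcond=None)[0]
print(f"estimated slope: {slope:.2e} (simulated truth: {true_slope:.2e})")
</pre>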
<p>To explore how these genetic relationships might emerge during adolescence, the researchers also examined data from over 5,000 European-ancestry participants in the Adolescent Brain Cognitive Development (ABCD) study. Using polygenic scores for suicide attempt—scores that summarize the cumulative effect of thousands of genetic variants—they tested whether higher genetic risk was associated with differences in brain volume in this younger cohort.</p>
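<p>Conceptually, a polygenic score is just a weighted sum: each variant contributes 0, 1, or 2 copies of a risk allele, multiplied by that variant’s effect size from the GWAS. The minimal sketch below, with simulated genotypes and made-up weights, shows the arithmetic; real scoring pipelines also account for linkage disequilibrium and quality control.</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; line-height:16px; background-color:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto; text-align:left;">
# Toy polygenic score: for each person, sum the number of risk
# alleles carried at each variant (0, 1, or 2), weighted by that
# variant's GWAS effect size. All data here are simulated.
import numpy as np

rng = np.random.default_rng(1)
n_people, n_variants = 5, 1_000
dosages = rng.integers(0, 3, size=(n_people, n_variants))  # allele counts
effect_sizes = rng.normal(0, 0.01, size=n_variants)        # per-variant weights

polygenic_scores = dosages @ effect_sizes  # one score per person
print(np.round(polygenic_scores, 3))
</pre>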
<p>The strongest genome-wide genetic correlation emerged between suicide attempt risk and intracranial volume. The correlation was modest (r = -0.10) but statistically significant, suggesting that genetic factors associated with a higher likelihood of suicide attempt are also linked to smaller overall brain volume. This aligns with earlier neuroimaging research indicating that individuals with a history of suicide attempt tend to have smaller intracranial volume.</p>
<p>Zooming in on specific brain regions, the researchers identified ten genomic segments that appeared to influence both suicide attempt risk and at least one subcortical brain structure. Seven of these were associated with the thalamus, two with the putamen, and one with the caudate nucleus. These areas are involved in various cognitive, emotional, and motor processes, and have been implicated in psychiatric conditions such as depression and schizophrenia.</p>
<p>Several genes were mapped to these overlapping genomic segments. One of the most prominent was DCC, a gene involved in axonal guidance and synaptic development, which was associated with both suicide risk and volume of the caudate and putamen. Other implicated genes, including members of the histone cluster (e.g., HIST1H2BN and HIST1H4L), were located in a highly complex region of the genome known as the major histocompatibility complex, which is involved in immune function and has also been linked to psychiatric disorders.</p>
<p>In conditional analyses, the researchers found that associations between suicide risk and thalamic volume within this region were likely due to separate genetic signals rather than a single shared variant. This suggests that while the same region of the genome may influence both traits, it may do so through distinct genetic mechanisms.</p>
<p>In the adolescent sample from the ABCD study, higher polygenic risk for suicide attempt was significantly associated with smaller volume of the right nucleus accumbens, a brain region involved in reward sensitivity and motivation. This association remained significant even after correcting for multiple comparisons. Notably, the nucleus accumbens did not show overlap in the adult genomic segment analyses, suggesting that different brain regions may reflect genetic vulnerability at different stages of development.</p>
<p>“We found that people with higher genetic risk for suicide attempt tend to have smaller overall brain volume and differences in specific brain regions like the thalamus and caudate nucleus,” Rabinowitz told PsyPost. “In adolescents, a higher genetic risk for suicide attempt was also associated with a smaller volume in the nucleus accumbens, a region involved in reward and motivation. These findings suggest that certain brain structures may help explain how genetic risk for suicide is expressed in the brain early in life, offering insight for future prevention efforts.”</p>
<p>There are some limitations to consider. The genetic analyses were based exclusively on individuals of European ancestry, which means the results may not generalize to other populations. Future studies should aim to replicate these findings in more diverse samples.</p>
<p>Additionally, while the study identifies genetic overlap, it cannot determine whether these shared genetic factors causally influence both suicide risk and brain structure. It’s possible that genes influence one trait, which in turn affects the other. Future research using causal inference methods like Mendelian randomization could help clarify the direction of these relationships.</p>
<p>“It is important to note that findings are not causal; that is, we did not find that genetic risk causes brain structure differences, but rather that an association exists between genetic liability for suicide attempt and neurodevelopment,” Rabinowitz explained. “It will be important to consider third variables in future research, such as environmental exposures, that may be potential pathways through which genetic propensity for suicidal behavior is linked to brain structure differences. I look forward to continuing to conduct research that incorporates genetic and novel biobehavioral and neural phenotypes that may be associated with suicidal behavior across the lifespan.”</p>
<p>The study, “<a href="https://doi.org/10.1002/hbm.70220" target="_blank">Genetic Links Between Subcortical Brain Morphometry and Suicide Attempt Risk in Children and Adults</a>,” was authored by Zuriel Ceja, Luis M. García-Marín, I-Tzu Hung, Sarah E. Medland, Alexis C. Edwards, Miguel E. Rentería, and Jill A. Rabinowitz.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/his-psychosis-was-a-mystery-until-doctors-learned-about-chatgpts-health-advice/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">His psychosis was a mystery—until doctors learned about ChatGPT’s health advice</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 13th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A 60-year-old man arrived at a Seattle hospital convinced his neighbor was poisoning him. Though medically stable at first, he soon developed hallucinations and paranoia. The cause turned out to be bromide toxicity—triggered by a health experiment he began after consulting ChatGPT. The case, published in <em><a href="https://doi.org/10.7326/aimcc.2024.1260" target="_blank">Annals of Internal Medicine: Clinical Cases</a></em>, highlights a rare but reversible form of psychosis that may have been influenced by generative artificial intelligence.</p>
<p>Psychosis is a mental state characterized by a disconnection from reality. It often involves hallucinations, where people hear, see, or feel things that are not there, or delusions, which are fixed beliefs that persist despite clear evidence to the contrary. People experiencing psychosis may have difficulty distinguishing between real and imagined experiences, and may behave in ways that seem irrational or confusing to others. </p>
<p>Psychosis is not a diagnosis in itself, but a symptom that can appear in a variety of medical and psychiatric conditions, including schizophrenia, bipolar disorder, and severe depression. It can also be triggered by brain injuries, infections, or toxic substances.</p>
<p>The man’s initial presentation was unusual but not dramatic. He came to the emergency department reporting that his neighbor was trying to poison him. His vital signs and physical examination were mostly normal. Routine laboratory tests, however, revealed some striking abnormalities: extremely high chloride levels, a highly negative anion gap, and severe phosphate deficiency. Despite this, he denied using any medications or supplements.</p>
<p>As doctors searched for answers, his mental state worsened. Within a day, he was hallucinating and behaving erratically. He had to be placed on a psychiatric hold and was started on risperidone to manage his symptoms. But a deeper look at his bloodwork suggested a rare toxic condition: bromism.</p>
<p>Bromism occurs when bromide—a chemical similar to chloride—builds up in the body to toxic levels. Historically, bromide was used in sedatives and other medications, but it was phased out in the United States by the late 1980s. It is still used in some industrial and cleaning applications. In cases of bromism, bromide can interfere with chloride tests and cause neurological and psychiatric symptoms ranging from confusion to full-blown psychosis.</p>
<p>The man’s doctors consulted with Poison Control and eventually confirmed that bromism was the likely cause of his symptoms. After being stabilized with fluids and nutritional support, the man revealed a key detail: for the past three months, he had been replacing regular table salt with sodium bromide. His motivation was nutritional—he wanted to eliminate chloride from his diet, based on what he believed were harmful effects of sodium chloride.</p>
<p>This belief was strengthened, he explained, by information he had received from ChatGPT. While experimenting with ways to improve his health, he asked the chatbot whether chloride could be replaced. The model reportedly offered bromide as an alternative, without flagging any health risks or asking why the substitution was being considered. Encouraged by what he interpreted as scientific endorsement, he purchased sodium bromide online and began consuming it regularly.</p>
<p>Medical tests confirmed that his blood bromide level had reached 1700 mg/L—well above the normal range of 0.9 to 7.3 mg/L. After stopping the supplement and receiving supportive treatment, his psychiatric symptoms gradually subsided. He was weaned off risperidone before discharge and remained stable in follow-up appointments.</p>
<p>This unusual incident carries several caveats. A single case report cannot establish causation. There may have been multiple factors contributing to the man’s psychosis, and his exact interaction with ChatGPT remains unverified. The medical team does not have access to the chatbot conversation logs and cannot confirm the exact wording or sequence of messages that led to the decision to consume bromide.</p>
<p>Case reports, by nature, describe only one patient’s experience. They are not designed to test hypotheses or rule out alternative explanations. In some cases, rare outcomes may simply be coincidental or misunderstood. Without controlled studies or broader surveillance, it is difficult to know how common—or uncommon—such incidents truly are.</p>
<p>Despite their limitations, case reports often serve as early warning signs in medicine. They tend to highlight novel presentations, unexpected side effects, or emerging risks that are not yet widely recognized. Many medical breakthroughs and safety reforms have started with a single unusual case. In this instance, the authors argue that the use of AI-powered chatbots should be considered when evaluating unusual psychiatric presentations, especially when patients are known to seek health advice online.</p>
<p>The case also raises broader concerns about the growing role of generative AI in personal health decisions. Chatbots like ChatGPT are trained to provide fluent, human-like responses. But they do not understand context, cannot assess user intent, and are not equipped to evaluate medical risk. In this case, the bot may have listed bromide as a chemical analogue to chloride without realizing that a user might interpret that information as a dietary recommendation.</p>
<p>The idea that <a href="https://www.psypost.org/chatgpt-psychosis-this-scientist-predicted-ai-induced-delusions-two-years-later-it-appears-he-was-right/" target="_blank">chatbots could contribute to psychosis</a> once seemed speculative. But recent editorials and anecdotal reports suggest that this may be a real, if rare, phenomenon—especially among individuals with underlying vulnerability. Danish psychiatrist Søren Dinesen Østergaard was among the first to raise the alarm. In 2023, he published a warning in <em>Schizophrenia Bulletin</em>, suggesting that the cognitive dissonance of interacting with a seemingly intelligent but ultimately mechanical system could destabilize users who already struggle with reality-testing.</p>
<p>Since then, multiple stories have emerged of individuals who experienced dramatic changes in thinking and behavior after prolonged chatbot use. Some became convinced that they had divine missions, while others believed they were communicating with sentient beings. In one reported case, a man believed he had been chosen by ChatGPT to “break” a simulated reality. In another, a user’s romantic partner came to believe that the chatbot was a spiritual guide and began withdrawing from human relationships.</p>
<p>These stories tend to follow a pattern: intense engagement with the chatbot, increasingly eccentric beliefs, and a lack of pushback from the system itself. Critics point out that language models are trained to reward user satisfaction, which can mean agreeing with or amplifying the user’s worldview—even when it is distorted or delusional. That dynamic may mimic what psychiatrists call confirmation bias, a known contributor to psychotic thinking.</p>
<p>Some developers are exploring ways to detect when a conversation appears to touch on delusional thinking—such as references to secret messages or supernatural identity—and redirect users to professional help. But such systems are still in their infancy, and the commercial incentives for chatbot companies tend to prioritize engagement over safety.</p>
<p>The case of the Seattle man is a sobering reminder of how even a seemingly minor substitution—replacing salt with a chemical cousin—can spiral into a medical and psychiatric emergency when guided by decontextualized information. While AI chatbots have potential to support healthcare in structured settings, this report suggests they may also present hidden risks, especially for users who take their advice literally.</p>
<p>The study, “<a href="https://doi.org/10.7326/aimcc.2024.1260" target="_blank">A Case of Bromism Influenced by Use of Artificial Intelligence</a>,” was authored by Audrey Eichenberger, Stephen Thielke, and Adam Van Buskirk.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/drug-using-teens-show-distinct-patterns-of-brain-development-tied-to-dopamine-regulation/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Drug-using teens show distinct patterns of brain development tied to dopamine regulation</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 13th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <em><a href="https://doi.org/10.1016/j.dcn.2025.101594" target="_blank">Developmental Cognitive Neuroscience</a></em> provides initial evidence that differences in brain chemistry during adolescence may help explain why some teens are more likely to experiment with drugs or alcohol—and why others appear to require stronger incentives to maintain cognitive focus. The findings suggest that slower developmental increases in a brain chemistry marker linked to dopamine functioning may be associated with both substance use and a greater dependence on rewards to perform well on cognitive tasks.</p>
<p>Adolescence is a period marked by novelty-seeking, heightened sensitivity to rewards, and risk-taking behaviors—including substance use. About 60% of teens will try alcohol, tobacco, or other drugs before adulthood, and those who begin during adolescence face a greater risk of developing a substance use disorder later in life.</p>
<p>Previous research has connected long-term substance use to changes in dopamine-related brain activity. For example, adults with substance use disorders tend to show lower availability of dopamine receptors and transporters in a brain region called the basal ganglia, which is involved in reward processing and cognitive control. However, much less is known about whether early changes in dopamine-linked neurodevelopment could help explain why some adolescents begin using substances in the first place.</p>
<p>Directly measuring dopamine in the brain is difficult, especially in younger participants. But scientists have identified a promising proxy: brain tissue iron. Iron is essential for dopamine synthesis and storage, and it tends to accumulate in dopamine-rich areas of the brain during adolescence. In this study, the researchers used magnetic resonance imaging to track tissue iron in the basal ganglia over time as a way to indirectly assess changes in dopamine-related brain development.</p>
<p>“We were interested in applying a new method for estimating functioning within a key neurotransmitter system. This functioning is typically difficult to measure in younger participants, but is thought to be critical for answering important questions about the propensity for early substance use,” explained Jessica S. Flannery, an assistant professor at the University of Georgia.</p>
<p>The research team followed 168 adolescents from sixth through eleventh grade, collecting brain scans at up to four timepoints between the ages of roughly 12 and 18. In total, they gathered 469 functional MRI sessions from participants in a socioeconomically and ethnically diverse community in the southeastern United States. Each year, participants self-reported their substance use and completed cognitive tasks while undergoing brain scans.</p>
<p>At the final timepoint, a subset of 76 participants also completed an incentive-boosted Go/No-Go task called the “Planets Task.” This task assessed cognitive control by asking participants to either press a button in response to certain visual stimuli or withhold a response to others. Performance was measured under three different reward conditions: no monetary reward, a small reward, and a large reward. This design allowed the researchers to examine how performance changed based on the incentive level.</p>
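<p>One plausible way to score such a task, sketched below with made-up trial counts, is to compute accuracy separately in each reward condition and index “incentive dependence” as the gain from the no-reward to the large-reward condition; the study’s exact metric may differ, so treat this as an illustration of the contrast rather than the authors’ method.</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; line-height:16px; background-color:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto; text-align:left;">
# Hypothetical scoring of incentive dependence on a Go/No-Go task:
# accuracy per reward condition, then the no-reward-to-large-reward gain.
def accuracy(hits, correct_rejections, n_go, n_nogo):
    """Proportion correct across go (press) and no-go (withhold) trials."""
    return (hits + correct_rejections) / (n_go + n_nogo)

conditions = {
    "no_reward":    accuracy(hits=38, correct_rejections=20, n_go=48, n_nogo=32),
    "small_reward": accuracy(hits=41, correct_rejections=23, n_go=48, n_nogo=32),
    "large_reward": accuracy(hits=45, correct_rejections=28, n_go=48, n_nogo=32),
}

# Larger values = performance depends more strongly on incentives
incentive_dependence = conditions["large_reward"] - conditions["no_reward"]
print(conditions)
print(f"incentive dependence: {incentive_dependence:.3f}")
</pre>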
<p>To estimate brain iron, the researchers analyzed T2*-weighted MRI signals from four subregions of the basal ganglia: the caudate, putamen, pallidum, and nucleus accumbens. Lower T2* signal corresponds to higher iron concentration, which has been associated with more robust dopamine activity.</p>
<p>As expected, the researchers observed that tissue iron levels tended to increase across adolescence, consistent with normal neurodevelopment. However, adolescents who reported using substances—ranging from alcohol and marijuana to vaping or other drugs—showed a slower rate of increase in iron levels, especially in the nucleus accumbens. This region is thought to be involved in assigning motivational value to rewards and has been previously linked to substance use risk.</p>
<p>Teens who had never used substances showed a steeper age-related increase in nucleus accumbens iron than those who had. The difference was not explained by other demographic factors such as income, race, sex, or ADHD diagnosis. While it remains unclear whether lower iron accumulation reflects a cause or consequence of substance use, the findings align with the idea that teens with less dopamine-related activity may be more drawn to substances as a way to compensate for reduced sensitivity to natural rewards.</p>
<p>The study also explored how tissue iron levels were linked to performance on the incentivized cognitive control task. Although all participants improved their performance when rewards were introduced, some improved dramatically, while others showed little or no change. Teens who relied more on the incentives to boost their cognitive control—dubbed “incentive-dependent”—tended to have lower iron accumulation in the putamen, a part of the basal ganglia involved in motor control and task execution.</p>
<p>In contrast, teens whose performance was relatively stable across all reward conditions—“incentive-independent” individuals—showed stronger age-related increases in putamen tissue iron. These findings suggest that adolescents with lower dopamine-related activity in this region may need stronger external motivation to perform at the same level as their peers.</p>
<p>Interestingly, while incentive-related performance was linked to brain activity during the task, tissue iron levels were not directly associated with changes in incentive-related brain activation. This indicates that while both factors relate to motivation and behavior, they may operate through distinct processes.</p>
<p>The key takeaway? “Differences in how teens’ brains develop might help explain why some adolescents are more likely to engage in certain health-related behaviors than others,” Flannery told PsyPost.</p>
<p>Although the findings point to a possible neurodevelopmental pattern that relates to both early substance use and incentive-dependent cognitive control, the study does not prove causation. Because the researchers could not disentangle preexisting differences from the effects of substance use over time, it is still unclear whether reduced iron accumulation leads to substance use, or whether even mild early use might affect brain development.</p>
<p>The researchers also note that incentive-boosted cognitive control and brain activity were only measured at the final timepoint, limiting their ability to track developmental changes in task performance. In addition, while tissue iron is a useful proxy for dopamine-related physiology, it is not a direct measure of dopamine function. More research is needed to clarify how iron levels reflect changes in the broader dopamine system.</p>
<p>“It is important to note that this study did not directly assess brain tissue iron but instead relied on a magnetic resonance-based estimation,” Flannery added. “Further, while brain iron levels are associated with parts of the dopamine system such as dopamine transporters, receptors, and the enzymes that help produce dopamine, iron levels do not directly measure how much dopamine is available or exactly how it is functioning. Scientists are still working to understand how brain iron and dopamine activity are connected, as they reflect distinct but associated aspects of brain chemistry.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.dcn.2025.101594" target="_blank">Developmental changes in dopamine-related neurophysiology and associations with adolescent substance use and incentive-boosted cognitive control</a>,” was authored by Jessica S. Flannery, Ashley C. Parr, Kristen A. Lindquist, and Eva H. Telzer.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/machine-learning-helps-tailor-deep-brain-stimulation-to-improve-gait-in-parkinsons-disease/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Machine learning helps tailor deep brain stimulation to improve gait in Parkinson’s disease</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 12th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A team of researchers at the University of California, San Francisco has developed a data-driven method for optimizing deep brain stimulation (DBS) settings that significantly improved walking performance in people with Parkinson’s disease. Published in the journal <em><a href="https://doi.org/10.1038/s41531-025-00990-5" target="_blank">npj Parkinson’s Disease</a></em>, the study used wearable sensors and implanted neural recording devices to analyze how different DBS settings affected walking, then applied machine learning to identify individualized stimulation parameters that enhanced gait. The results indicate that tailored DBS configurations can improve walking stability and speed, and suggest specific brain activity patterns linked to better mobility.</p>
<p>Parkinson’s disease is a progressive neurological condition that affects movement control. It arises from the loss of dopamine-producing neurons in the brain, leading to symptoms such as tremors, stiffness, slowness, and impaired balance. Gait disturbances—such as short shuffling steps, poor coordination, and freezing episodes—are among the most disabling symptoms, especially in later stages of the disease.</p>
<p>DBS is a surgical treatment in which electrodes are implanted in specific areas of the brain, typically the basal ganglia. These electrodes deliver electrical impulses to regulate abnormal brain activity. While DBS can be highly effective at reducing tremors and stiffness, its effects on walking tend to be inconsistent. This variability is partly due to the complexity of walking as a behavior, but also due to the lack of standardized methods for fine-tuning stimulation settings for gait.</p>
<p>DBS programming typically focuses on improving limb-related motor symptoms and is often conducted while the patient is seated. However, walking requires coordination across multiple brain regions, and existing programming practices do not consistently address gait. Complicating matters further, clinicians must choose from a wide range of stimulation parameters—amplitude, frequency, and pulse width—without clear guidance on how these settings affect walking behavior or neural dynamics.</p>
<p>The research team, led by Hamid Fekri Azgomi and <a href="https://doriswanglab.ucsf.edu/" target="_blank">Doris D. Wang</a>, sought to overcome this challenge by designing a framework that integrates behavioral and neural data to guide DBS programming specifically for walking. Their goal was to uncover how different stimulation settings influenced both movement and underlying brain activity, and to create a predictive model that could identify optimal settings tailored to each individual.</p>
<p>“Gait disturbances are among the most disabling symptoms of Parkinson’s disease, severely affecting patients’ mobility, independence, and quality of life. While DBS has proven effective for alleviating other motor symptoms such as tremors and bradykinesia, its impact on gait remains unclear and inconsistent, making it challenging to determine optimal DBS settings for walking improvement,” said Wang, a functional neurosurgeon and an associate professor at UCSF and a faculty member in the UCSF–UC Berkeley Joint Graduate Program in Bioengineering.</p>
<p>“Recent advances in neurotechnology, including devices that can record brain activity while delivering stimulation in real time, have opened new opportunities to study the neural mechanisms underlying DBS. We launched this research to better understand how specific DBS parameters influence the brain circuits involved in walking. Our goal was to identify personalized stimulation settings that can help improve walking for individual patients with Parkinson’s disease.”</p>
<p>The study involved three people with Parkinson’s disease who had undergone DBS implantation targeting the globus pallidus (a part of the basal ganglia). Each participant also had electrodes placed over the motor cortex, a brain region involved in voluntary movement. These devices allowed simultaneous stimulation and recording of brain signals during walking tasks.</p>
<p>To capture the full range of motor behavior, participants were equipped with wearable sensors that tracked step length, stride speed, variability, and arm swing while walking in loops on a 6-meter track. The researchers developed a new measure called the Walking Performance Index (WPI), which combined these gait features into a single score that reflected overall walking ability.</p>
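<p>The article does not spell out the WPI formula, but composites of this kind are typically built by z-scoring each gait feature against a reference distribution and averaging, flipping the sign of features where lower values mean better walking. The sketch below, with hypothetical reference values and feature names, shows one way such an index could be computed.</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; line-height:16px; background-color:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto; text-align:left;">
# Illustrative composite in the spirit of the Walking Performance Index:
# z-score each gait feature, flip sign where lower is better, average.
# Reference statistics and trial values below are hypothetical.
import numpy as np

reference = {                              # feature: (mean, sd)
    "step_length_m":      (0.60, 0.08),
    "stride_speed_mps":   (1.10, 0.15),
    "stride_variability": (0.05, 0.02),    # lower is better
    "arm_swing_deg":      (30.0, 8.0),
}
trial = {"step_length_m": 0.55, "stride_speed_mps": 1.02,
         "stride_variability": 0.07, "arm_swing_deg": 24.0}
lower_is_better = {"stride_variability"}

z_scores = []
for feature, (mean, sd) in reference.items():
    z = (trial[feature] - mean) / sd
    z_scores.append(-z if feature in lower_is_better else z)

wpi = float(np.mean(z_scores))  # higher = better overall walking
print(f"WPI for this trial: {wpi:.2f}")
</pre>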
<p>The research team then systematically varied the DBS settings, adjusting amplitude, frequency, and pulse width within clinically safe ranges. For each configuration, participants completed walking trials while their brain activity and motion data were recorded. Subjective ratings from both patients and physical therapists were also collected.</p>
<p>A machine learning algorithm called a Gaussian Process Regressor was used to model the relationship between the stimulation parameters and WPI. This approach allowed the researchers to predict which combinations of settings would likely produce the best walking performance, even without testing every possible configuration. New predictions were tested in follow-up sessions and used to refine the model.</p>
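<p>To make the modeling step concrete, here is a minimal sketch using scikit-learn’s GaussianProcessRegressor: fit the settings already tested against their observed walking scores, predict over a grid of untested settings, and propose the most promising one to try next. The parameter values, kernel, and acquisition heuristic are illustrative assumptions, not the study’s actual configuration.</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; line-height:16px; background-color:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto; text-align:left;">
# Sketch of Gaussian-process modeling of DBS settings vs. walking score.
# All settings and scores are made up for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Tested settings: [amplitude (mA), frequency (Hz), pulse width (us)]
X_tested = np.array([[2.0, 130, 60], [2.5, 130, 60], [3.0, 100, 60],
                     [2.5, 160, 90], [3.5, 130, 90]], dtype=float)
wpi_observed = np.array([0.10, 0.35, 0.22, 0.41, 0.18])  # hypothetical scores

kernel = ConstantKernel() * RBF(length_scale=[1.0, 30.0, 30.0])
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True, alpha=1e-2)
gpr.fit(X_tested, wpi_observed)

# Predict over a coarse grid of candidate settings
amps, freqs, widths = np.meshgrid(np.linspace(1.5, 4.0, 6),
                                  np.linspace(60, 180, 7),
                                  np.linspace(60, 120, 3))
candidates = np.column_stack([amps.ravel(), freqs.ravel(), widths.ravel()])
mean, std = gpr.predict(candidates, return_std=True)

# Upper-confidence-bound heuristic: favor high predicted score plus
# uncertainty, so follow-up sessions are also informative
ucb = mean + 0.5 * std
best = candidates[np.argmax(ucb)]
print(f"suggested setting to test next: {best}")
</pre>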
<p>Each participant showed distinct patterns of gait improvement under different DBS configurations. The model successfully identified personalized settings that improved walking beyond what was achieved with standard clinical settings. One participant experienced an 18% improvement in walking performance with the new settings, while others showed smaller but meaningful gains.</p>
<p>“One surprising finding was the level of variability in how different DBS settings influenced gait. We expected some variation, but the extent to which each individual responded differently to specific stimulation patterns highlighted just how complex and patient-specific gait control is in Parkinson’s disease.”</p>
<p>In addition to behavioral data, the researchers analyzed neural signals from the brain during walking. They found that reductions in beta-band activity (a type of brain rhythm between 12–30 Hz) in the globus pallidus were consistently associated with better walking. These reductions were especially pronounced during specific phases of the gait cycle, such as when the opposite leg was bearing weight.</p>
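<p>As a rough illustration of what “beta-band activity” means in practice, the sketch below estimates the power spectrum of a simulated neural signal with Welch’s method and integrates it over 12–30 Hz; the study’s analysis, which locked these measures to phases of the gait cycle, is considerably more involved, and the sampling rate here is an assumption.</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; line-height:16px; background-color:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto; text-align:left;">
# Beta-band (12-30 Hz) power from a simulated recording via Welch's method.
import numpy as np
from scipy.signal import welch

fs = 250                      # sampling rate in Hz (hypothetical device rate)
t = np.arange(0, 30, 1 / fs)  # 30 seconds of recording
rng = np.random.default_rng(2)
# Simulated signal: a 20 Hz (beta) oscillation buried in noise
signal = np.sin(2 * np.pi * 20 * t) + rng.normal(0, 1.0, t.size)

freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)   # 0.5 Hz resolution
in_beta = np.logical_and(freqs &gt;= 12, freqs &lt;= 30)
beta_power = np.trapz(psd[in_beta], freqs[in_beta])
print(f"beta-band (12-30 Hz) power: {beta_power:.3f}")
</pre>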
<p>“The discovery that certain neural features were consistently associated with improved walking performance helped validate the potential of using brain signals to guide DBS therapy. These findings challenged the traditional one-size-fits-all approach and reinforced the importance of letting the brain guide therapy, highlighting the need for personalized, adaptive stimulation strategies informed by each patient’s neural activity.”</p>
<p>Importantly, while the precise patterns of improvement varied across individuals, the machine learning model was able to adapt to each person’s unique neural and motor responses. In follow-up periods, one participant voluntarily used the model-recommended settings for several hours each day, demonstrating the practicality of implementing these changes outside the lab.</p>
<p>The study highlights the importance of personalized DBS programming for addressing gait disturbances in Parkinson’s disease. Standard programming practices may not account for the complex and individual-specific brain dynamics involved in walking. By using real-time brain recordings and data-driven modeling, this approach offers a path toward more effective and targeted treatments.</p>
<p>“The most important takeaway is that brain stimulation for Parkinson’s disease can, and should, be personalized. Our study shows that gait improvement is not just about turning DBS on or off, but about finding the right settings [tailored to each individual’s] brain activity and walking patterns.”</p>
<p>“By modeling the relationship between stimulation parameters, neural activity, and gait performance, we demonstrate the potential for data-driven, individualized DBS therapies that go beyond standard approaches. This opens the door to more precise and effective treatment strategies, particularly for challenging symptoms like gait dysfunction that have not responded consistently to conventional DBS. This work lays the foundation for adaptive DBS systems that adjust therapy in real time, based on how the patient’s brain and body respond, bringing us one step closer to intelligent neuromodulation in everyday care.”</p>
<p>The study was limited by its small sample size—only three participants were included. While the results are promising, larger studies are needed to confirm the findings and assess how well this approach generalizes to more diverse patient populations. In addition, the study focused on straight walking; future work should explore how turning, freezing, and obstacle navigation respond to tailored stimulation.</p>
<p>“While our results are promising and highlight the potential of personalized DBS to improve gait in Parkinson’s disease, it is important to recognize that this study was conducted in a small cohort of patients. These findings do not imply that a single DBS setting will universally restore gait. Instead, the takeaway is that data-driven, individualized approaches, grounded in both neural signals and behavioral metrics, can offer a more systematic and responsive way to optimize therapy.”</p>
<p>The researchers hope to expand their work by incorporating longer walking trials, larger datasets, and more advanced algorithms. They envision future DBS systems that use neural biomarkers and machine learning to continuously adapt stimulation in real-time, improving mobility throughout the day. This could dramatically reduce the burden on patients and clinicians during programming sessions and improve quality of life for people with Parkinson’s disease.</p>
<p>“We hope this work lays the foundation for intelligent, adaptive DBS systems that continuously adjust therapy based on a patient’s real-time brain and movement signals. The identified gait-optimized DBS settings have the potential to inform future adaptive DBS designs and move clinical practice beyond static programming toward closed-loop systems that respond to each individual’s dynamic needs throughout the day. Additionally, the neural features associated with improved walking performance could support clinicians during programming visits, making the process more efficient and objective. Similar data-driven approaches could be extended to optimize treatment for a broader range of motor and non-motor symptoms.”</p>
<p>“This study represents a collaborative effort between clinicians, engineers, and neuroscientists, highlighting the value of interdisciplinary work in advancing personalized neuromodulation therapies. Our findings offer a step toward more adaptive DBS systems for gait dysfunction and illustrate the power of integrating neural signals, behavioral metrics, and machine learning to tailor treatments to individual needs.”</p>
<p>The study, “<a href="https://www.nature.com/articles/s41531-025-00990-5" target="_blank">Modeling and optimizing deep brain stimulation to enhance gait in Parkinson’s disease: personalized treatment with neurophysiological insights</a>,” was authored by Hamid Fekri Azgomi, Kenneth H. Louie, Jessica E. Bath, Kara N. Presbrey, Jannine P. Balakid, Jacob H. Marks, Thomas A. Wozny, Nicholas B. Galifianakis, Marta San Luciano, Simon Little, Philip A. Starr, and Doris D. Wang.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/larger-social-networks-associated-with-reduced-dementia-risk/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Larger social networks associated with reduced dementia risk</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 12th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A systematic review of studies on dementia and social networking found that individuals in better cognitive health tend to have larger and more integrated networks of social contacts. Those with more extensive social networks were more likely to demonstrate cognitive resilience and less likely to develop dementia. The paper was published in <a href="https://doi.org/10.1016/j.neuroscience.2025.04.019"><em>Neuroscience</em></a>.</p>
<p>Dementia is a condition marked by a significant decline in cognitive abilities that interferes with daily life. There are different types of dementia, but the most common is Alzheimer’s disease. People with dementia typically experience memory loss, particularly for recent events, while older memories often remain clearer for longer. They also tend to have difficulty with language, problem-solving, and reasoning. Changes in mood, personality, and behavior are common as the condition progresses.</p>
<p>Dementia results from damage to brain cells, which disrupts communication between them. Although it occurs more frequently in older adults, it is not considered a normal part of aging. Some types of dementia are caused by treatable conditions such as vitamin deficiencies or thyroid problems and can be reversed if detected early. However, most types are progressive and incurable, so treatment focuses on symptom management, slowing cognitive decline, and supporting both patients and caregivers.</p>
<p>Study author Faheem Arshad and his colleagues aimed to integrate findings from studies examining the relationship between social networking and dementia. Previous research has suggested that social networking—defined as the recognition and maintenance of meaningful social connections—may offer protection against cognitive decline. In other words, individuals with richer social networks were less likely to develop dementia. Components of these networks include marital status, the number of people one is in contact with, the frequency of those interactions, satisfaction with those relationships, and perceived support.</p>
<p>The authors analyzed the results of 17 studies published between 2000 and 2024. The mean age of participants ranged from 40 to 90 years. Six studies were conducted in the United States, three in Germany, two in the United Kingdom, and one each in China, France, Sweden, Ireland, Iceland, and India. Thirteen of the studies included follow-up periods ranging from one to fifteen years. In total, the review synthesized data from 20,678 participants.</p>
<p>The studies consistently reported that individuals with poorer social networks were more likely to develop dementia. In contrast, those with larger and more integrated social networks were less likely to be diagnosed with dementia during the follow-up periods.</p>
<p>Four longitudinal studies examined how qualities of social networks at a baseline time point were associated with cognitive decline or dementia onset over time. These studies found higher rates of cognitive decline among participants with smaller or less integrated networks. Some studies also reported that individuals with better social networks tended to show healthier brain structures over time. This was especially evident in the amygdala, a brain region involved in emotion and social behavior, which appeared to be better preserved in those with more robust social networks.</p>
<p>“Our systematic review suggests a strong association between poor SN [social networking] and increased risk of dementia and cognitive decline, especially in AD [Alzheimer’s disease] patients. Hence, larger, more integrated social networks contribute to cognitive resilience and reduced disease severity,” the study authors concluded.</p>
<p>The study contributes to the scientific knowledge about dementia. However, it should be noted that the design of the studies included in this analysis does not allow any definitive causal inferences to be derived. While it is possible that strong social networks protect against dementia, it is also possible that people in better cognitive health are simply better able to maintain social contacts, producing the observed associations.</p>
<p>The paper, “<a href="https://doi.org/10.1016/j.neuroscience.2025.04.019">Association between social networking and dementia: A systematic review of observational studies</a>,” was authored by Faheem Arshad, Deenadayalan Boopalan, Sonali Arora, Howard J. Rosen, and Suvarna Alladi.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-research-reveals-what-makes-self-forgiveness-possible-or-out-of-reach/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New research reveals what makes self-forgiveness possible or out of reach</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 12th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Some people struggle to forgive themselves because they remain stuck in cycles of guilt, shame, and self-condemnation that threaten their sense of identity and agency, according to a study published in <a href="https://doi.org/10.1080/15298868.2025.2513878"><em>Self &amp; Identity</em></a>.</p>
<p>Why do some people find it nearly impossible to forgive themselves, even years after a mistake? While self-forgiveness is considered a key step in emotional healing, the path to forgiving oneself can be deeply complex. Researchers have found that excessive guilt and shame are linked to many forms of psychological distress, including depression and anxiety, but the actual experience of being “stuck” in self-blame has been understudied. In the current research, Lydia Woodyatt and colleagues fill this gap by exploring the lived experiences of people who either had or had not been able to forgive themselves for a perceived wrongdoing.</p>
<p>The authors were particularly interested in a theoretical tension between two basic psychological needs: agency (the ability to control and influence one’s life) and social-moral identity (the need to see oneself as a good person). Previous studies suggest that failing to meet either of these needs can block self-forgiveness. For instance, people may avoid taking responsibility in order to protect their moral identity, but doing so can leave them feeling powerless. Conversely, accepting full responsibility may preserve a sense of agency but lead to overwhelming shame. To understand how these conflicting needs play out in real life, the authors adopted a qualitative, narrative-based approach.</p>
<p>The researchers recruited a diverse community sample of 80 adults across the United States using Amazon Mechanical Turk. Participants were prompted to recall and describe either a time they were able to forgive themselves for wrongdoing, or a time they were unable to do so. Of the final group, 41 participants described being unable to forgive themselves, while 39 described being able to. The events spanned a range of contexts, including interpersonal betrayals, personal failures, harm to others, and more. The authors also collected demographic information such as age (21-79 years), gender (53.8% female), and ethnicity (68.8% White).</p>
<p>Participants answered a series of open-ended questions, including prompts such as why they felt the need to forgive themselves, what strategies they used to try to do so, and how they felt about the event now. The average response time was longer for those unable to self-forgive (10 minutes) compared to those who had self-forgiven (7 minutes), suggesting greater cognitive and emotional complexity. Researchers conducted a reflexive thematic analysis, supplemented by inter-rater reliability checks, to identify key psychological patterns in the responses.</p>
<p>Four central themes emerged. First, participants who could not forgive themselves often described the event as if it were still happening. The past felt vividly present and replayed in their minds with intense emotional weight. These individuals remained “stuck,” feeling as if they had not moved forward in life. In contrast, those who had self-forgiven emphasized a shift in focus toward the future. They still acknowledged regret, but no longer felt consumed by it, and described an active decision to release the emotional hold of the past.</p>
<p>Second, issues of personal agency emerged as a defining feature. Those unable to self-forgive frequently alternated between feeling responsible and trying to deny or downplay their role. This tension was especially acute when the wrongdoing involved caring for others, such as in cases of parental regret, accidental harm, or even after being victimized themselves. In contrast, those who were able to forgive themselves accepted both responsibility and their human limitations. For them, self-forgiveness did not mean letting themselves off the hook but recognizing what they could and could not control.</p>
<p>A third theme involved social-moral identity. Participants who remained stuck in self-condemnation often described feeling incompatible with their own moral self-image. They questioned whether they were “good” people and sometimes resorted to self-punishment to reinforce this internal conflict. Meanwhile, participants who had forgiven themselves tended to accept that being a good person could coexist with having made mistakes. They reframed the experience as a lesson and sometimes even used it to recommit to important values, such as being a better parent or friend.</p>
<p>Finally, coping strategies differed between the two groups. Those who had not self-forgiven typically used avoidance, trying to distract themselves or suppress painful thoughts. While this offered short-term relief, it often prolonged emotional distress. On the other hand, those who had forgiven themselves described a painful but productive process of “working through” their guilt. This included allowing themselves to feel the full emotional impact of their actions, talking with others, and making meaning from the event. For them, the goal wasn’t just to feel better, but to understand themselves better.</p>
<p>The authors note that their analysis was shaped by their own theoretical lenses and that participant responses may have been influenced by how questions were framed. The sample was also limited to English-speaking U.S. adults, which may not capture cultural variations in guilt, shame, and forgiveness.</p>
<p>This study highlights that self-forgiveness often requires navigating complex psychological tensions and engaging deeply with one’s values, emotions, and identity. By understanding these conflicts, clinicians and researchers can better support individuals who are stuck in self-blame.</p>
<p>The research, “<a href="https://doi.org/10.1080/15298868.2025.2513878">What makes self-forgiveness so difficult (for some)? Understanding the lived experience of those stuck in self-condemnation</a>,” was authored by Lydia Woodyatt, Melissa de Vel-Palumbo, Anna Barron, Christiana Harous, Michael Wenzel, and Shannon de Silva.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/one-in-four-people-with-mood-disorders-show-internal-circadian-misalignment/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">One in four people with mood disorders show internal circadian misalignment</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 12th 2025, 11:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Young people with emerging mood disorders often show signs of internal circadian misalignment—a disruption in the timing between key biological rhythms—and those with greater misalignment tend to report more severe depressive symptoms. That’s according to new research published in the <em><a href="https://doi.org/10.1177/07487304251349408" target="_blank" rel="noopener">Journal of Biological Rhythms</a></em>, which used lab-based physiological measurements to examine how circadian signals like melatonin, cortisol, and core body temperature interact in the context of mental health.</p>
<p>Circadian rhythms are biological cycles that operate on a roughly 24-hour schedule. They govern various physiological and behavioral processes such as sleep, hormone secretion, body temperature, and alertness. These rhythms are largely regulated by the brain’s internal clock, located in the hypothalamus, which synchronizes with environmental cues like light and darkness.</p>
<p>When circadian rhythms function in harmony, processes like falling asleep and waking up tend to follow predictable patterns. But when these rhythms become misaligned—either with each other or with the external environment—it can affect sleep quality, energy levels, and even mental health. In recent years, scientists have become increasingly interested in how disruptions in circadian timing might be involved in mood disorders such as depression.</p>
<p>Previous research has linked circadian dysfunction to mood problems, but most studies have focused on individual markers—such as melatonin onset or sleep-wake cycles—rather than looking at the system as a whole. A few small studies have suggested that the timing between different biological rhythms may be out of sync in people with depression. However, these studies have been limited by small sample sizes and a narrow focus on just two markers at a time.</p>
<p>The current study was designed to test whether young people with mood disorders tend to show “internal misalignment”—that is, mismatches in the timing of several biological rhythms within the body. The researchers also wanted to know whether greater degrees of misalignment were associated with more severe depressive symptoms.</p>
<p>“We know from previous research that there is a lot of evidence for links between mood disorders (like depression and bipolar disorder) and disturbances in circadian rhythms (i.e. the 24-hour ‘body clock’), including abnormal sleep-wake patterns, altered energy and fatigue, and differences in 24-hour rhythms of things like hormone secretion (such as melatonin and cortisol) and core body temperature,” said study author Joanne Carpenter, a research fellow at the Youth Mental Health and Technology Team at the Brain and Mind Centre at the University of Sydney.</p>
<p>“However, most of the previous research in this area looks at how well the body clock is aligned with the environment around us. For instance, the hormone melatonin (the ‘darkness hormone’) usually starts being produced by our bodies a couple of hours before our normal bedtime, but in those with mood disorders, this might occur later than normal, putting them ‘out of sync’ in the same way that we might get out of sync by travelling across time zones.”</p>
<p>“What intrigued us was that the findings were not always consistent across studies or in studies that used different markers as outputs of the body clock. Most of the research on the body clock in mood disorders has only really looked at the timing of one biological measure at a time, so we were interested in looking at multiple outputs of the body clock to see whether the timing of these rhythms was just out of sync with the environment, or also out of sync with each other.”</p>
<p>The researchers collected data from 69 young people (ages 16 to 35) who were seeking mental health care and compared them to 19 healthy control participants. The group with mood disorders included individuals with depression, bipolar disorder, anxiety, or other psychiatric conditions. Participants were excluded if they had sleep disorders or neurological conditions, had recently traveled across time zones, or were using certain medications.</p>
<p>Each participant underwent multiple assessments. They wore wrist-based actigraphy monitors for several days to record sleep-wake patterns. Then, they spent a night in a sleep lab, where researchers tracked the timing of three key circadian markers: the onset of melatonin secretion under dim light conditions, the peak of salivary cortisol levels after waking, and the lowest point of core body temperature. These are all known to be regulated by the body’s internal clock. Participants also completed clinical assessments, including the Hamilton Depression Rating Scale to measure the severity of depressive symptoms.</p>
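<p>For readers curious about the mechanics, dim-light melatonin onset is typically estimated by sampling saliva at regular intervals through the evening and interpolating the time at which the melatonin concentration crosses a fixed threshold. The article does not describe the study's exact procedure, so the short Python sketch below is illustrative only; the 4 pg/mL threshold and the sample values are assumptions, not figures from the study.</p>
<pre>
# Minimal sketch: estimating dim-light melatonin onset (DLMO) from serial
# saliva samples by linear interpolation at a fixed threshold. The 4 pg/mL
# threshold and the sample values below are illustrative assumptions, not
# figures from the study.

def estimate_dlmo(times_h, melatonin_pg_ml, threshold=4.0):
    """Clock time (hours) when melatonin first rises above `threshold`,
    linearly interpolated between the two bracketing samples."""
    for i in range(len(times_h) - 1):
        c0, c1 = melatonin_pg_ml[i], melatonin_pg_ml[i + 1]
        if threshold > c0 and c1 >= threshold:
            t0, t1 = times_h[i], times_h[i + 1]
            return t0 + (threshold - c0) / (c1 - c0) * (t1 - t0)
    return None  # threshold never crossed during sampling

# Half-hourly samples from 19:00 to 22:00 (hypothetical values).
times = [19.0, 19.5, 20.0, 20.5, 21.0, 21.5, 22.0]
levels = [1.2, 1.8, 2.5, 3.1, 4.9, 8.4, 12.0]
print(estimate_dlmo(times, levels))  # 20.75, i.e. about 20:45
</pre>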
<p>Using this data, the researchers calculated “phase angles”—the time differences between each pair of rhythms (for example, how many hours after melatonin onset the core body temperature reached its lowest point). They defined internal circadian misalignment as any phase angle that fell more than two standard deviations away from the average values found in the control group.</p>
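<p>The article does not reproduce the researchers' code, but the phase-angle logic it describes is straightforward. The sketch below shows one way it could be computed, assuming marker times in clock hours (with times past midnight expressed as values above 24 so differences stay well-defined); the marker names, control-group statistics, and example values are hypothetical.</p>
<pre>
# Minimal sketch of the phase-angle logic described above. Marker names,
# control statistics, and example values are hypothetical.

def phase_angles(markers):
    """Pairwise timing differences (hours) between circadian markers."""
    keys = list(markers)
    return {
        (a, b): markers[b] - markers[a]
        for i, a in enumerate(keys)
        for b in keys[i + 1:]
    }

def is_misaligned(participant, control_mean, control_sd, k=2.0):
    """True if any phase angle falls more than k SDs from the control mean."""
    angles = phase_angles(participant)
    return any(
        abs(angles[pair] - control_mean[pair]) > k * control_sd[pair]
        for pair in angles
    )

# Hypothetical control statistics for one pair of markers: hours from
# melatonin onset to the core body temperature minimum.
ctrl_mean = {("melatonin_onset", "temp_minimum"): 7.0}
ctrl_sd = {("melatonin_onset", "temp_minimum"): 1.0}

# Melatonin onset at 21:30; temperature minimum at 07:00 the next morning.
participant = {"melatonin_onset": 21.5, "temp_minimum": 31.0}
print(is_misaligned(participant, ctrl_mean, ctrl_sd))  # True: 9.5 h vs 7 +/- 2
</pre>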
<p>About 23% of the young people with mood disorders showed signs of internal circadian misalignment. These individuals had unusual timing relationships between at least one pair of biological markers—most commonly involving melatonin and core body temperature.</p>
<p>Interestingly, those with internal misalignment did not differ from other participants in terms of diagnosis, medication use, or overall sleep duration. However, they did tend to show later melatonin onset times relative to other rhythms, suggesting a delayed signal from one part of the circadian system.</p>
<p>Across the full group of young people with mood disorders, the researchers found that certain types of misalignment were associated with more severe depressive symptoms. Specifically, those whose core body temperature dropped earlier in the night—relative to melatonin onset, cortisol peak, or sleep timing—tended to report higher levels of depression. These associations held even after accounting for age and sex.</p>
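<p>Because the associations held after accounting for age and sex, the analysis implies a covariate-adjusted model. The sketch below illustrates one plausible approach using ordinary least squares in statsmodels; the study's actual model may differ, and the variable names and simulated data here are placeholders rather than anything from the paper.</p>
<pre>
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Minimal sketch of a covariate-adjusted association test using ordinary
# least squares; the study's actual model may differ, and all variable
# names and the simulated data below are placeholders.

rng = np.random.default_rng(0)
n = 69
df = pd.DataFrame({
    # Phase angle: hours from melatonin onset to the temperature minimum;
    # a smaller value means an earlier temperature drop.
    "melatonin_to_tempmin_h": rng.normal(7.0, 1.5, n),
    "age": rng.integers(16, 36, n),
    "sex": rng.choice(["F", "M"], n),
})
# Simulated outcome: earlier temperature drops (smaller phase angles)
# go with higher depression scores.
df["hamilton_depression"] = (
    10 - 1.2 * (df["melatonin_to_tempmin_h"] - 7.0) + rng.normal(0, 3, n)
)

model = smf.ols(
    "hamilton_depression ~ melatonin_to_tempmin_h + age + C(sex)", data=df
).fit()
print(model.params)  # phase-angle coefficient, adjusted for age and sex
</pre>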
<p>“This highlights that — at least for some people — a poorly synchronised clock may be relevant to their mood, and so it may be really important for us to look at how we can target this in treatment and prevention strategies,” Carpenter told PsyPost.</p>
<p>While the researchers initially hypothesized that delayed rhythms would be most strongly linked to depression, they found that earlier-than-expected temperature drops may also be involved. This suggests that disruptions in either direction—whether rhythms are too early or too late—can contribute to mood disturbances. The specific patterns of misalignment varied widely across individuals, highlighting the complexity of circadian disruptions in mood disorders.</p>
<p>“We were interested to find that the specific nature of the circadian abnormalities was not the same for everyone with mood disorders, with a lot of variation in individual patterns (e.g., one individual may have a late melatonin rhythm and an early temperature rhythm, whereas another may have the opposite),” Carpenter said. “This challenges us to think more about what the different causes might be for clocks to get out of sync in different ways, and whether we might need different approaches to correcting specific circadian problems.”</p>
<p>The study offers some of the strongest evidence to date that internal circadian misalignment may play a role in mood problems among young people. Still, several limitations suggest the findings should be interpreted with caution.</p>
<p>For instance, the study was cross-sectional, meaning it only provides a snapshot in time. It cannot determine whether circadian misalignment causes depression, results from it, or whether both are influenced by other factors. Longitudinal studies will be needed to clarify whether correcting circadian misalignment can improve mental health outcomes.</p>
<p>“We only measured these circadian rhythms and mood symptoms at one point in time, so we can’t say from this whether there is any causal relationship with the internal jet lag leading to the mood symptoms or vice versa,” Carpenter noted. “We also don’t yet know much about whether those who are out of sync can be re-synchronised with specific treatments or if this would help these individuals to have better mood outcomes.”</p>
<p>The lab-based methods used to measure circadian rhythms are not easily scalable for widespread clinical use. Each participant had to spend a night in a dimly lit lab while providing saliva samples and wearing temperature sensors. While this approach provides high-quality data, it is labor-intensive and difficult to replicate outside of research settings.</p>
<p>“Measuring circadian rhythms in this way (an in-lab overnight study) takes a lot of time and resources and can be quite a burden to participants,” Carpenter said. “There is a lot of promise for digital and wearable innovation (e.g., activity tracking watches, mobile apps) that may lead to easier or better ways to study circadian rhythms. We are excited to see these develop and hope that it may help this research to be better translated to real-world applications.”</p>
<p>Despite these caveats, the findings from this study support the idea that internal circadian misalignment is common in young people with mood disorders and tends to be linked to greater depressive symptoms. While the exact patterns of misalignment vary, the results indicate that disrupted timing between biological rhythms—especially melatonin, cortisol, and core body temperature—may play an important role in emotional well-being.</p>
<p>“We hope that this will lead to further investigation of how internal circadian alignment changes over time in those with mood disorders — do people become more in or out of sync over time, what drives those changes, and how does it relate to their mood?” Carpenter explained. “Ultimately, we hope that this will lead to avenues for better identifying those with circadian disturbances and informing how circadian-focused strategies or treatments can help improve our approaches to prevention and intervention.”</p>
<p>The study, “<a href="https://doi.org/10.1177/07487304251349408" target="_blank" rel="noopener">Evidence for Internal Misalignment of Circadian Rhythms in Youth With Emerging Mood Disorders</a>,” was authored by Joanne S. Carpenter, Jacob J. Crouse, Mirim Shin, Emiliana Tonini, Gabrielle Hindmarsh, Zsofi de Haan, Frank Iorfino, Rebecca Robillard, Sharon Naismith, Elizabeth M. Scott, and Ian B. Hickie.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>