<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/religious-belief-linked-to-lower-anxiety-and-better-sleep-in-israeli-druze-study/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Religious belief linked to lower anxiety and better sleep in Israeli Druze study</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 11th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A study of individuals from the Israeli Druze community found that religious individuals tend to report better sleep quality and experience less anxiety. Statistical modeling suggests that religiosity may reduce anxiety, which, in turn, contributes to better sleep quality. The paper was published in the <a href="https://doi.org/10.1111/jsr.70055"><em>Journal of Sleep Research</em></a>.</p>
<p>Sleep is a natural and essential biological process that allows the body and brain to rest, repair, and restore. It plays a critical role in memory consolidation, emotional regulation, immune function, and overall health. Good sleep quality supports attention, learning, and mood stability throughout the day. Poor sleep quality, on the other hand, can lead to fatigue, irritability, reduced concentration, and an increased risk of physical and mental health problems.</p>
<p>Poor sleep quality can include difficulty falling or staying asleep, non-restorative sleep (sleep after which one does not feel refreshed), or frequent awakenings. Factors that can affect sleep quality include stress, anxiety, irregular sleep schedules, caffeine or alcohol use, medical conditions, and environmental disturbances like noise or light. Chronic poor sleep can impair immune function, raise the risk of cardiovascular disease, and contribute to depression or anxiety.</p>
<p>Study author Najwa Basis and her colleagues aimed to test the Social-Ecological Model of Sleep Health, as well as a conceptual model proposing that religious involvement might influence sleep quality. The Social-Ecological Model of Sleep Health is a theoretical framework suggesting that sleep is shaped by multiple interacting influences, ranging from individual behaviors to broader social and cultural factors. The authors hypothesized that religiosity may influence anxiety and depression, which in turn affect sleep.</p>
<p>Participants included 233 individuals recruited from the Druze villages of Daliat El-Carmel and Usfiyya on Mount Carmel, Israel. The Druze in Israel are a unique religious minority who follow a monotheistic faith that evolved from Islam but is now distinct. They speak Arabic and are known for their loyalty to the country they live in. In Israel, they are the only Arab group subject to mandatory military conscription.</p>
<p>Participants reported whether they identified as religious (Uqqal) or non-religious (Juhal), how frequently they participated in religious events, and how often they visited religious venues. They also completed assessments of religious orientation (using the Religious Life and Orientation Scale), anxiety and depression (using the Hospital Anxiety and Depression Scale, or HADS), and filled out a two-week self-report sleep diary (the Consensus Sleep Diary).</p>
<p>Of the 233 participants, 93 identified as religious and 140 as non-religious. The average participant age was approximately 37 years. About 23% had borderline anxiety levels, and 12% had clinical levels of anxiety. Anxiety was significantly higher among non-religious participants. However, there were no significant differences in depression levels between the two groups.</p>
<p>Thirteen percent of participants reported poor sleep quality, while 39% reported good sleep quality. Religious participants tended to report significantly better sleep than their non-religious peers. The researchers tested a statistical model suggesting that religiosity reduces anxiety, which in turn improves sleep quality. Analyses supported the possibility of this relationship.</p>
<p>“Our results suggest that mental health factors, particularly anxiety, may play a significant role in the relationship between religiosity, religious orientation, and sleep health. The findings of this research provide crucial insights into the Druze community in Israel and contribute to the existing literature by linking the deeper aspects of religiosity to sleep outcomes,” the study authors concluded.</p>
<p>The study contributes to the scientific understanding of the links between religiosity and mental health. However, it should be noted that the participants were exclusively Israeli Druze; results may differ for other religious and cultural groups.</p>
<p>The paper, “<a href="https://doi.org/10.1111/jsr.70055">Religiosity, Religious Orientation, and a Good Night’s Sleep: The Role of Anxiety and Depression,</a>” was authored by Najwa Basis, Lital Keinan Boker, and Tamar Shochat.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-common-vegetable-may-counteract-brain-changes-linked-to-obesity/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A common vegetable may counteract brain changes linked to obesity</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 11th 2025, 09:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://www.sciencedirect.com/science/article/abs/pii/S0006899325002008" target="_blank">Brain Research</a></em> suggests that adding okra to the diet can protect against long-term metabolic problems caused by early-life overfeeding. In a rat model, okra supplementation prevented obesity, high blood sugar, and brain inflammation associated with being overnourished during infancy.</p>
<p>Okra, also known as <em>Abelmoschus esculentus</em>, is a vegetable rich in antioxidants, fiber, and plant compounds believed to have anti-inflammatory and metabolic benefits. While it’s often consumed for its nutritional value, researchers are increasingly interested in its potential to reduce the risk of chronic diseases. The authors of this new study set out to test whether including okra in the diet could protect against metabolic disorders that begin during early development.</p>
<p>The rationale behind the study stems from concerns about early overnutrition, which has been linked to long-term health problems. When young animals or humans consume more calories than needed during critical periods of development, they can experience lasting changes to their metabolism, body composition, and brain signaling pathways involved in hunger and energy use. One known consequence of early overfeeding is the development of obesity and insulin resistance in adulthood. The researchers aimed to explore whether supplementing the diet with okra could prevent these outcomes by improving inflammation and restoring healthy energy balance in the brain and body.</p>
<p>To test this idea, the researchers used a well-established model of early overfeeding in rats. Shortly after birth, some litters were reduced to just three pups per mother, ensuring that each infant rat had more access to milk and gained weight more rapidly. This small litter (SL) condition is known to simulate early overnutrition. A separate group of litters was maintained at the typical eight pups per mother, serving as the normal litter (NL) control group.</p>
<p>At three weeks of age, all rats were weaned and assigned to one of two diets: a standard rodent diet or the same diet supplemented with 1.5% okra. This produced four groups in total: normal-litter rats on a standard diet (NL-SD), normal-litter rats on the okra diet (NL-AE), small-litter rats on a standard diet (SL-SD), and small-litter rats on the okra diet (SL-AE). The animals continued on these diets until they reached adulthood at 100 days of age.</p>
<p>Throughout the study, the researchers tracked body weight, food and water consumption, blood sugar levels, fat accumulation, and muscle mass. They also measured insulin sensitivity both in the body and the brain, and examined levels of inflammation-related molecules in the hypothalamus, a brain region that regulates appetite and energy balance.</p>
<p>As expected, the small-litter rats on the standard diet became obese and showed multiple signs of metabolic dysfunction. These rats ate more food, had higher blood sugar and fat levels, and gained more fat mass compared to their normal-litter peers. They also displayed impaired glucose tolerance and signs of insulin resistance. In addition, these animals had elevated levels of inflammatory molecules—such as tumor necrosis factor alpha, interleukin-6, and interleukin-1 beta—in the hypothalamus. These molecules are known to interfere with insulin signaling and promote weight gain and metabolic disease.</p>
<p>The rats that were overfed early in life but received the okra-supplemented diet showed a very different profile. Despite having the same early nutritional background, they avoided the rise in blood sugar, triglycerides, and cholesterol seen in their standard-fed counterparts. Their total fat mass was reduced, and their muscle mass improved. They also had better results on a glucose tolerance test, suggesting improved control of blood sugar. </p>
<p>Importantly, their brains showed lower levels of inflammatory markers, and they responded to insulin administered directly into the brain—something the overfed rats on a standard diet could not do. This improvement in brain insulin sensitivity was linked to reduced food intake.</p>
<p>The rats in the normal-litter groups, whether they received the okra diet or not, showed no significant differences in weight, fat, glucose, or inflammation, suggesting that okra did not produce major changes in healthy animals, but specifically helped those who were metabolically impaired.</p>
<p>The researchers believe that compounds found in okra—such as catechins, quercetin, and other phenolics—may help explain the observed benefits. These plant-based substances are known to have antioxidant and anti-inflammatory properties, and previous research has suggested they can improve insulin signaling and reduce blood sugar levels. Although the exact mechanisms remain unclear, the results support the idea that functional foods like okra can influence long-term health when introduced early in life.</p>
<p>This study adds to growing evidence that diet in early life has lasting consequences on health and that certain foods may offer protective effects. The findings are especially relevant given the increasing rates of childhood obesity worldwide and the difficulty of reversing its effects in adulthood. While this experiment was conducted in rats and may not directly translate to humans, it raises the possibility that early dietary interventions with plant-based compounds could help prevent or manage metabolic disorders.</p>
<p>However, the authors acknowledge some limitations. They did not examine insulin production or hormone levels in the pancreas, which would have provided more information about how okra affects insulin regulation. They also did not study whether okra directly influences leptin signaling, another important regulator of energy balance. More research is needed to clarify how specific compounds in okra interact with metabolic pathways and whether similar effects can be observed in human populations.</p>
<p>Nevertheless, the study provides evidence for the potential of okra as a dietary supplement to reduce the long-term harm of early overnutrition. By improving both peripheral and brain insulin sensitivity, reducing inflammation, and restoring healthier body composition, okra may serve as a useful component of non-drug strategies to combat obesity and its associated health risks. Future studies in humans will be needed to determine whether similar effects can be achieved and whether okra-based interventions could become part of broader public health approaches to improve lifelong metabolic health.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.brainres.2025.149641" target="_blank">Okra-supplemented diet prevents hypothalamic inflammation in early overfeeding-programmed obese rats</a>,” was authored by Camila Luiza Rodrigues dos Santos Ricken, Ginislene Dias, Ingridys Regina Borkenhagen, Adriano Nicoli Roecker, Gisele Facholi Bomfim, Hercules de Oliveira Costermani, Aline Milena Dantas Rodrigues, Nathalia Macedo Sanches, Ester Vieira Alves, Ricardo de Oliveira, and Júlio Cezar de Oliveira.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/massive-psychology-study-reveals-disturbing-truths-about-machiavellian-leaders/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Massive psychology study reveals disturbing truths about Machiavellian leaders</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 11th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in the <em><a href="https://doi.org/10.1002/job.2877" target="_blank">Journal of Organizational Behavior</a></em> sheds light on the ongoing debate over whether Machiavellian tactics help or hurt leaders in the workplace. Analyzing data from over half a million participants, researchers found that Machiavellian leaders tend to create toxic environments for followers, yet are not consistently rewarded or penalized for their behavior. The results suggest that these leaders are often seen as abusive and manipulative, but their success or failure depends heavily on context. </p>
<p>People high in the personality trait of Machiavellianism tend to believe that others are malicious or incompetent, which leads them to use deceit and exploitation to achieve their goals. Some argue that Machiavellianism is fundamentally harmful for leadership. Others suggest that it offers a strategic advantage, especially in competitive or high-stakes environments. The new research attempts to move beyond this binary by offering a more nuanced theoretical model that considers when and how Machiavellian leaders succeed or fail.</p>
<p>“Much discourse on Machiavellian leadership presents one-sided perspectives,” explained study author Alexander R. Marbut, a PhD candidate at The University of Alabama. “In best practice books on leadership and among some economists, Machiavellians are presented as pragmatic realists who do what is necessary to ensure their organization or nation’s survival. In psychology and business ethics, they are presented as evil masterminds.” </p>
<p>“This is no surprise because they have been viewed throughout history as being one or the other. We strove to provide a balanced perspective that moves optics on the topic away from polarized, sensationalistic reasoning. Machiavellianism entails a question most leaders struggle to answer: Is it better to be feared than loved, given the vulnerability that comes with trying to be loved?”</p>
<p>To investigate this, Marbut and his colleagues conducted a meta-analysis of 163 independent samples, including data from 510,925 participants. They focused on 15 leadership-related outcomes, ranging from leadership style and effectiveness to follower performance and well-being. The researchers synthesized decades of research across psychology, economics, and organizational behavior, bringing together findings that had previously been scattered or siloed. The authors also considered differences between self-reports and evaluations from peers, subordinates, and supervisors, as well as the influence of social skill and tenure.</p>
<p>The results showed that leaders high in Machiavellianism tend to, on average, adopt leadership styles that are seen as inauthentic, inconsiderate, and morally nonconventional. They were also less likely to develop others or support their careers, or to articulate clear expectations for followers. </p>
<p>Followers of Machiavellian leaders reported significantly worse experiences across multiple domains. These leaders were strongly associated with lower-quality relationships with followers, higher levels of abusive supervision, lower follower job satisfaction, and greater burnout. They were also associated with reduced citizenship behaviors—voluntary acts that benefit others or the organization—and increased deviant behaviors, such as rule-breaking or sabotage. In one of the most striking findings, Machiavellianism explained nearly half of the variance in perceived relationship quality and more than a quarter of the variance in abusive supervision.</p>
<p>“On average and as indexed by multiple metrics, Machiavellians are seen as abusive and without any intention to develop or proactively lead followers, and they may even stand in the way of performance by providing directions ambiguous enough that they can take credit for wins while blaming followers for failures,” Marbut said.</p>
<p>Interestingly, the study found that Machiavellians were more likely to see themselves as “unethical” and inconsiderate, suggesting that they do not pretend to follow traditional norms but instead reject them outright. Their moral outlook appears to be rooted in a belief that the world is inherently competitive and dangerous, and that survival depends on anticipating betrayal and exploiting opportunities before others do.</p>
<p>“Contrary to popular rhetoric, they openly admit to such tactics, as well as to rejecting traditional ethics,” Marbut explained.</p>
<p>But Machiavellian leaders were not reliably penalized. Findings as to whether they attain leadership status, are viewed as leader-like, or are seen as effective or high-performing leaders were inconsistent. This suggests that such leaders are neither consistently promoted nor sidelined, but rather operate in a gray zone where their outcomes depend on how they manage others’ perceptions and navigate organizational politics.</p>
<p>“Despite the above, they are no less likely than anyone else to be seen as high in leader potential, to be placed in leadership roles, or to be seen as effective leaders by their bosses and coworkers,” Marbut said. “Contrary to existing theorizing, their reputations remain stable over time. This is why I emphasize that their success depends on the group accepting their vision: faking traditional ethics would foster cultures contrary to the very beliefs they are convinced by.”</p>
<p>Context played a major role in shaping the outcomes associated with Machiavellian leaders. The researchers found that tenure moderated how these leaders were perceived. Contrary to some predictions, high-tenure Machiavellians tended to be seen as “ethical”, possibly because they managed to reshape the norms of their organizations or convince others of the legitimacy of their worldviews.</p>
<p>Social skill also mattered. Machiavellian leaders with low self-awareness tended to be seen as more “ethical”, perhaps because they avoided drawing attention to their more cynical, nontraditional beliefs. However, highly self-aware Machiavellians were more likely to be seen as “unethical”, perhaps due to feedback and conflict making those beliefs obvious to others.</p>
<p>Marbut explained that socioanalysts offer two kinds of guidance: advice for people as actors and as observers. As actors, individuals should understand that Machiavellianism is a dark trait present in everyone to varying degrees. “Engaging one’s own Machiavellianism is like playing with fire: for leaders, it carries the risk of leading followers to despise you with little to show for it in the way of economic gain.” If those around you begin to seem “traitorous, foolish, or lazy,” gather information before acting—you may be right, or “you might find that your mind was playing tricks on you,” Marbut said. “You might feel that it is safer to distrust out of caution, but bear in mind that if you are wrong you might gain a reputation as a defector or abuser, which can lead the collective to turn on you (this is Dawkins’s ‘conspiracy of the doves’: acting as a hawk to ‘play it safe’ is only safe until the doves collectively swarm on you).”</p>
<p>As observers, Marbut continued, “people should be aware of symptoms of Machiavellianism in others and view them as ‘yellow flags.’ Look for consistent signs, not isolated incidents. The first mark to look for is cynicism: the raised eyebrow and sardonic grin; putting others down as though doing so makes them look better; hyperactive risk radars.</p>
<p>“The second mark to look out for is criticism of traditional virtues, although it may often be couched as pragmatism: either way, actions such as kindness, honesty, and humility are framed as being ‘weak’ because ‘it is a dog eat dog world,'” Marbut continued. “While some Machiavellians are astute at helping others take off their rose colored glasses, the worst among them are prone to cruelty and cowardice, from selling secrets to competitors to ruining intergroup relations because they can’t bring themselves to trust others. Our results show that peers and supervisors tend to not be aware of how vicious worst cases can be but that they make followers’ lives miserable: subordinate interviews may be notably helpful in separating the grain from the chaff, but only if anonymity is absolutely guaranteed.”</p>
<p>“If you are working under a ‘bad’ Machiavellian, get out as soon as possible: they will ruin your career and well-being. Until you can get out, keep all communication formal and work-related. Do not try to be friends with them, as they will see it as insincere and trust you less. You will never gain their approval: the most you can hope for is less disapproval.”</p>
<p>“It may help if you base anything you say to them on facts, data, and logic,” Marbut advised. “If they think for a moment that you are speaking idealistically, they will tune out anything else you say: they view thinking about how the world should be as unrealistic, so only talk to them about what is and what can be done about it. Lastly, do not ever let them think that you lack devotion: they will assume that you are lazy and selfish, so you do not want to give them ‘proof’ of either.”</p>
<p>The findings also challenge the usefulness of oversimplified labels. The researchers argue that reducing Machiavellianism to a single behavior like manipulation misses the broader psychological and behavioral patterns that define the trait. They advocate for a more comprehensive view that incorporates cynical thinking, grim and egotistic emotions, and a tendency toward aggression.</p>
<p>In this view, Machiavellians are not just amoral deceivers — a trait they share with psychopathic individuals — they have a coherent worldview that prioritizes self-preservation.</p>
<p>“A second motive for this project was to conduct a thorough review of what Machiavellianism is due to arguments that it is the same as psychopathy,” Marbut said. “The result was the conclusion that the worst Machiavellians and psychopaths pose many of the same risks: psychologically damaging abuse and criminal activity. But the former is risk-averse and unrewarding to be around, and the latter is risk-prone and fun to be around (when they want to be). Of note, my advice for dealing with Machiavellian bosses is known to backfire on psychopaths, which is why I emphasize the need to be mindful of the type of social manipulator you are dealing with.”</p>
<p>“If you are actually interacting with Machiavellians, you are much more likely to confuse them with paranoid or obsessive folks who likewise walk around with dark clouds over their heads, but they do so for different reasons and so pose different benefits and risks to their groups. This is why I argue that organizational scientists and practitioners should be talking about the broader spectrum of dark traits. If we don’t know how to tell them apart, how can we have any hope of dealing with them in ourselves or others?”</p>
<p>As with all research, there are some caveats to consider. Although the meta-analysis included a large and diverse set of data, it relied heavily on subjective measures, which can introduce bias. The findings also may not apply equally across all industries, cultures, or levels of leadership. The authors note that more research is needed to understand how Machiavellianism functions in specific organizational contexts, especially in competitive fields where strategic cunning may be more accepted or rewarded.</p>
<p>“Most of the effects reported were highly conditional, and so the correlations noted above should be read in terms of Machiavellian leaders’ outcomes ‘on average,'” Marbut told PsyPost. “In some contexts, some Machiavellians have good relationships with their subordinates or help financial performance. This is why I emphasized for actors the phrase ‘playing with fire’ and for observers the need to ‘separate the grain from the chaff.'”</p>
<p>“It is not surprising that the effects of Machiavellianism, or any dark trait, would be conditional: dark traits exist for a reason, and so some ‘positive’ outcomes should be expected. At the same time, people tend to not be aware of their own or others’ dark traits, and so it is no surprise that the average person would be poor at managing their own Machiavellianism or detecting unhealthy Machiavellianism in others.”</p>
<p>Looking ahead, Marbut plans to explore how to help organizations better understand, manage, and reduce the risks associated with dark personality traits.</p>
<p>“I have two research trajectories, both focused on dark traits generally but for now on Machiavellianism and psychopathy as those are where my expertise runs deepest,” Marbut explained. “The first is human resources oriented, focused on understanding how organizations can monitor levels of Machiavellianism and psychopathy in their workforces and alter their policies to avoid the scandals both traits are known for without missing out on the benefits of both traits that only experts on them talk about.” </p>
<p>“The second is people oriented, focused on how employees can be trained to recognize and manage dark traits in themselves and others. I am conducting a study now on how leaders can manage Machiavellian and psychopathic followers, and in particular whether tactics that work on one risk backfiring on the other.”</p>
<p>The study, “<a href="https://doi.org/10.1002/job.2877" target="_blank">In the Service of the Prince: A Meta-Analytic Review of Machiavellian Leadership</a>,” was authored by A. R. Marbut, P. D. Harms, and M. Credé.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/dementia-your-lifetime-risk-may-be-far-greater-than-previously-thought/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Dementia: Your lifetime risk may be far greater than previously thought</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 10th 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>The probability of any American having dementia in their lifetime may be far greater than previously thought. For instance, a 2025 study that tracked a large sample of American adults across more than three decades found that their average likelihood of <a href="https://doi.org/10.1038/s41591-024-03340-9">developing dementia between ages 55 and 95 was 42%</a>, and that figure was even higher among women, Black adults and those with genetic risk.</p>
<p>Now, a great deal of attention is being paid to <a href="https://www.nia.nih.gov/health/brain-health/cognitive-health-and-older-adults">how to stave off cognitive decline</a> in the aging American population. But what is often missing from this conversation is <a href="https://www.psypost.org/scientists-uncover-an-unsettling-effect-of-chronic-social-stress-on-brain-cells/">the role that chronic stress</a> can play in how well people age from a cognitive standpoint, as well as their risk for dementia.</p>
<p>We are professors at Penn State in the <a href="https://healthyaging.psu.edu/">Center for Healthy Aging</a>, with expertise in <a href="https://scholar.google.com/citations?hl=en&user=wnP42OIAAAAJ&view_op=list_works">health psychology</a> and <a href="https://scholar.google.com/citations?hl=en&user=00f7MKUAAAAJ&view_op=list_works">neuropsychology</a>. We study the pathways by which chronic psychological stress influences the risk of dementia and how it influences the ability to stay healthy as people age.</p>
<p>Recent research shows that Americans who are currently middle-aged or older report experiencing <a href="https://doi.org/10.1037/amp0000597">more frequent stressful events</a> than previous generations. A key driver behind this increase appears to be rising economic and job insecurity, especially in the wake of the 2007-2009 <a href="https://doi.org/10.1016/j.jvb.2018.05.001">Great Recession</a> and ongoing shifts in the labor market. Many people stay in the workforce longer due to financial necessity, as Americans are living longer and face <a href="https://www.ncbi.nlm.nih.gov/books/NBK588545/">greater challenges covering basic expenses in later life</a>.</p>
<p>Therefore, it may be more important than ever to understand the pathways by which stress influences cognitive aging.</p>
<h2>Social isolation and stress</h2>
<p>Although everyone experiences some stress in daily life, some people experience stress that is more intense, persistent or prolonged. It is this <a href="https://doi.org/10.1111/spc3.12020">relatively chronic stress</a> that is most consistently linked with poorer health.</p>
<p>In a recent review paper, our team summarized how chronic stress is a hidden but <a href="https://doi.org/10.1177/23727322241303761">powerful factor underlying cognitive aging</a>, or the speed at which your cognitive performance slows down with age.</p>
<p>It is hard to overstate the impact of stress on your cognitive health as you age. This is in part because your psychological, behavioral and biological responses to everyday stressful events <a href="https://doi.org/10.1016/j.yfrne.2018.03.001">are closely intertwined</a>, and each can amplify and interact with the other.</p>
<p>For instance, living alone can be stressful – <a href="http://doi.org/10.1016/j.archger.2019.02.007">particularly for older adults</a> – and being isolated makes it <a href="https://doi.org/10.3233/adr-230011">more difficult to live a healthy lifestyle</a>, as well as to <a href="https://doi.org/10.1093/geront/gnac032">detect and get help for signs of cognitive decline</a>.</p>
<p>Moreover, stressful experiences – and your reactions to them – can <a href="https://doi.org/10.1037/pag0000704">make it harder to sleep well</a> and to engage in other healthy behaviors, <a href="https://doi.org/10.1007/s40279-013-0090-5">like getting enough exercise</a> and <a href="https://doi.org/10.1080/17437199.2021.1923406">maintaining a healthy diet</a>. In turn, insufficient sleep and a <a href="https://doi.org/10.1037/hea0000532">lack of physical activity</a> can make it <a href="https://doi.org/10.1037/hea0001033">harder to cope with stressful experiences</a>.</p>
<h2>Stress is often missing from dementia prevention efforts</h2>
<p>A robust body of research <a href="https://theconversation.com/dementia-risk-factors-identified-in-global-report-are-all-preventable-addressing-them-could-reduce-dementia-rates-by-45-236290">highlights the importance</a> of <a href="https://doi.org/10.1016/S0140-6736(24)01296-0">at least 14 different factors</a> that relate to your <a href="https://doi.org/10.1002/alz.13809">risk of Alzheimer’s disease</a>, a common and devastating form of dementia, as well as other forms of dementia. Although some of these factors may be outside of your control, such as diabetes or depression, many involve things that people do, such as physical activity, healthy eating and social engagement.</p>
<p>What is less well-recognized is that chronic stress is intimately <a href="https://doi.org/10.1177/23727322241303761">interwoven with all of these factors</a> that relate to dementia risk. Our work and research by others that we reviewed in our recent paper demonstrate that chronic stress can affect brain function and physiology, influence mood and make it harder to maintain healthy habits. Yet, dementia prevention efforts rarely address stress.</p>
<p>Avoiding stressful events and difficult life circumstances is typically not an option.</p>
<p>Where and how you live and work plays a major role in how much stress you experience. For example, people with lower incomes, less education or <a href="https://theconversation.com/poor-neighborhoods-health-care-barriers-are-factors-for-heart-disease-risk-in-black-mothers-250591">those living in disadvantaged neighborhoods</a> often face more frequent stress and have fewer forms of support – such as nearby clinics, access to healthy food, reliable transportation or safe places to exercise or socialize – to help them manage the challenges of aging. As shown in recent work on <a href="https://doi.org/10.1007/s13670-023-00400-9">brain health in rural and underserved communities</a>, these conditions can shape whether people have the chance to stay healthy as they age.</p>
<p>Over time, the effects of stress tend to <a href="https://doi.org/10.3389/fsci.2024.1389481">build up</a>, wearing down the body’s systems and shaping long-term emotional and social habits.</p>
<h2>Lifestyle changes to manage stress and lessen dementia risk</h2>
<p>The good news is that there are multiple things that can be done to slow or prevent dementia, and our review suggests that these can be enhanced if the role of stress is better understood.</p>
<p>Whether you are a young, midlife or older adult, it is never too early or too late to address the effects of stress on brain health and aging. Here are a few direct actions you can take to help manage your stress levels:</p>
<ul>
<li>Follow lifestyle behaviors that can <a href="https://order.nia.nih.gov/sites/default/files/2025-02/reducing-your-risk-dementia-tip-sheet.pdf">improve healthy aging</a>. These include: <a href="https://doi.org/10.1016/j.metop.2023.100257">following a</a> <a href="https://www.nia.nih.gov/health/alzheimers-and-dementia/what-do-we-know-about-diet-and-prevention-alzheimers-disease">healthy diet</a>, engaging in <a href="https://order.nia.nih.gov/sites/default/files/2025-02/reducing-your-risk-dementia-tip-sheet.pdf">physical activity</a> and getting <a href="https://www.nia.nih.gov/health/sleep/sleep-and-older-adults">enough sleep</a>. Even small changes in these domains can make a big difference.</li>
<li>Prioritize your mental health and well-being to the extent you can. Things as simple as talking about your worries, asking for support from friends and family and going outside regularly can be immensely valuable.</li>
<li>If your doctor says that you or someone you care about should follow a new health care regimen, or suggests there are signs of cognitive impairment, ask them what support or advice they have for managing related stress.</li>
<li>If you or a loved one feel socially isolated, consider how small shifts could make a difference. For instance, research suggests that adding <a href="https://doi.org/10.1093/geronb/gbae134">just one extra interaction a day</a> – even if it’s <a href="https://doi.org/10.1093/geronb/gbac083">a text message or a brief phone call</a> – can be helpful, and that even interactions with people you don’t know well, such as at a coffee shop or doctor’s office, can have meaningful benefits.</li>
</ul>
<h2>Walkable neighborhoods, lifelong learning</h2>
<p>A 2025 study identified stress as one of 17 overlapping factors that affect the <a href="https://doi.org/10.1136/jnnp-2024-334925">odds of developing any brain disease</a>, including stroke, late-life depression and dementia. This work suggests that addressing stress and overlapping issues such as loneliness may have additional health benefits as well.</p>
<p>However, not all individuals or families are able to make big changes on their own. Research suggests that community-level and workplace interventions can reduce the risk of dementia. For example, <a href="https://doi.org/10.1007/s00420-023-01973-w">safe and walkable neighborhoods</a> and opportunities for social connection and lifelong learning – such as through community classes and events – have the potential to <a href="https://doi.org/10.1016/j.pec.2024.108254">reduce stress and promote brain health</a>.</p>
<p>Importantly, researchers have estimated that even a modest delay in disease onset of Alzheimer’s would save <a href="https://doi.org/10.1515/fhep-2014-0013">hundreds of thousands of dollars for every American</a> affected. Thus, providing incentives to companies who offer stress management resources could ultimately save money as well as help people age more healthfully.</p>
<p>In addition, stress related to the <a href="https://doi.org/10.3233/JAD-200932">stigma around mental health and aging</a> can discourage people from seeking support that would benefit them. Even just thinking about your risk of dementia can be stressful in itself. Things can be done about this, too. For instance, <a href="https://doi.org/10.1016/S0140-6736(23)01406-X">normalizing the use of hearing aids</a> and <a href="https://doi.org/10.1186/s12877-023-04053-3">integrating reports of perceived memory and mental health issues</a> into routine primary care and workplace wellness programs could encourage people to engage with preventive services earlier.</p>
<p>Although research on potential biomedical treatments is ongoing and important, there is currently no cure for Alzheimer’s disease. However, if interventions aimed at reducing stress were prioritized in <a href="https://link.springer.com/article/10.1007/s11121-022-01385-1">guidelines for dementia prevention</a>, the benefits could be far-reaching, resulting in both delayed disease onset and improved quality of life for millions of people.</p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/chronic-stress-contributes-to-cognitive-decline-and-dementia-risk-2-healthy-aging-experts-explain-what-you-can-do-about-it-250583">original article</a>.</em></p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/psychopathic-tendencies-may-be-associated-with-specific-hormonal-patterns/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Psychopathic tendencies may be associated with specific hormonal patterns</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 10th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://psycnet.apa.org/record/2026-05252-001" target="_blank">Psychology & Neuroscience</a></em> suggests that hormone levels in the body may be connected to psychopathic personality traits, particularly during adolescence and early adulthood. The research team conducted a systematic review and meta-analysis of studies examining endocrine system hormones like cortisol, testosterone, and oxytocin, uncovering a number of small but statistically significant associations. The strongest finding was a positive relationship between psychopathy and baseline cortisol levels.</p>
<p>Psychopathy is a personality construct associated with traits like emotional coldness, lack of empathy or guilt, impulsive behavior, and manipulativeness. Psychopathic traits can emerge early in life and, in some individuals, become more pronounced over time. These traits are typically divided into two dimensions. The first dimension relates to interpersonal and emotional features such as charm, callousness, and a lack of remorse. The second includes more behavioral elements like impulsivity and antisocial tendencies.</p>
<p>Researchers have long debated the biological basis of psychopathy. One theory, called the Low-Fear Hypothesis, suggests that people with psychopathic traits are less responsive to fear and stress, which might explain their inability to conform to social rules or respond emotionally to the suffering of others. This has led scientists to look at the body’s stress response systems—including hormone regulation—as a possible source of insight.</p>
<p>The new study aimed to bring clarity to the growing but inconsistent literature on this topic. Led by researchers from Portugal, the team reviewed 26 empirical studies published between 1998 and 2023. These studies explored how psychopathy might be related to specific hormones produced by the endocrine system, including cortisol (a hormone involved in stress), testosterone (related to aggression and dominance), dehydroepiandrosterone or DHEA (another stress-related hormone), estradiol (a form of estrogen), and oxytocin (which affects social bonding and empathy).</p>
<p>The researchers also performed a meta-analysis, which allows scientists to combine statistical results across multiple studies to get a more reliable estimate of an overall effect. They focused this analysis specifically on the relationship between basal (or resting) cortisol levels and psychopathy. Seven studies provided enough data to be included in this part of the research.</p>
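To make the pooling step concrete, here is a minimal sketch (not the authors' code) of one standard way to combine correlation coefficients across studies: a fixed-effect meta-analysis using Fisher's z-transform, weighting each study by its sample size. The seven (correlation, sample size) pairs below are hypothetical, not values from the paper.

```python
import math

def pooled_correlation(results):
    """Fixed-effect pooling of correlations via Fisher's z-transform.
    `results` is a list of (r, n) pairs; each study is weighted by n - 3,
    the inverse variance of its z-transformed correlation."""
    num = sum((n - 3) * math.atanh(r) for r, n in results)
    den = sum(n - 3 for _, n in results)
    pooled_z = num / den
    return math.tanh(pooled_z)  # transform back to the correlation scale

# Hypothetical (correlation, sample size) pairs from seven studies
studies = [(0.15, 80), (0.22, 120), (0.05, 60), (0.30, 45),
           (0.18, 200), (-0.02, 70), (0.12, 150)]
print(round(pooled_correlation(studies), 3))
```

Larger studies pull the pooled estimate toward their own result, which is why a single small study with a big effect rarely dominates a meta-analytic summary.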
<p>The meta-analysis found a statistically significant relationship between overall psychopathy and higher basal cortisol levels. A closer look at the different dimensions of psychopathy showed that this association was driven primarily by the second dimension—the one related to impulsive and antisocial behavior. No significant relationship was found between baseline cortisol and the first dimension of psychopathy, which involves emotional detachment and callousness.</p>
<p>These findings are notable because they partially contradict expectations from previous theories. Under the Low-Fear Hypothesis, one might expect individuals with psychopathic traits to show lower cortisol levels in general, reflecting a weaker stress response. Yet, this review found the opposite in many cases. People with higher psychopathy scores—particularly in the behavioral domain—tended to have higher resting cortisol levels.</p>
<p>Still, the results were far from uniform across studies. Some papers reported lower cortisol levels in individuals with high psychopathy or callous-unemotional traits, especially in young males. Others found no significant associations at all. Some studies observed differences in cortisol reactivity—how cortisol levels change in response to stress—rather than in resting levels. For example, in one study, individuals with psychopathic traits showed smaller changes in cortisol levels when exposed to a stressor.</p>
<p>In addition to cortisol, the researchers examined evidence linking psychopathy to other hormones. Several studies suggested that testosterone may be positively related to psychopathic traits, especially the behavioral dimension. For instance, higher testosterone levels were linked to increased impulsivity, aggression, and a reduced sensitivity to social punishment. However, not all studies supported this connection, and some found no meaningful relationship between testosterone and psychopathy at all.</p>
<p>The review also found mixed evidence regarding DHEA, a hormone that is often produced in response to stress. In some studies, higher DHEA levels were associated with more pronounced psychopathic traits, especially in adolescents. Other studies, however, found no such relationship. Similar inconsistency was seen in studies examining estradiol and oxytocin. Oxytocin, which plays a role in empathy and social bonding, was often lower in people with high levels of callous-unemotional traits, but again, not consistently across all studies.</p>
<p>The authors noted that differences in sample characteristics may have contributed to the variability in findings. Many studies were conducted in forensic settings using male participants, while others used community samples that included women. Psychopathy was measured in a variety of ways, including self-report questionnaires and clinician-administered interviews, which can yield different results. Hormone levels were also assessed using different protocols, and the timing of hormone collection can significantly influence the results—particularly for cortisol, which fluctuates throughout the day.</p>
<p>Given these inconsistencies, the researchers emphasized the importance of considering a range of contextual and biological factors in future work. These include age, sex, pubertal status, and co-occurring behavioral problems like conduct disorder. They also pointed out the need for more research into hormones that were underrepresented in the current review, such as adrenaline and noradrenaline. These hormones play important roles in the body’s acute stress response but were excluded from the analysis due to poor study quality.</p>
<p>Another limitation is that only seven of the 26 studies were suitable for inclusion in the meta-analysis. This small number limited the researchers’ ability to test whether factors like age or sex might change the strength of the relationship between cortisol and psychopathy. It also made it difficult to assess publication bias and study heterogeneity, which can influence the reliability of a meta-analysis.</p>
<p>Despite these limitations, the study offers an important step toward understanding psychopathy through the lens of neuroendocrinology. By identifying hormonal signatures that may be linked to specific dimensions of psychopathy, researchers hope to improve early detection and intervention efforts—particularly for adolescents who may be at risk of developing more severe forms of the disorder later in life.</p>
<p>The findings also raise broader questions about the role of biology in shaping personality and behavior. While hormones alone cannot explain psychopathy, they may interact with social and psychological factors in complex ways. Integrating hormonal data into existing psychological theories of psychopathy could offer a more complete picture of how these traits develop and persist over time.</p>
<p>In the long term, this line of research could inform more personalized treatment approaches. If certain hormonal patterns are found to be consistently associated with specific psychopathic traits, clinicians might one day be able to use hormone testing to support diagnosis, monitor treatment progress, or even develop biologically informed interventions. However, the study’s authors caution that such applications remain speculative for now and must be grounded in rigorous, future research.</p>
<p>The study, “<a href="https://psycnet.apa.org/doi/10.1037/pne0000359" target="_blank">Psychopathy and Hormonal Biomarkers: A Systematic Review and Meta-Analysis</a>,” was authored by Catarina Braz Ferreira, Patrícia Figueiredo, Eduarda Ramião, Sofia Silva, and Ricardo Barroso.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-use-deep-learning-to-uncover-hidden-motor-signs-of-neurodivergence/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists use deep learning to uncover hidden motor signs of neurodivergence</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 10th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://www.nature.com/articles/s41598-025-04294-9" target="_blank">Scientific Reports</a></em> has introduced a promising diagnostic tool that could dramatically shorten the long wait times many families face when seeking evaluations for autism and attention-related conditions. The research team used artificial intelligence to analyze subtle patterns in how people move their hands during simple tasks, identifying with surprising accuracy whether someone is likely to have autism, attention-deficit traits, or both. The method, which relies on wearable motion sensors and deep learning, could one day serve as a rapid and objective screening tool to help clinicians triage children for further assessment.</p>
<p>Autism and attention-deficit disorders are both classified as neurodevelopmental conditions, meaning they affect how the brain grows and functions. Autism spectrum conditions are typically identified by challenges in social interaction, communication differences, and repetitive behaviors. Attention-deficit disorders, which may include hyperactivity, are marked by difficulties with focus, impulsivity, and sustained attention.</p>
<p>While these are distinct diagnoses, they often co-occur in the same individual. In fact, up to 70 percent of people diagnosed with autism also show signs of attention-related difficulties. Despite their prevalence, diagnosing these conditions remains a long and complex process, often involving interviews, questionnaires, and behavioral observations that can take months or even years to schedule.</p>
<p>The researchers, led by <a href="https://medicine.iu.edu/faculty/25528/jose-jorge" target="_blank">Jorge José</a>, sought to develop a new way to detect neurodevelopmental differences using objective data. They focused on a very basic behavior: the motion of reaching for something. Although such movement may seem simple, <a href="https://www.psypost.org/new-study-ai-can-identify-autism-from-tiny-hand-motion-patterns/" target="_blank">research has shown</a> that subtle patterns in how someone moves their body can reveal a great deal about how their brain processes information.</p>
<p>Prior studies have suggested that children with autism or attention problems often show differences in motor planning and coordination, even during early development. These insights led the researchers to ask whether high-resolution motion tracking, paired with machine learning, could detect patterns unique to different neurodevelopmental conditions.</p>
<p>“I was trained as a theoretical physicist. But during the last 20 years I have been working in quantitative neuroscience. I now have a lab. We started by studying the visual cortex in non-human primates. They performed the simple process of reaching for targets appearing randomly on a computer touch screen. This is the same protocol we have used in our recent paper,” explained José, the James H. Rudy Distinguished Professor of Physics at Indiana University.</p>
<p>“In <a href="https://doi.org/10.3389/fnint.2013.00032" target="_blank">our 2013 autism paper</a>, we found that by looking at natural movements at millisecond time scales, away from naked-eye detection, each participant’s movements were totally different, with autism being more random than neurotypical. This led us later in 2018, <a href="https://www.nature.com/articles/s41598-017-18902-w" target="_blank">in another paper</a>, to study in more detail the characteristics of ASD participants, uncovering a quantitative biomarker that correlates directly with the qualitative diagnostic tools used by providers.”</p>
<p>“In our present paper, we have extended the participant pool to include those with ASD, ADHD, comorbid ADHD+ASD, and neurotypical controls,” José said. “Importantly, we developed deep learning techniques that allow for the diagnosis of new participants with high accuracy and in just a few minutes. By looking at their movements at such fine time scales, we can also assess the degree of severity quantitatively.”</p>
<p>For their new study, the researchers recruited 92 participants, including individuals with autism, attention-deficit traits, both conditions, and a comparison group with no diagnosis. The participants ranged in age from children to young adults. Seventeen additional individuals were excluded from the final analysis due to problems completing the task, motor impairments unrelated to the conditions under study, or technical issues with the sensors. </p>
<p>All participants were able to perform a basic reaching task that required them to touch a target on a screen, withdraw their hand, and repeat the motion about 100 times. The motion of the dominant hand was tracked using a high-definition sensor placed in a glove, which recorded data at millisecond resolution.</p>
<p>The study used two complementary approaches to analyze the data. First, the researchers applied a deep learning technique to the raw motion data. The artificial neural network, trained on thousands of movement trials, learned to classify each participant as either autistic, having attention-deficit traits, both, or neither. The network architecture relied on a type of model well-suited for time-based sequences, called Long Short-Term Memory cells, which can capture both short and long-range patterns in data. </p>
<p>The researchers carefully trained and validated the model using cross-validation and tested its performance on data it had not seen before. When multiple types of movement data were combined—such as the hand’s angle, speed, and acceleration—the model achieved diagnostic accuracy around 70 percent.</p>
<p>José was surprised by “the fact that deep learning could detect inherent cognitive information properties about humans with high accuracy.”</p>
<p>In addition to classification accuracy, the researchers looked at another standard machine learning measure known as the Area Under the Receiver Operating Characteristic Curve, or AUC. This metric assesses how well a model can distinguish between categories. The model performed particularly well in identifying neurotypical individuals, with AUC scores as high as 0.95. Distinguishing between autism and attention-deficit traits proved more challenging, especially for individuals with both conditions, which is a known difficulty even in clinical settings.</p>
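The AUC has a simple probabilistic reading: it is the chance that a randomly chosen member of one class receives a higher classifier score than a randomly chosen member of the other. The sketch below (illustrative only, with made-up scores, not the study's model outputs) computes it directly from that definition.

```python
def auc(pos_scores, neg_scores):
    """Probability that a random positive example scores higher than a
    random negative example, counting ties as half -- the ROC AUC."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos_scores
        for n in neg_scores
    )
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical scores: higher should mean "more likely neurotypical"
pos = [0.9, 0.8, 0.75, 0.6]
neg = [0.7, 0.5, 0.4, 0.3]
print(auc(pos, neg))  # 0.5 = chance-level separation, 1.0 = perfect
```

An AUC of 0.95, as reported for identifying neurotypical participants, means the model ranks a neurotypical individual above a neurodivergent one 95 percent of the time.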
<p>To better understand the underlying movement differences, the team also conducted a second analysis focused on the statistical properties of the motion data. After filtering out electronic noise from the sensor signals, they measured how much randomness or variability was present in each person’s movement. </p>
<p>Two key statistical tools were used: the Fano Factor, which assesses variability relative to the mean, and Shannon Entropy, which measures the unpredictability in a signal. These measures provided a numerical fingerprint of each participant’s movement style, and higher levels of randomness tended to align with greater symptom severity as rated by clinicians.</p>
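Both measures have short closed forms. The sketch below (a minimal illustration, not the paper's analysis pipeline) computes each on two hypothetical binned movement signals: the more regular signal yields a lower Fano factor and lower entropy than the erratic one.

```python
import math
from collections import Counter

def fano_factor(counts):
    """Variance-to-mean ratio of a signal; higher values mean more
    variability relative to the average level."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean

def shannon_entropy(symbols):
    """Entropy (in bits) of the empirical symbol distribution; higher
    values mean the signal is less predictable."""
    n = len(symbols)
    return -sum((k / n) * math.log2(k / n) for k in Counter(symbols).values())

# Hypothetical binned speed samples from two participants
regular = [5, 5, 5, 5, 5, 5]     # perfectly steady movement
erratic = [1, 9, 2, 8, 0, 10]    # highly variable movement

print(fano_factor(regular), shannon_entropy(regular))   # both 0.0
print(fano_factor(erratic), shannon_entropy(erratic))   # both larger
```

In this framing, "greater symptom severity aligned with more randomness" means participants whose signals looked more like the second series tended to receive higher clinician severity ratings.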
<p>The researchers found that individuals diagnosed with both autism and attention-deficit traits often showed intermediate levels of movement variability, overlapping with both neurotypical participants and those with only one diagnosis. Those with more severe autism symptoms had the most distinctive motion patterns, which matched well with previous research linking motor differences to the condition. </p>
<p>Interestingly, the biometrics were stable across the session, meaning that just 30 to 60 trials were often enough to get reliable measurements. This suggests that the method could be feasible for real-world screening, without requiring extensive testing.</p>
<p>The new study provides evidence “that the inherently random nature of human movements, when looked at times imperceptible to human eyes, contain important information about their cognitive abilities,” José told PsyPost.</p>
<p>While the study’s results are promising, the researchers acknowledge some limitations. The sample size, though larger than many previous studies of its kind, was still relatively small. In particular, the groups with attention-deficit traits alone were not as well represented, which could influence how well the model generalizes to broader populations. </p>
<p>“These are preliminary results that need to be further validated in much larger groups of neurodivergent participants,” José noted.</p>
<p>The study also did not control for whether participants were taking medications, such as stimulants or other treatments, which could affect movement. Additionally, deep learning models can be difficult to interpret, and while the researchers took steps to visualize the importance of different input features, the decision-making process of the algorithm remains something of a black box.</p>
<p>Despite these limitations, the study offers a promising proof-of-concept for a new direction in mental health diagnostics. By using affordable, wearable sensors and machine learning, clinicians may one day be able to get early indications of neurodevelopmental differences without waiting for a full psychiatric evaluation. </p>
<p>Such tools could be especially useful in rural areas or regions with limited access to specialists. The researchers are hopeful that with larger datasets, their system can be refined and extended to track changes over time, such as how motion patterns respond to treatment or development.</p>
<p>Looking ahead, the team plans to expand the approach to study how these motion-based metrics evolve with age or medication use. The ultimate goal is not to replace traditional diagnosis, but to provide clinicians with an additional tool to support early detection and personalized care.</p>
<p>“We hope that our protocols can become another tool that can be used by providers to have early assessments about the condition of neurodivergent participants, which is in great need today,” José said.</p>
<p>The study, “<a href="https://www.nature.com/articles/s41598-025-04294-9" target="_blank">Deep learning diagnosis plus kinematic severity assessments of neurodivergent disorders</a>,” was authored by Khoshrav P. Doctor, Chaundy McKeever, Di Wu, Aditya Phadnis, Martin H. Plawecki, John I. Nurnberger Jr., and Jorge V. José.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/study-finds-anxious-mondays-linked-to-long-term-stress-and-heart-health-risks-in-older-adults/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Study finds “Anxious Mondays” linked to long-term stress and heart health risks in older adults</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jul 10th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research has uncovered a connection between how older adults feel on Mondays and their long-term biological stress levels. The study found that people aged 50 and over who reported feeling anxious on a Monday had significantly higher levels of stress-related hormones in their bodies up to two months later. This effect was not limited to those still in the workforce, suggesting that Monday-related anxiety may be deeply ingrained and could contribute to long-term health problems, including cardiovascular disease.</p>
<p>The study was led by <a href="https://sociology.hku.hk/people/tarani-chandola" target="_blank">Tarani Chandola</a> from the University of Hong Kong and published in the <em><a href="https://www.sciencedirect.com/science/article/pii/S0165032725010535" target="_blank">Journal of Affective Disorders</a></em>. It was the first of its kind to show that the psychological experience of Monday anxiety has a measurable effect on the body’s stress regulation system over time. These findings offer insight into why heart attacks and other cardiovascular events are more common on Mondays, a phenomenon that has puzzled scientists for decades.</p>
<p>The research team set out to explore why Mondays appear to be linked to worse health outcomes. Previous studies have shown a rise in heart attacks, strokes, and even suicides at the beginning of the week. Some researchers have proposed that the transition from the weekend to the start of the workweek triggers stress and anxiety, which could strain the body’s systems. However, until now, there was limited evidence connecting this idea to biological changes in stress hormones that last beyond the immediate moment.</p>
<p>“I have been working on the topic of cortisol responses to work related stressors for a long time and there is a well-established diurnal rhythm to our cortisol stress response. I wanted to see whether there was a similar pattern to our weekday/weekend stressors,” said Chandola, a professor of medical sociology.</p>
<p>To investigate this possibility, the researchers used data from the English Longitudinal Study of Ageing, a large, ongoing project that collects health and lifestyle information from adults aged 50 and older in England. The sample included over 3,500 individuals who had completed both psychological surveys and provided biological samples.</p>
<p>Participants were asked how anxious they felt the day before, rating their experience on a scale from 0 (not at all anxious) to 10 (very anxious). They also indicated which day of the week they were referring to. For the purposes of analysis, the researchers grouped responses into “low” (scores 0–3) and “high” (scores 4–10) anxiety.</p>
<p>In a separate part of the study, trained nurses collected small samples of hair from each participant’s scalp. These samples, measuring about 2 to 3 centimeters, reflect hormone levels accumulated over the previous two to three months. This method allowed the researchers to measure long-term levels of cortisol and cortisone—two hormones produced by the body in response to stress.</p>
<p>Cortisol is often called the “stress hormone” because it plays a major role in the body’s response to psychological pressure. When stress becomes chronic, the body can either produce too much or too little of it, leading to problems in heart health, metabolism, and immune function. By looking at hormone levels in hair, rather than blood or saliva, the researchers were able to get a more stable picture of biological stress over time, rather than just at one moment.</p>
<p>The study found a clear pattern. Older adults who reported feeling anxious on a Monday had significantly higher levels of cortisol in their hair samples than those who reported feeling anxious on other days of the week. This difference was most pronounced among those with the highest cortisol levels. For people in the top 10 percent of the cortisol distribution, those who felt anxious on Mondays had hormone levels that were about 23 percent higher than their peers who felt anxious on other days.</p>
<p>This pattern held true even after the researchers accounted for a wide range of other factors that could influence stress hormone levels, including age, sex, education, relationship status, smoking, hair type, season, and whether the participant was still employed. They also examined whether people who were still working were more affected by Monday anxiety than retirees, but they found no meaningful difference between the two groups. This suggests that Monday-related stress may not be solely about returning to a job, but may reflect a broader psychological association with the start of the week.</p>
<p>“I thought that the retired people would no longer show this pattern of higher cortisol levels correlated with feelings of anxiety on Mondays. The fact that this persisted suggests that the biological consequences of feelings of anxiety on Monday over the life-course do not go away when people retire.”</p>
<p>To better understand the results, the researchers used a statistical method called decomposition analysis. This allowed them to separate the part of the Monday effect that could be explained by known variables—such as higher reported anxiety levels on Mondays—from the part that remained unexplained. They found that about three-quarters of the difference in cortisol levels could not be explained by the measured variables, meaning there may be other factors at play, such as individual sensitivity to weekday routines or long-standing emotional patterns tied to Mondays.</p>
<p>Notably, the link between anxiety and higher cortisol levels was not seen on any other day of the week. People who felt anxious on, say, a Wednesday or Friday did not show the same long-term increase in stress hormone levels as those who felt anxious on a Monday. This highlights the unique psychological weight that Mondays seem to carry for many people, regardless of their day-to-day responsibilities.</p>
<p>“Feeling anxious on Mondays is correlated with long term biological stress responses. The good news is that those feelings of anxiety are not correlated with stress responses on other days of the week, particularly the weekends. This suggests the importance of weekends for rest and recuperation.”</p>
<p>But there are some limitations to consider. The study was observational, which means it cannot prove that feeling anxious on Mondays <em>causes</em> higher stress hormone levels. The data relied on a single measure of anxiety and a single collection of hair samples, so changes in hormone levels over time within the same individual were not tracked. There is also some uncertainty around the exact time period captured by each hair sample, as hair growth rates can vary slightly between people.</p>
<p>Despite these caveats, the researchers argue that the results point to a biological basis for what many people casually refer to as the “Monday blues.” The fact that this effect shows up in long-term hormone patterns suggests it is not just a passing mood, but something that may have lasting implications for health.</p>
<p>The study contributes to a larger body of research on how psychological stress interacts with the body’s systems. It also raises new questions about why some people are more affected by the start of the week than others, and why some never seem to adapt to it—even after leaving the workforce.</p>
<p>Future research will explore what makes certain people resilient to Monday-related anxiety. The researchers hope to investigate what types of routines, personality traits, or coping strategies might protect people from these stress patterns. Understanding these factors could lead to interventions that help people better manage anxiety, particularly at vulnerable times like the beginning of the week.</p>
<p>“Feelings of anxiety could be tied in with stressors related to work or school at the start of the week. I would like to examine what makes some people not feel anxious at the start of the week, as well as the factors that enable some people to be resilient to Monday stress. This research is part of a long line of research on the physiological consequences of work-related stress. It is important to remember that stress is not just a feeling or emotion, but has biological and physiological consequences.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.jad.2025.119611" target="_blank">Are anxious Mondays associated with HPA-axis dysregulation? A longitudinal study of older adults in England</a>,” was authored by Tarani Chandola, Wanying Ling, and Patrick Rouxel.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>