<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/even-in-secular-denmark-supernatural-beliefs-remain-surprisingly-common-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Even in secular Denmark, supernatural beliefs remain surprisingly common, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 28th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Despite Denmark’s reputation as one of the most secular and scientifically literate countries in the world, new research has found that a surprising number of Danes continue to believe in supernatural phenomena. The findings, published in <a href="https://psycnet.apa.org/record/2026-42413-001?doi=1" target="_blank">Evolutionary Behavioral Sciences</a>, suggest that the tendency to believe in spiritual forces, psychic essences, and magical thinking remains widespread even in a highly educated and irreligious society.</p>
<p>“I’m interested in how the mind works and how various types of cognitive bias can sometimes be advantageous and sometimes make us see patterns that aren’t real and to see causal agents where there are none,” explained study author Ken Ramshøj Christensen, an associate professor at Aarhus University.</p>
<p>“Inspired by earlier, primarily American, surveys on supernatural belief that showed strikingly high levels of isolated supernatural belief, we wanted to find out how widespread different types of supernatural belief are in Denmark, which is otherwise a highly secular country, and to look for potential correlations between different types of belief, i.e. patterns, which most other surveys haven’t done.”</p>
<p>The research team surveyed 2,204 adults from across Denmark using an anonymous online questionnaire that circulated through Facebook, Twitter, and online forums. The survey ran from November 2017 to February 2018 and asked participants to respond to 39 belief-related items using a five-point scale ranging from strong disbelief to strong belief.</p>
<p>Participants indicated their beliefs in a variety of domains: spirituality, supernatural forces, aliens, conspiracy theories, and science. The questions touched on topics such as telepathy, crystal healing, reincarnation, astrology, vaccines, angels, ghosts, and scientific explanations of the universe. The researchers also gathered demographic data, including biological sex, age, education level, and religious identification.</p>
<p>Although the sample was not randomly selected, it was demographically broad and aligned reasonably well with the national population in terms of age and education. The researchers then conducted statistical analyses to examine how belief levels varied across different groups and whether the beliefs clustered in meaningful ways.</p>
<p>The results revealed that supernatural beliefs remain widespread in Denmark, though their prevalence varies by topic. While only a tiny fraction of participants endorsed belief in vampires (0.3%) or zombies (0.6%), belief in a “psychic essence” or soul was endorsed by over 30% of respondents. Roughly 12% said they believed in angels, while around 14% reported belief in astrology. About 25% believed in dream interpretation as a meaningful way to understand the subconscious.</p>
<p>“I found it a bit surprising that 12.3% said they believe in angels, and 10.1% weren’t sure; that’s almost 1 in 5 in total,” Christensen told PsyPost. “For ghosts, it’s almost 1 in 4. It’s also interesting that while astrology is popular, only a few say they actually believe in it.”</p>
<p>The researchers found that these beliefs were not randomly distributed but tended to fall into coherent clusters. For example, belief in angels often co-occurred with belief in ghosts, reincarnation, and psychic energy, forming what the researchers described as a “spirituality” cluster. Beliefs in clairvoyance, magic, tarot, and crystal healing formed another distinct group, which the researchers called “magical thinking.” Beliefs in alien visitors and vaccine-related conspiracy theories were also correlated and formed part of what the authors labeled “openness to alternative explanations.”</p>
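<p>To make the clustering idea concrete, here is a minimal Python sketch (not the authors’ code; the item names, data, and method are illustrative assumptions) showing how belief items can be grouped by hierarchically clustering their pairwise correlations:</p>
<pre style="font:12px/16px monospace; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto;"><code># Illustrative sketch: recover belief clusters from item correlations.
# All data are simulated; the real survey used 39 items on a 5-point scale.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n = 500
spirit = rng.normal(size=n)    # latent "spirituality" tendency
magic = rng.normal(size=n)     # latent "magical thinking" tendency
responses = pd.DataFrame({
    "angels":          spirit + rng.normal(scale=0.7, size=n),
    "ghosts":          spirit + rng.normal(scale=0.7, size=n),
    "reincarnation":   spirit + rng.normal(scale=0.7, size=n),
    "tarot":           magic + rng.normal(scale=0.7, size=n),
    "crystal_healing": magic + rng.normal(scale=0.7, size=n),
})

# Turn item correlations into distances, then cluster hierarchically.
corr = responses.corr().to_numpy()
iu = np.triu_indices_from(corr, k=1)
dist = 1.0 - corr[iu]  # condensed distance vector for scipy
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(dict(zip(responses.columns, labels)))  # items grouped into 2 clusters
</code></pre>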
<p>In contrast, belief in scientific explanations—such as human evolution by natural selection or the usefulness of science for explaining mental and emotional life—tended to be negatively correlated with supernatural and magical beliefs. In other words, the more someone believed in scientific reasoning, the less likely they were to believe in paranormal forces, and vice versa.</p>
<p>The researchers also examined how demographic characteristics were linked to belief patterns. One of the strongest predictors was sex. Women were significantly more likely than men to report belief in supernatural ideas across nearly every category, including spirituality, magical thinking, and paranormal phenomena. Men, on the other hand, scored higher in belief in science.</p>
<p>This sex difference might relate to average differences in cognitive style, according to the researchers. For instance, women tend to score higher on empathizing traits—such as understanding others’ emotions—while men tend to score higher on systemizing traits, such as analyzing rules and patterns. These tendencies may affect how people interpret ambiguous events or engage with belief systems.</p>
<p>Education level also played a role. Participants with more years of education tended to report slightly lower levels of supernatural belief, especially in spirituality, conspiracy theories, and magical thinking. However, this effect was small. The researchers noted that while education may offer tools for critical thinking, it does not entirely override cognitive biases or intuitive reasoning.</p>
<p>Religiosity was another strong predictor. People who identified as religious—whether Christian or another tradition—were significantly more likely to endorse supernatural beliefs. Those with no religious affiliation reported the lowest levels of belief in paranormal and spiritual phenomena and the highest levels of belief in science. Still, the boundaries were not absolute. Some nonreligious individuals endorsed belief in concepts like psychic energy or angels, suggesting that supernatural beliefs are not confined to formal religious systems.</p>
<p>Age, by contrast, was not a significant factor. The researchers found no meaningful relationship between a participant’s age and their tendency to believe in supernatural ideas or scientific explanations.</p>
<p>The findings offer insights into how cultural and biological factors work together to influence belief. Even in a society like Denmark—known for its secular values and widespread education—supernatural ideas remain embedded in the population.</p>
<p>“Even though almost two thirds of our participants said they didn’t have any religious belief, there was a fair amount of belief in spirituality and the supernatural, including spirits, ghosts, angels, and telepathy,” Christensen explained. “And length of education only has a tiny effect on supernatural belief. Above all, there was a clear sex difference: More women than men are religious, and women are also significantly more likely to believe in the supernatural. All of this, we think, can be explained in terms of evolved predispositions.”</p>
<p>While the study offers a detailed look at supernatural belief in a secular context, there are limitations. The sample was not randomly selected, and certain regions of Denmark were overrepresented, which could introduce bias. The cross-sectional nature of the study also means it cannot capture how beliefs change over time.</p>
<p>“It would be interesting to see whether the belief in various supernatural phenomena is rising, and whether the correlation patterns are changing, which would require replicating our study,” Christensen said.</p>
<p>Still, the researchers argue that the large sample size and the consistency of findings with existing research strengthen the conclusions. Future studies could examine whether supernatural beliefs are increasing or decreasing in different societies, how they respond to major cultural events, or whether education systems can shape these beliefs in lasting ways.</p>
<p>The results also suggest a need to explore the psychological functions that supernatural beliefs might serve, especially when they are not tied to organized religion. Beliefs in spirits, angels, or intuitive healing may offer emotional comfort, community identity, or a sense of control—particularly during uncertainty.</p>
<p>The study, “<a href="https://psycnet.apa.org/doi/10.1037/ebs0000386" target="_blank">Evolved Minds in a Secular World: A Large-Scale Survey of Supernatural Beliefs in Denmark</a>,” was authored by Ken Ramshøj Christensen and Mathias Clasen.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-major-new-study-charts-the-surprising-ways-people-weigh-ais-risks-and-benefits/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A major new study charts the surprising ways people weigh AI’s risks and benefits</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 28th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research provides evidence that people’s evaluations of artificial intelligence are shaped more strongly by perceived usefulness than by fears about safety or harm. The study, published in <em><a href="https://doi.org/10.1016/j.techfore.2025.124304" target="_blank">Technological Forecasting and Social Change</a></em>, sheds light on how the public navigates the trade-offs between expected benefits and potential risks across a wide range of AI applications—from healthcare and transportation to surveillance and warfare.</p>
<p>Rapid advances in artificial intelligence, particularly in tools like large language models, have generated a mix of optimism and anxiety. While these technologies promise to improve efficiency, support medical diagnoses, and enable new forms of creativity, they also raise concerns about surveillance, misinformation, automation, and power imbalances. Public sentiment, in this context, plays a key role in determining how AI is adopted, governed, and developed. If the public sees AI as untrustworthy or irrelevant to daily life, even highly capable systems may struggle to gain acceptance.</p>
<p>Despite the growing literature on AI ethics and trust, most previous research has either examined attitudes toward specific technologies—such as self-driving cars—or asked people for general impressions of AI. Few studies have attempted to map the full terrain of public opinion across a broad range of future possibilities. This study aimed to fill that gap by systematically analyzing how people balance expectations, perceived risks, and potential benefits across a wide array of imagined AI applications.</p>
<p>“We were motivated by both personal interests and the real-world challenges posed by AI’s rapid expansion. Personally, I use AI in many ways: it helps me write more efficiently, creates playful drawing prompts for my toddlers, and accelerates my learning,” said study author <a href="https://www.comm.rwth-aachen.de/cms/comm/der-lehrstuhl/team/~dcrkt/dr-philipp-brauner/?allou=1&lidx=1" target="_blank">Philipp Brauner</a>, a postdoctoral researcher at the Human-Computer Interaction Center and Chair of Communication Science at RWTH Aachen University.</p>
<p>“At the same time, I’m concerned about its downsides: How much should AI know about me? Who regulates it, and how effective is that regulation? Will humans remain in control? These questions are widely discussed not only in academic circles but also in the media and the public sphere.”</p>
<p>“Given the unprecedented speed of AI development, it is difficult even for experts to maintain an overview of AI and its implications. Our goal was to capture how people currently view AI, across a broad range of applications, and to put those perceptions into context.”</p>
<p>For their study, the researchers recruited 1,100 participants from Germany, using an online survey that was designed to reflect the national population in terms of age, gender, and regional background. The median age of participants was 51 years. The researchers presented each participant with a randomized subset of 15 micro-scenarios, drawn from a pool of 71 brief statements describing hypothetical AI developments expected within the next ten years.</p>
<p>These scenarios covered a wide range of domains, including AI-generated art, autonomous warfare, healthcare, criminal justice, and emotional relationships. Participants rated each scenario along several dimensions: how likely they thought it was to occur, how personally risky it seemed, how useful or beneficial they believed it would be, and whether they viewed it as positive or negative overall.</p>
<p>In addition to scenario ratings, the researchers collected demographic data and administered short measures of personality traits, including technology readiness, interpersonal trust, self-efficacy, openness, and familiarity with AI systems.</p>
<p>Statistical models were used to analyze both how individual characteristics shaped AI evaluations and how the scenarios themselves were rated in terms of public sentiment. Risk–benefit maps and expectancy–value plots were also created to help visualize the landscape of AI perception.</p>
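<p>As a rough illustration of what such a risk–benefit map looks like (a sketch with invented scenario names and ratings, not the study’s data), each scenario can be plotted by its mean perceived risk and mean perceived benefit:</p>
<pre style="font:12px/16px monospace; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto;"><code># Illustrative risk-benefit map; scenario names and ratings are invented.
import numpy as np
import matplotlib.pyplot as plt

scenarios = ["health assistant", "elder companionship", "surveillance",
             "autonomous weapons", "AI-generated art"]
mean_risk = np.array([2.1, 2.4, 5.0, 5.4, 3.0])      # hypothetical means
mean_benefit = np.array([4.6, 4.3, 2.2, 1.6, 3.1])   # hypothetical means

fig, ax = plt.subplots()
ax.scatter(mean_risk, mean_benefit)
for name, x, y in zip(scenarios, mean_risk, mean_benefit):
    ax.annotate(name, (x, y), textcoords="offset points", xytext=(4, 4))
ax.axline((0, 0), slope=1, linestyle="--")  # above the diagonal: benefit outweighs risk
ax.set_xlabel("Mean perceived risk")
ax.set_ylabel("Mean perceived benefit")
ax.set_title("Risk-benefit map (simulated data)")
plt.show()
</code></pre>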
<p>“AI is widely recognized as a transformative technology affecting individuals, organizations, and society in countless ways,” Brauner told PsyPost. “This has led to a surge of research on public perceptions, expectations, and design requirements. However, most studies either focus narrowly on specific applications or address AI in very general terms.” </p>
<p>“Our study stands out in that it covers a wide variety of applications and imaginary futures and visualizes them in a ‘cognitive map’ of public perceptions. This perspective helps us understand overall risk–benefit tradeoffs, while also allowing other researchers to situate their more focused studies within the broader landscape. For instance, they can see whether their work addresses domains viewed as relatively safe and valuable, or ones considered more controversial.”</p>
<p>Across all scenarios, people tended to view future AI developments as fairly likely to occur. However, perceived risk was higher than perceived benefit on average, and overall sentiment was generally negative. Only a minority of scenarios were rated positively, and even those were often seen as risky.</p>
<p>The most positively rated scenarios included AI assisting with health improvement, providing conversation support for the elderly, and serving as a helper for everyday tasks. These were seen as both beneficial and relatively safe. In contrast, scenarios involving AI being used in warfare, making life-and-death decisions, or monitoring private life were rated as highly risky and strongly negative.</p>
<p>“We are often surprised by how participants evaluate certain topics,” Brauner said. “To look at just one example, the statement ‘AI will know everything about me’ was unsurprisingly rated as negative. Yet many people freely share personal details on social media or even with AI chatbots, and much of this data is indeed used to train models. This resembles the ‘privacy paradox’ observed in online privacy research: people express concerns yet behave in ways that contradict them.”</p>
<p>Some scenarios were viewed as likely to occur but also undesirable — such as AI being misused by criminals or automating surveillance. Others, such as AI helping people have better relationships or acting with a sense of responsibility, were seen as unlikely and received little support.</p>
<p>Notably, the perceived likelihood of a scenario had little or no correlation with its perceived benefit or risk. People seemed to evaluate the usefulness and moral desirability of AI independently from whether they believed the technology would become reality.</p>
<p>“Our representative sample of people in Germany believes AI is here to stay,” Brauner said. “However, across the many topics we surveyed, they tended to see AI as risky, with limited benefits and lower overall value. This creates a gap between public perceptions and the optimism often voiced by business leaders and politicians. Whether this reflects a specifically cultural perspective (driven by ‘German Angst?’) or our choice of topics remains an open question, and we would like to replicate the study in other countries.”</p>
<p>When the researchers used regression analysis to predict overall value judgments, they found that perceived benefit was the strongest predictor by far. Risk also played a role, but it had less weight in shaping general attitudes. Together, these two factors explained over 96 percent of the variation in how people evaluated each AI scenario. Perceived likelihood had no significant predictive power.</p>
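<p>A minimal sketch of that kind of regression, run on simulated scenario-level data (the variable names, effect sizes, and numbers below are assumptions, not the study’s), might look like this:</p>
<pre style="font:12px/16px monospace; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto;"><code># Illustrative regression on simulated scenario-level data (not the study's):
# overall value predicted from perceived benefit, risk, and likelihood.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 71                           # one row per scenario, as in the study
benefit = rng.uniform(0, 5, n)
risk = rng.uniform(0, 5, n)
likelihood = rng.uniform(0, 5, n)
# Simulated pattern: benefit dominates, risk matters less, likelihood not at all.
value = 0.8 * benefit - 0.3 * risk + rng.normal(scale=0.2, size=n)

df = pd.DataFrame(dict(value=value, benefit=benefit, risk=risk,
                       likelihood=likelihood))
fit = smf.ols("value ~ benefit + risk + likelihood", data=df).fit()
print(fit.summary())   # with the real data, R-squared exceeded 0.96
</code></pre>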
<p>“Somewhat surprisingly, value perceptions were influenced more strongly by perceived benefits than by perceived risks,” Brauner explained. “In other words, if developers or policymakers want to improve acceptance of AI, highlighting tangible benefits may be more effective than trying to mitigate risks. This is a controversial finding, given the real dangers of issues like data breaches or AI-generated misinformation.”</p>
<p>Individual differences also mattered. Older participants tended to rate AI as more risky and less beneficial, and their overall evaluations were more negative than those of younger participants. Gender played a minor role, with women reporting slightly lower overall sentiment toward AI. </p>
<p>However, the strongest individual predictors of positive attitudes were technology readiness and AI familiarity. People who reported feeling comfortable with technology or who had more exposure to AI systems were more likely to rate AI scenarios as beneficial and less likely to view them as threatening.</p>
<p>“Older participants tended to see more risks, fewer benefits, and lower value, while women generally assigned lower value to AI,” Brauner told PsyPost. “However, these differences were moderated by ‘AI readiness’ (knowledge and familiarity with the technology). This points to the importance of education: AI literacy should be integrated into school and university curricula, and individuals should also seek to better understand AI’s possibilities and limitations. AI will remain part of our lives, so society needs to make informed decisions about where it can be trusted (e.g., navigation) and where human oversight is indispensable (e.g., life-and-death decisions).”</p>
<p>The researchers also asked participants to identify the most important focus for AI governance. The top response, chosen by 45 percent, was ensuring human control and oversight. Other priorities included transparency, data protection, and social well-being.</p>
<p>Although the study offers a broad overview of AI attitudes across a wide range of domains, there are some limitations to consider. Because the survey included 71 different scenarios, participants rated only a small portion of them. Each scenario was evaluated briefly, with single-item scales, which may not capture the full complexity of people’s thoughts or values.</p>
<p>“The breadth of our approach is both its strength and its limitation,” Brauner said. “We can provide a ‘bird’s-eye view’ of AI as a transformative technology, but we cannot uncover the detailed motivations, benefits, and barriers for specific applications. That is the task of more focused studies, many of which are already published or underway.”</p>
<p>“Another limitation is the cultural context. Our survey reflects German perspectives, which may be influenced by cultural tendencies such as ‘German angst.’ It is essential to replicate this work in other countries and regions.”</p>
<p>“We have already <a href="https://arxiv.org/abs/2412.13841" target="_blank">conducted a small exploratory study</a> comparing German and Chinese students and found both different tradeoffs and differences in the absolute evaluations, but that sample was not representative and we could not control for key variables such as technology use,” Brauner continued. “Much more cross-cultural research is needed, and we would welcome collaborations with other researchers.”</p>
<p>“Looking ahead, we hope to refine and update our ‘map’ regularly to track how perceptions evolve as AI becomes more integrated into everyday life. But we also plan to focus more closely on specific applications: for example, AI as a decision aid and investigate how evaluations change when people experience errors or failures in these systems and how trust can be restored.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.techfore.2025.124304" target="_blank">Mapping public perception of artificial intelligence: Expectations, risk–benefit tradeoffs, and value as determinants for societal acceptance</a>,” was authored by Philipp Brauner, Felix Glawe, Gian Luca Liehner, Luisa Vervier, and Martina Ziefle.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/study-links-phubbing-sensitivity-to-attachment-patterns-in-romantic-couples/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Study links phubbing sensitivity to attachment patterns in romantic couples</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 28th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in the <em><a href="https://doi.org/10.1111/jopy.70012" target="_blank">Journal of Personality</a></em> sheds light on how attachment styles shape the way people respond to “phubbing”—that is, when a romantic partner is more focused on their phone than on face-to-face interaction. The findings suggest that people with higher levels of attachment anxiety tend to experience more emotional distress on days when they feel phubbed, including lower self-esteem and a greater sense of depression. They also appear more likely to retaliate.</p>
<p>Phubbing, a combination of the words “phone” and “snubbing,” refers to the perception that someone is ignoring you in favor of their phone. In romantic relationships, phubbing has been linked to lower relationship satisfaction, increased conflict, and reduced emotional well-being. As smartphones become more integrated into daily life, researchers are growing more interested in how technology use can interfere with face-to-face connections and shape emotional experiences within couples.</p>
<p>“Phones are everywhere—we use them for work, keeping in touch, entertainment, even paying for things or finding our way around,” said study author <a href="https://www.southampton.ac.uk/people/5x2fjv/doctor-claire-hart#research" target="_blank">Claire Hart</a>, an associate professor at the University of Southampton. “But I kept noticing how often, especially in restaurants, people end up more focused on their screens than on each other. That got me wondering what happens when this plays out at home, and how our relationships are affected when the phone takes priority over the person right in front of us.”</p>
<p>Previous studies suggest that when individuals feel ignored by their partner’s phone use, they may react with resentment or even mimic the behavior themselves. However, not everyone responds to phubbing in the same way. This variation led the authors of the new study to investigate whether individual differences in adult attachment—especially attachment anxiety and avoidance—could help explain these different reactions.</p>
<p>Attachment theory provides a framework for understanding how people relate to close others. Those with high attachment anxiety tend to fear abandonment and are especially sensitive to signs of rejection. In contrast, individuals high in attachment avoidance tend to distance themselves emotionally and feel uncomfortable with closeness. These patterns of thinking and feeling about relationships are shaped early in life and tend to persist into adulthood.</p>
<p>Hart and her colleagues recruited 196 adults who were living with a romantic partner in a relationship lasting at least six months. The average age of participants was about 36 years old, and most identified as female and heterosexual. Over the course of 10 days, participants completed an online diary survey. The first survey collected demographic information and measured attachment anxiety and avoidance. On the remaining days, participants reported on daily experiences of being phubbed, how they responded, and how they felt emotionally and relationally.</p>
<p>To measure perceived phubbing, participants responded to items like “My partner glanced at their phone while talking to me” and “My partner’s phone use interfered with our interactions.” They also rated their daily relationship satisfaction, self-esteem, mood, and anger. If they felt phubbed, they were asked how they responded—whether they felt resentful or curious, picked up their own phone in retaliation, or ignored the behavior. If they retaliated, they were asked to rate their motives, including whether they were seeking revenge, were bored, or were looking for support or approval from others.</p>
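<p>Daily-diary data like these are typically analyzed with multilevel models, since days are nested within people. The sketch below (simulated data, with assumed variable names and effect sizes rather than the authors’ code) shows one common approach: a mixed model with a random intercept per participant and a trait-by-day interaction term:</p>
<pre style="font:12px/16px monospace; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto;"><code># Illustrative multilevel ("mixed") model on simulated diary data.
# Variable names and effect sizes are assumptions, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_people, n_days = 196, 9          # 196 participants, 9 diary days
pid = np.repeat(np.arange(n_people), n_days)
person_icpt = np.repeat(rng.normal(scale=0.5, size=n_people), n_days)
phubbing = rng.uniform(0, 4, n_people * n_days)          # daily perceived phubbing
anxiety = np.repeat(rng.normal(size=n_people), n_days)   # trait attachment anxiety
# Simulate the reported moderation: phubbing hurts mood more at high anxiety.
mood = (3 - 0.3 * phubbing - 0.2 * phubbing * anxiety
        + person_icpt + rng.normal(scale=0.5, size=n_people * n_days))

df = pd.DataFrame(dict(pid=pid, phubbing=phubbing, anxiety=anxiety, mood=mood))
fit = smf.mixedlm("mood ~ phubbing * anxiety", df, groups=df["pid"]).fit()
print(fit.summary())   # the phubbing:anxiety term captures the moderation
</code></pre>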
<p>The researchers found that on days when people perceived more phubbing, they tended to report lower relationship satisfaction, more anger, and increased anxious mood. These effects were seen across the entire sample.</p>
<p>However, individuals with higher attachment anxiety experienced stronger emotional reactions. On days when they felt phubbed, they were more likely to report lower self-esteem and greater feelings of depression. These effects were not observed among participants with lower attachment anxiety, indicating that people with this trait may be more vulnerable to the emotional consequences of feeling ignored.</p>
<p>People high in attachment anxiety were also more likely to report retaliating against perceived phubbing. Their retaliatory behavior was commonly driven by a desire for support and approval. This aligns with previous research suggesting that people with anxious attachment often seek reassurance and connection, especially when they feel rejected or excluded.</p>
<p>“Not everyone experiences phubbing the same way,” Hart told PsyPost. “Attachment style – the habitual way people think and feel about relationships – plays a big role. People who are more anxious about being abandoned or who need lots of reassurance reacted more strongly when phubbed. They reported higher depressed mood, lower self-esteem, and greater resentment. They were also more likely to retaliate (pick up their own phone and start phubbing their partner) – to get support and approval from others in order to get their attachment needs met. While this kind of retaliation might offer immediate comfort, it can create a cycle of negative interactions.”</p>
<p>On the other hand, participants with high attachment avoidance were less likely to engage in conflict when they felt phubbed, consistent with their discomfort with emotional confrontation, but they were more likely to say they retaliated out of a desire for approval. This was somewhat unexpected, given that avoidant individuals tend to downplay the importance of close relationships. </p>
<p>“One thing that surprised us was how people who usually prefer distance in relationships (those high in attachment avoidance) still showed a stronger need for approval when they felt their partner was phubbing them,” Hart explained. “In other words, even people who normally don’t like to rely on others seemed motivated to seek some kind of validation when ignored by their partner for a phone.”</p>
<p>“We don’t yet know exactly what kind of approval they were looking for. It might be about presenting themselves well on social media, showing off achievements, or just looking for attention in a less personal way. Future research could delve into this by asking what people actually do when they retaliate—are they messaging friends, posting online, or just scrolling? And do those behaviours vary depending on someone’s attachment style?”</p>
<p>Across the entire sample, phubbing triggered a range of emotional and behavioral responses. Participants often reported feeling resentful or curious about their partner’s phone use. They were also more likely to retaliate or confront their partner when phubbing was perceived as high. These findings are consistent with the idea that even brief moments of perceived disconnection can carry emotional weight and influence how people behave in their relationships.</p>
<p>But the study, like all research, includes some caveats. The sample was not particularly diverse in terms of gender or sexual orientation, and most participants were heterosexual women. Future studies should aim to include more diverse participants to examine whether these findings hold across different relationship types and identities.</p>
<p>The study also relied entirely on self-reported data, which can be influenced by memory biases or social desirability. Additionally, the study focused on perceived phubbing, not on whether the partner was actually phubbing, which could differ in important ways.</p>
<p>“Looking ahead, we want to go beyond self-reports and see what’s happening in the body when someone is phubbed,” Hart said. “Does it change heart rate or stress levels, and do those physical reactions match up with what people say they feel—like anger, anxiety, or sadness? People might not always notice how strongly they’re reacting, or they may downplay it. By measuring the body’s response, we can get a fuller picture of how phubbing really affects people, both emotionally and physically. That insight can help us understand the true impact of phone use on our relationships and well-being.”</p>
<p>“Phubbing is becoming such a common part of modern life, yet we don’t fully understand how it affects our relationships and well-being. Phones aren’t going away, so understanding when and why they cause harm can help people use them more mindfully.”</p>
<p>The study, “<a href="https://doi.org/10.1111/jopy.70012" target="_blank">Attachment, Perceived Partner Phubbing, and Retaliation: A Daily Diary Study</a>,” was authored by Katherine B. Carnelley, Claire M. Hart, Laura M. Vowels, and Tessa Thejas Thomas.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-common-childhood-virus-could-be-silently-fueling-alzheimers-disease-in-old-age/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A common childhood virus could be silently fueling Alzheimer’s disease in old age</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 27th 2025, 21:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>The common cold sore virus, which is often caught in childhood, usually stays in the body for life – quietly <a href="https://www.who.int/news-room/fact-sheets/detail/herpes-simplex-virus">dormant</a> in the nerves. Now and then, things like stress, illness or injury can trigger it, bringing on a cold sore in some people. But this same virus – called herpes simplex virus type 1 – may also play an important role in something far more serious: Alzheimer’s disease.</p>
<p>Over 30 years ago, my colleagues and I made a <a href="https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(96)10149-5/abstract">surprising discovery</a>. We found that this cold sore virus can be present in the brains of older people. It was the first clear sign that a virus could be quietly living in the brain, which was long thought to be completely germ-free – protected by the so-called “blood-brain barrier”.</p>
<p>Then we discovered something even more striking. People who have a certain version of a gene (called APOE-e4) that increases their risk of Alzheimer’s, and who have been infected with this virus, have a risk that is <a href="https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(96)10149-5/abstract">many times greater</a>.</p>
<p>To investigate further, we studied brain cells that we <a href="https://www.sciencedirect.com/science/article/abs/pii/S0304394007010786?via%3Dihub">infected with the virus</a>. They produced the same abnormal proteins (amyloid and tau) found in the brains of people with Alzheimer’s.</p>
<p>We believe that the virus stays mainly dormant in the body for years – possibly decades. But later in life, as the immune system gets weaker, it can enter the brain and reactivate there. When it does, it will damage brain cells and trigger inflammation. Over time, repeated flare-ups could gradually cause the kind of damage that leads to Alzheimer’s in some people.</p>
<p>We later found the virus’s DNA <a href="https://pubmed.ncbi.nlm.nih.gov/18973185/">inside the sticky clumps of these proteins</a>, which are found in the brains of Alzheimer’s patients. Even more encouragingly, antiviral treatments reduced this damage in the lab, suggesting that drugs might one day help to slow or even prevent the disease.</p>
<p><a href="https://pubmed.ncbi.nlm.nih.gov/39956964/">Large population studies</a> by others found that severe infections, specifically with the cold sore virus, was a strong predictor of Alzheimer’s, and that specific antiviral treatment <a href="https://pubmed.ncbi.nlm.nih.gov/34136638/">reduced the risk</a>.</p>
<p>Our research didn’t stop there. We <a href="https://journals.sagepub.com/doi/abs/10.3233/JAD-220287">wondered if</a> other viruses that lie dormant in the body might have similar effects – such as the one responsible for chickenpox and shingles.</p>
<h2>Shingles vaccine offers another clue</h2>
<p>When we studied health records from hundreds of thousands of people in the UK, we saw something interesting. People who had shingles had only a slightly higher risk of developing dementia. Yet those who had the shingles vaccine were <a href="https://pubmed.ncbi.nlm.nih.gov/34625411/">less likely</a> to develop dementia.</p>
<p>A <a href="https://jamanetwork.com/journals/jama/fullarticle/2833335">new Stanford University-led study</a> gave similar results.</p>
<p>This supported our long-held proposal that preventing common infections could lower the risk of Alzheimer’s. Consistently, studies by others showed that infections were indeed a risk and that some other vaccines were protective against Alzheimer’s.</p>
<p>We <a href="https://www.science.org/doi/10.1126/scisignal.ado6430">then explored</a> how risk factors for Alzheimer’s such as infections and head injuries could trigger the hidden virus in the brain.</p>
<p>Using an advanced 3D model of the brain with a dormant herpes infection, we found that when we introduced other infections or simulated a brain injury, the cold sore virus reactivated and caused damage similar to that seen in Alzheimer’s. But when we used a treatment to reduce inflammation, the virus stayed inactive, and the damage didn’t happen.</p>
<p>All of this suggests that the virus that causes cold sores could be an important contributor to Alzheimer’s, especially in people with certain genetic risk factors. It also opens the door to possible new ways of preventing the disease, such as vaccines or antiviral treatments that stop the virus from waking up and harming the brain.</p>
<p>What began as a link between cold sores and memory loss has grown into a much bigger story – one that may help us understand, and eventually reduce, the risk of one of the most feared diseases of our time.</p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/thirty-years-on-our-research-linking-viral-infections-with-alzheimers-is-finally-getting-the-attention-it-deserves-254656">original article</a>.</em></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/its-not-social-media-whats-really-fueling-trump-shooting-conspiracies-might-surprise-you/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">It’s not social media: What’s really fueling Trump shooting conspiracies might surprise you</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 27th 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <em><a href="https://doi.org/10.1093/pnasnexus/pgaf193" target="_blank">PNAS Nexus</a></em> suggests that people are more likely to believe conspiracy theories about the 2024 attempted assassination of Donald Trump if they heard them from people they know rather than from social media or news outlets. While conspiracy theories spread widely across platforms like Facebook and X (formerly Twitter), the researchers found that interpersonal networks played a stronger role in shaping belief.</p>
<p>The study was motivated by a growing need to better understand how conspiracy theories take hold and spread, particularly in the aftermath of high-profile, emotionally charged events. Previous research has often centered on individual psychological traits, such as a tendency toward suspicion or political extremism, to explain why some people are more prone to conspiratorial beliefs. While these factors remain important, the current study aimed to explore a less studied domain: the role of communication networks and interpersonal influence.</p>
<p>The authors used the July 2024 assassination attempt on Donald Trump as a test case because it quickly became a flashpoint for politically charged misinformation. As with past events like the assassination of John F. Kennedy, the suddenness and ambiguity of the Trump shooting generated intense speculation and competing narratives. </p>
<p>Almost immediately, people began sharing theories across the political spectrum—some suggesting Democratic operatives were behind the attack, others alleging the whole event was staged to benefit Trump politically. This combination of salience, controversy, and rapid information flow created an ideal environment to examine how beliefs about conspiracy theories form and spread.</p>
<p>The research team collected survey data from a nationally diverse sample of 2,765 U.S. adults between July 17 and 21, just days after the shooting took place. The survey was conducted online through a non-probability sample, with efforts made to balance demographics like age, race, gender, and geographic region. To further improve representativeness, the researchers applied post-stratification weights based on census and voting data.</p>
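<p>Post-stratification weighting itself is straightforward: each respondent is weighted by the ratio of their stratum’s population share to its sample share, so under-represented groups count for more. A minimal sketch with invented strata and shares:</p>
<pre style="font:12px/16px monospace; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto;"><code># Illustrative post-stratification weights; strata and shares are invented.
import pandas as pd

sample = pd.DataFrame({"stratum": ["18-34", "18-34", "35-64", "65+", "35-64"]})
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}  # e.g., census

sample_share = sample["stratum"].value_counts(normalize=True)
sample["weight"] = sample["stratum"].map(
    lambda s: population_share[s] / sample_share[s]
)
print(sample)   # weighted analyses then use these weights instead of raw counts
</code></pre>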
<p><strong><em><a href="https://www.psypost.org/psypost-newsletter/" target="_blank" rel="noopener">Stay informed with the latest psychology and neuroscience research—sign up for PsyPost’s newsletter and get new discoveries delivered straight to your inbox.</a></em></strong></p>
<p>Participants were first asked if they had heard about the Trump assassination attempt. Those who were aware of the incident were then asked where they had received their information—options included television, radio, newspapers, social media, news websites, podcasts, or from people they know. Respondents could select multiple sources. The researchers used this information to examine how different channels of communication were linked to awareness and belief in two specific conspiracy theories: one claiming that Democratic operatives orchestrated the shooting, and another alleging that the event was staged altogether.</p>
<p>To assess belief in the theories, participants who had heard them were asked how likely they thought each was to be true. Responses ranged from “very unlikely” to “very likely” on a five-point scale. The researchers also collected demographic data, political orientation, approval of Trump, general interest in politics, and a standard measure of conspiratorial thinking known as the American Conspiracy Thinking Scale.</p>
<p>Statistical analyses included logistic regression to determine predictors of exposure to conspiracy theories and linear regression to assess what factors were associated with belief.</p>
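<p>In outline, that two-step analysis looks like the sketch below (simulated data, with assumed variable names; not the authors’ code): a logistic model for whether someone encountered a theory, then a linear model for belief among those who did:</p>
<pre style="font:12px/16px monospace; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow-x:auto;"><code># Illustrative two-step analysis on simulated data (not the authors' code):
# logistic regression for exposure, then OLS for belief among the exposed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2765
social_media = rng.integers(0, 2, n)     # heard via social media (0/1)
personal_ties = rng.integers(0, 2, n)    # heard via people they know (0/1)
acts = rng.normal(size=n)                # conspiratorial-thinking score
# Simulated pattern: social media drives exposure; ties drive belief.
p_exposed = 1 / (1 + np.exp(-(-0.5 + 1.2 * social_media + 0.4 * personal_ties)))
exposed = rng.binomial(1, p_exposed)
belief = np.clip(1.5 + 0.3 * personal_ties + 0.4 * acts
                 + rng.normal(scale=0.8, size=n), 0, 4)   # 0-4 belief scale

df = pd.DataFrame(dict(exposed=exposed, belief=belief, acts=acts,
                       social_media=social_media, personal_ties=personal_ties))
print(smf.logit("exposed ~ social_media + personal_ties", df).fit(disp=0).summary())
among_exposed = df[df["exposed"] == 1]   # belief modeled only among the exposed
print(smf.ols("belief ~ personal_ties + acts", among_exposed).fit().summary())
</code></pre>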
<p>Nearly all respondents (95%) reported being aware of the assassination attempt. The majority of those who were informed said they got their information from television (64%), followed by social media (43%), and personal contacts (30%).</p>
<p>Of those surveyed, 41% had heard the theory that Democratic operatives were behind the attack. Among this group, 53% had seen the claim on social media, 28% had seen it on television, and 32% had heard it from people they knew. Roughly 29% of those exposed to this theory believed it was likely to be true.</p>
<p>The second theory—that the event was staged—was even more widely circulated, with 53% of participants reporting they had encountered it. Of these, 52% saw it on social media, 34% heard it from personal contacts, and 21% saw it on television. About 29% of those who had heard this theory said they believed it was likely.</p>
<p>Social media appeared to be the main vector for initial exposure to both theories. However, when the researchers examined what influenced actual belief, a different picture emerged. People who heard the conspiracy theories through interpersonal networks were significantly more likely to believe them. This pattern held across both left-leaning and right-leaning narratives.</p>
<p>In contrast, social media use was not strongly linked to belief in the theories once exposure was accounted for. While it increased the likelihood of encountering conspiracy content, it did not appear to increase the chance that people would believe it. This finding runs counter to the often-repeated assumption that social media is the primary engine of conspiracy belief.</p>
<p>Other factors associated with belief included approval of Trump, political partisanship, and a higher score on the conspiratorial thinking scale. These variables tended to predict belief in expected ways. For instance, Republicans and Trump supporters were more likely to believe the Democratic operative theory, while Democrats were somewhat more open to the idea that the event was staged.</p>
<p>Notably, among all the information sources analyzed, interpersonal communication was the only one consistently and positively associated with belief in both conspiracy theories. Hearing about a conspiracy from someone personally known increased the perceived likelihood that the theory was true by 0.2 to 0.4 points on a 0–4 scale.</p>
<p>These findings indicate that conspiratorial thinking is not just a function of individual psychology or online media exposure, but also a social process embedded in everyday relationships. However, there are some limitations to consider. The survey used a non-probability sample, which may not fully represent the U.S. population despite statistical adjustments. </p>
<p>In addition, the study was observational, so it cannot determine causation. It remains unclear whether people adopt conspiracy beliefs because of their social networks or whether they seek out relationships with people who already share those beliefs. Longitudinal or experimental research would be needed to untangle these possibilities.</p>
<p>Future research could explore how strong or confident people are in their beliefs, not just whether they accept a theory. Another promising direction would be to examine the structure of conspiratorial social networks: are there particular patterns of relationship or communication styles that make belief more likely to take hold? Understanding these dynamics could help in designing more effective interventions aimed at reducing the social spread of misinformation.</p>
<p>The researchers also suggest that future studies should consider how interpersonal influence interacts with other media, such as algorithm-driven news feeds, and how conversations about conspiracy theories evolve over time.</p>
<p>The study, “<a href="https://doi.org/10.1093/pnasnexus/pgaf193" target="_blank">Information from social ties predicts conspiracy beliefs: Evidence from the attempted assassination of Donald Trump</a>,” was authored by Katherine Ognyanova, James N. Druckman, Jonathan Schulman, Matthew A. Baum, Roy H. Perlis, and David Lazer.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/autisms-odd-gait-autistic-movement-differences-linked-to-brain-development/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Autism’s “odd gait”: Autistic movement differences linked to brain development</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 27th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Autism is a neurodevelopmental condition that affects how people’s brains develop and function, impacting behaviour, communication and socialising. It can also involve differences in the way you move and walk – known as your “<a href="https://www.researchgate.net/publication/357844247_Gait_analysis_overview_trends_and_challenges">gait</a>”.</p>
<p>Having an “odd gait” is now listed in the Diagnostic and Statistical Manual of Mental Disorders as a <a href="https://a4.org.au/book/export/html/1568#:~:text=Associated%20Features%20Supporting%20Diagnosis&text=Even%20those%20with%20average%20or,greatest%20in%20the%20adolescent%20years.">supporting diagnostic feature</a> of autism.</p>
<h2>What does this look like?</h2>
<p>The most noticeable gait differences among autistic people are:</p>
<ul>
<li>toe-walking, walking on the balls of the feet</li>
<li>in-toeing, walking with one or both feet turned inwards</li>
<li>out-toeing, walking with one or both feet turned out.</li>
</ul>
<p>Research has also identified more subtle differences. A study summarising <a href="https://onlinelibrary.wiley.com/doi/epdf/10.1002/aur.2443">30 years of research among autistic people reports</a> that gait is characterised by:</p>
<ul>
<li>walking more slowly</li>
<li>taking wider steps</li>
<li>spending longer in the “stance” phase, when the foot is on the ground</li>
<li>taking more time to complete each step.</li>
</ul>
<p>Autistic people show much more <a href="https://onlinelibrary.wiley.com/doi/epdf/10.1002/aur.2443">personal variability</a> in the length and speed of their strides, as well as their walking speed.</p>
<p>Gait differences also tend to occur alongside other motor differences, such as issues with balance, coordination, postural stability and handwriting. Autistic people <a href="https://pubmed.ncbi.nlm.nih.gov/20195737/">may need support</a> for these other motor skills.</p>
<h2>What causes gait differences?</h2>
<p>These are largely due to <a href="https://jamanetwork.com/journals/jamaneurology/article-abstract/592653">differences</a> in <a href="https://jamanetwork.com/journals/jamaneurology/article-abstract/580155">brain development</a>, specifically in areas known as the <a href="https://www.cambridge.org/core/journals/developmental-medicine-and-child-neurology/article/abs/gait-function-in-newly-diagnosed-children-with-autism-cerebellar-and-basal-ganglia-related-motor-disorder/FE4B64AB25C010412AF00904E20E3FFB">basal ganglia</a> and <a href="https://link.springer.com/article/10.1007/s00787-006-0530-y">cerebellum</a>.</p>
<p>The basal ganglia are <a href="https://movementdisorders.onlinelibrary.wiley.com/doi/abs/10.1002/mds.870130310">broadly responsible</a> for sequencing movement, including through shifting posture. They ensure your gait appears effortless, smooth and automatic.</p>
<p>The cerebellum then uses visual and proprioceptive information (to sense the body’s position and movement) to adjust and time movements to <a href="https://journals.sagepub.com/doi/abs/10.1177/1073858404263517">maintain postural stability</a>. It <a href="https://link.springer.com/article/10.1080/14734220601187741">ensures</a> movement is controlled and coordinated.</p>
<p>Developmental differences in these brain regions <a href="https://pubmed.ncbi.nlm.nih.gov/16182941/">relate</a> to the way the areas look (their structure), how they work (their function and activation) and how they “speak” to other areas of the brain (their connections).</p>
<p>While some researchers have suggested that autistic gait occurs due to delayed development, we now know gait differences persist across the lifespan. Some differences actually become <a href="https://onlinelibrary.wiley.com/doi/epdf/10.1002/aur.2443">clearer with age</a>.</p>
<p>In addition to brain-based differences, the autistic gait is also <a href="https://www.sciencedirect.com/science/article/pii/S1750946724001326">associated with</a> factors such as the person’s broader motor, language and cognitive capabilities.</p>
<p>People with more complex support needs might have more pronounced gait or motor differences, together with language and cognitive difficulties.</p>
<p>Motor dysregulation <a href="https://www.frontiersin.org/journals/integrative-neuroscience/articles/10.3389/fnint.2025.1634265/full">might indicate sensory or cognitive overload</a> and be a useful marker that the person might benefit from extra support or a break.</p>
<h2>How is it managed?</h2>
<p>Not all differences need to be treated. Instead, clinicians take an individualised and goals-based approach.</p>
<p>Some autistic people might have subtle gait differences that are observable during testing. But if these differences don’t impact a person’s ability to participate in everyday life, they don’t require support.</p>
<p>An autistic person is likely to benefit from support for gait differences if they have a functional impact on their daily life. This might include:</p>
<ul>
<li>increased risk of, or frequent, falls</li>
<li>difficulty participating in the physical activities they enjoy</li>
<li>physical consequences such as tightness of the Achilles and calf muscles, or associated pain in other areas, such as the feet or back.</li>
</ul>
<p>Some children may also benefit from support for motor skill development. However, this doesn’t have to occur in a clinic.</p>
<p>Given children spend a large portion of their time at school, programs that integrate opportunities for movement throughout the school day allow autistic children to develop motor skills outside of the clinic and alongside peers. We developed the <a href="https://www.monash.edu/medicine/psych/research/neurodevelopment/allplay-child-and-family-program/joy-of-moving-program-in-australia">Joy of Moving Program in Australia</a>, for example, which gets students moving in the classroom.</p>
<p>Our <a href="https://www.sciencedirect.com/science/article/pii/S175094672300171X">community-based intervention studies</a> <a href="https://link.springer.com/article/10.1007/s10803-021-04933-w">show</a> autistic children’s movement abilities can improve after engaging in community-based interventions, such as sports or dance.</p>
<p>Community-based support models empower autistic children to have agency in how they move, rather than seeing different ways of moving as a problem to be fixed.</p>
<h2>Where to from here?</h2>
<p>While we have learnt a lot about autistic gait at a broad level, researchers and clinicians are still seeking a better understanding of why and when individual variability occurs.</p>
<p>We’re also still determining how to best support individual movement styles, including among children as they develop.</p>
<p>However, there is <a href="https://www.frontiersin.org/journals/pediatrics/articles/10.3389/fped.2025.1475019/full">growing evidence</a> that physical activity enhances social skills and behavioural regulation in preschool children with autism.</p>
<p>So it’s encouraging that states and territories are moving towards more <a href="https://theconversation.com/how-to-reform-the-ndis-and-better-support-disabled-people-who-dont-qualify-for-it-258799">community-based foundational supports</a> for autistic children and their peers, as governments develop supports outside the National Disability Insurance Scheme (NDIS).</p>
<p> </p>
<p><em>The authors thank the late <a href="https://www.monash.edu/vale/home/articles/vale-emeritus-professor-johnson-lockyer-bradshaw">Emeritus Professor John Bradshaw</a> for his early input into this piece.</em></p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/why-do-some-autistic-people-walk-differently-231685">original article</a>.</em></p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-achieve-striking-memory-improvements-by-suppressing-brain-protein/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists achieve “striking” memory improvements by suppressing brain protein</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 27th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1038/s43587-025-00940-z" target="_blank" rel="noopener">Nature Aging</a></em> provides evidence that a single protein in the brain may play a central role in age-related memory loss—and that suppressing this protein could restore cognitive function in older animals. Researchers at the University of California, San Francisco found that increasing levels of a protein called ferritin light chain 1 (Ftl-1) in neurons impairs memory and synaptic function in young mice, while reducing its levels in aged mice rejuvenates brain function.</p>
<p>Cognitive decline is a common feature of aging, even in the absence of disease. Past work has shown that this decline is not primarily caused by neuron death, but rather by changes in how neurons function, especially at their synapses—points of communication between cells that are essential for learning and memory.</p>
<p>The research team sought to identify specific molecules that contribute to this decline and could potentially be targeted to reverse it. The hippocampus, a brain region known for its role in memory, is especially vulnerable to aging. The researchers reasoned that if they could uncover the molecular drivers of age-related hippocampal dysfunction, it might open the door to therapeutic interventions for cognitive aging—and perhaps even for age-related diseases like Alzheimer’s.</p>
<p>“Since I was a kid, I’ve always felt passionate about the brain, and in the last few years I’ve become really excited about aging. Joining Dr. Saul Villeda’s lab at UCSF as a postdoc gave me the chance to connect my two passions—brain and aging—and work on the mechanisms responsible for brain aging,” said Laura Remesal, the lead author of the new study and the founding scientist of Babylon Biosciences.</p>
<p>The study began by examining changes in gene expression in neurons taken from the hippocampi of young and aged mice. Using RNA sequencing, the team identified dozens of genes that were expressed at higher or lower levels with age. When they compared these transcriptional changes with age-related shifts in protein expression, measured by mass spectrometry, one molecule stood out: Ftl-1.</p>
<p>Ftl-1 is a component of ferritin, a protein complex that stores iron in cells. The researchers found that Ftl-1 was elevated in the hippocampal neurons of aged mice and that its expression levels were strongly linked to poorer performance on memory tasks. This correlation suggested that Ftl-1 might be more than just a marker of aging—it could be a contributing factor.</p>
<p>To test this idea, the team used a virus-based method to increase Ftl-1 expression specifically in the hippocampal neurons of young mice. The consequences were significant. These mice exhibited structural changes in their neurons, including shorter dendrites and fewer synapses—features typically associated with aging. They also performed worse on memory tests, showing little interest in exploring new objects or novel maze arms, in contrast to healthy young controls.</p>
<p>Next, the researchers reversed course. They used several approaches—short hairpin RNA, CRISPR gene editing, and conditional knockout models—to reduce Ftl-1 levels in the hippocampi of aged mice. Remarkably, this intervention led to improved memory performance. Aged mice with reduced Ftl-1 performed better on tests of recognition and spatial memory, and their hippocampal neurons showed more youthful characteristics, including restored synaptic markers and improved signaling.</p>
<p>“The degree of improvement in memory and synaptic measures were striking,” Remesal told PsyPost. “It suggested that changing a single aging-related factor can produce meaningful functional gains.”</p>
<p>To understand how Ftl-1 affects neuron function, the team looked more closely at cellular processes. They found that Ftl-1 overexpression disrupted the balance between different oxidation states of iron within neurons, increasing the level of oxidized iron. This shift can interfere with mitochondrial function, particularly the cell’s ability to produce ATP, which is critical for energy-intensive processes like maintaining synaptic activity.</p>
<p>Indeed, neurons overexpressing Ftl-1 showed diminished ATP production. In contrast, knocking down Ftl-1 enhanced cellular energy output. These results pointed to a link between iron metabolism, mitochondrial health, and brain aging.</p>
<p>The researchers wondered if they could counteract the effects of Ftl-1 by supporting mitochondrial function directly. To test this, they gave mice a supplement called NADH, which plays a key role in ATP production during oxidative phosphorylation. In mice that had been genetically altered to overexpress Ftl-1, NADH supplementation improved both neuronal structure and memory performance. These animals, which had previously failed to show a preference for novel objects or maze arms, regained that ability after treatment.</p>
<p>At the molecular level, NADH appeared to rescue the neurons’ energy metabolism. RNA sequencing of neurons from treated mice revealed increased expression of genes involved in mitochondrial respiration and ATP synthesis, such as Sdhb, Atp5o, and Ndufa10.</p>
<p>Taken together, these findings indicate that Ftl-1 contributes to brain aging by disrupting iron homeostasis and energy metabolism, and that both removing this protein and supporting mitochondrial function can restore cognitive abilities.</p>
<p>“The most important takeaway is that cognitive impairment can be reversed, not just prevented or delayed,” Remesal explained. “Treating the aged brain might have more potential than we first thought.”</p>
<p>While the results are promising, the study was conducted entirely in mice. The extent to which these findings will translate to humans remains unknown. Ftl-1, or FTL in humans, performs a conserved role in iron storage, and mutations in the gene have been linked to a rare neurodegenerative disorder known as neuroferritinopathy. Directly targeting FTL in people would require careful evaluation of safety, as altering iron storage could have unintended consequences.</p>
<p>“This work was done in mice, so I’m wary of drawing hard translational lines to humans,” Remesal said. “Still, the biology is indeed shared between mouse and human and the protein plays the same core role. This gives us confidence that this protein therefore might be a therapeutic target worth pursuing.”</p>
<p>The authors are optimistic that their findings provide a foundation for future work. They note that iron dysregulation has already been implicated in Alzheimer’s disease and other neurodegenerative conditions. In fact, elevated ferritin levels in the cerebrospinal fluid have been shown to predict cognitive decline over time in people with mild cognitive impairment.</p>
<p>The study also supports a growing body of research suggesting that the aging brain remains plastic—that is, capable of recovering function. Targeting individual molecular pathways, even late in life, might not only slow cognitive decline but partially reverse it. This perspective marks a shift away from the idea that age-related memory loss is inevitable and irreversible.</p>
<p>The next steps for the team include investigating whether Ftl-1-targeted therapies can benefit mouse models of neurodegenerative diseases. They are also interested in examining how Ftl-1 is regulated during aging and whether its effects differ across brain regions. Ultimately, they hope to explore whether similar interventions could be developed for human use.</p>
<p>The study, “<a href="https://doi.org/10.1038/s43587-025-00940-z" target="_blank" rel="noopener">Targeting iron-associated protein Ftl1 in the brain of old mice improves age-related cognitive impairment</a>,” was authored by Laura Remesal, Juliana Sucharov-Costa, Yuting Wu, Karishma J. B. Pratt, Gregor Bieri, Amber Philp, Mason Phan, Turan Aghayev, Charles W. White III, Elizabeth G. Wheatley, Bende Zou, Brandon R. Desousa, Julien Couthouis, Isha H. Jian, Xinmin S. Xie, Yi Lu, Jason C. Maynard, Alma L. Burlingame, and Saul A. Villeda.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/some-neurocognitive-deficits-from-covid-19-may-last-for-years-study-suggests/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Some neurocognitive deficits from COVID-19 may last for years, study suggests</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 27th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>For people struggling with post-COVID “brain fog,” new findings suggest that cognitive recovery is possible, though it may take years. A large-scale study recently published in <em><a href="https://doi.org/10.1016/j.bbih.2025.101093" target="_blank">Brain, Behavior, & Immunity – Health</a></em> tracked cognitive changes over three and a half years and found significant improvements in most mental functions. Yet even with this progress, some participants continued to experience deficits in mental speed and flexible thinking.</p>
<p>Since the early days of the COVID-19 pandemic, patients have reported lingering symptoms such as mental slowness, forgetfulness, and trouble concentrating—often described collectively as “brain fog.” While several studies have examined the short-term impact of COVID-19 on brain function, little was known about how these symptoms evolve over several years.</p>
<p>Previous research has offered a mixed picture. Some studies have suggested that cognitive symptoms may improve within a year, while others point to longer-lasting effects, especially in people who were hospitalized. Many of these studies, however, relied on self-reported symptoms or brief online tests, and few used comprehensive in-person assessments with validated neuropsychological tools. Moreover, most did not include diverse or younger populations, making it harder to draw conclusions about how widespread or enduring these impairments might be.</p>
<p>To fill these gaps, researchers from the Mount Sinai Health System launched a long-term prospective study with a large, diverse sample of adults who had confirmed COVID-19 infections. Their goal was to track cognitive changes using validated testing methods and to determine which factors might influence the pace and extent of recovery.</p>
<p>“The focus of my research program pre-pandemic was on neurocognition in the context of chronic medical illnesses,” explained study author <a href="https://www.linkedin.com/in/jacqueline-h-becker-ph-d-1b778516/" target="_blank">Jacqueline H. Becker</a>, a clinical neuropsychologist and assistant professor of medicine at <a href="https://profiles.mountsinai.org/jacqueline-z-helcer" target="_blank">the Icahn School of Medicine at Mount Sinai</a>.</p>
<p>“Being at Mount Sinai, right at the epicenter of the pandemic, I witnessed firsthand the toll COVID-19 was taking on patients and families. At the time, no one anticipated something like Long COVID, but it was immediately clear that the effects of the pandemic would be long lasting, especially for brain health. I felt a strong responsibility to contribute to the recovery effort in some way, which eventually became a natural extension of my work.”</p>
<p>“At the inception of Mount Sinai Health System’s (MSHS) Post-COVID registry, we decided to include a neurocognitive battery to track the effects of infection on cognition over several years. This turned out to be crucial, as we became one of the first and only centers globally to collect objective, in-person neuropsychological measures as early as April 2020. That early work positioned us to better understand the long-term cognitive consequences of COVID-19 and how they evolve over time.”</p>
<p>The research team analyzed data from 1,553 participants in the Mount Sinai Post-COVID-19 Registry. These adults were recruited from a pool of patients who had tested positive for COVID-19 and received care at one of Mount Sinai’s facilities in New York City, starting in April 2020 and followed through January 2024.</p>
<p>Participants completed a comprehensive battery of neuropsychological assessments designed to measure various domains of cognition, including attention, working memory, verbal learning, memory recall, language fluency, processing speed, and executive functioning. These tests were administered in either English or Spanish, depending on the participant’s preference, and results were standardized to account for age, sex, and education.</p>
<p>“Unlike many prior investigations that relied on self-reported ‘brain fog’ or online testing, we used validated, in-person neuropsychological assessments, strengthening the validity of the findings,” Becker said.</p>
<p>Participants were tested once a year, up to four times, depending on when they joined the study. In addition to cognitive testing, they completed surveys about their medical history, mental health symptoms such as anxiety and depression, and fatigue levels. Other factors such as vaccination status, body mass index, and where participants received care during their initial infection (outpatient, emergency room, or inpatient) were also recorded.</p>
<p>To ensure the integrity of the results, the study excluded anyone with a prior diagnosis of cognitive problems or those who showed signs of suboptimal effort on performance validity tests.</p>
<p>At the start of the study—when participants were assessed within six months of recovering from COVID-19—many showed mild to moderate impairments across multiple cognitive areas. On average, their test scores were about half to one and a half standard deviations below the normative mean.</p>
<p>“Like earlier studies, our results confirm that cognitive impairment is common after COVID-19 and can persist well beyond the acute phase,” Becker told PsyPost.</p>
<p>Over the following 42 months, most cognitive domains showed signs of improvement. In particular, verbal learning and memory recall demonstrated the largest gains. For example, verbal learning scores improved from an initial average of 1.26 standard deviations below the mean to within the normal range by the end of the study period. Language abilities, such as phonemic and semantic fluency, also improved over time.</p>
<p>Processing speed and executive functioning showed more modest gains and were still below average after three and a half years. While there were measurable improvements, the average scores for these domains remained about 1.5 standard deviations below the norm, suggesting lingering impairments for some participants.</p>
<p>“Most earlier studies stopped at 12–24 months; our study was among the first to show trajectories out to 42 months (3.5 years), providing stronger evidence of both recovery and plateauing,” Becker said.</p>
<p>Interestingly, attention and working memory showed relatively little change over time. These areas were only mildly impaired at baseline, which may have left less room for measurable improvement.</p>
<p>“The most important takeaways are that, while cognitive recovery may be gradual, it is a very real possibility for most people with Long COVID,” Becker told PsyPost. “However, there may be a subset that remain with deficits even several years later. Overall, this suggests that people with Long COVID need more support for recovery.”</p>
<p>Participants who had a body mass index under 25—a range generally considered normal—tended to experience greater cognitive improvements. No other factors, including age, sex, vaccination status, or the severity of the initial COVID-19 infection, significantly influenced the pace of cognitive recovery.</p>
<p>“It was surprising that COVID-19 severity did not predict cognitive recovery trajectories and that lower BMI was the only factor associated with cognitive recovery over time,” Becker said.</p>
<p>The study offers one of the most comprehensive long-term assessments of cognitive recovery after COVID-19 to date. Still, the authors acknowledge some limitations. One key issue is that the sample may not fully represent all people with Long COVID. While the study recruited from a broad base, those with the most severe symptoms or those who had fully recovered might have been less likely to participate, potentially skewing the results.</p>
<p>The average participant was relatively young and well-educated, which could limit how generalizable the findings are to older adults or those with fewer cognitive or educational resources. It’s also possible that some of the observed improvements were partly due to practice effects, although the use of alternate test forms and annual spacing of assessments likely helped minimize this.</p>
<p>“Questions that remain unanswered are, what biological mechanisms (e.g., inflammation, vascular health, neuroplasticity) drive both persistence and recovery of deficits and which interventions (e.g., cognitive rehabilitation, lifestyle modification, medical therapies) can accelerate or optimize recovery,” Becker explained.</p>
<p>The researchers suggest that future studies could explore targeted interventions to promote recovery, such as cognitive rehabilitation or lifestyle changes. They also advocate for research into biological markers—such as inflammation levels—that might help predict who is most at risk for long-term impairment and who is likely to recover more quickly.</p>
<p>“Currently, I am working on developing and piloting interventions (including cognitive rehabilitation) to accelerate cognitive recovery in people with Long COVID,” Becker told PsyPost. “My long-term goals are to increase awareness of Long COVID and other infection-associated chronic conditions, inform health policy to better support affected individuals, and ultimately improve quality of life for patients through evidence-based care.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.bbih.2025.101093" target="_blank">Neurocognitive Trajectories in Long COVID: Evidence from Longitudinal Analyses</a>,” was authored by Jacqueline H. Becker, Jia Li, Jenny J. Lin, Alex Federman, Emilia Bagiella, Minal S. Kale, Daniel Fierer, Logan Bartram, and Juan P. Wisnivesky.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>