<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/psychologists-implant-false-beliefs-to-understand-how-human-memory-fails/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Psychologists implant false beliefs to understand how human memory fails</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 14th 2026, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in <a href="https://doi.org/10.1002/acp.70172"><em>Applied Cognitive Psychology</em></a> provides evidence that the false beliefs and memories people form depend on how believable an event is and how often they are told it occurred. The findings suggest that highly plausible events are much more likely to generate false beliefs, but only when people are led to believe the event happened just once. These insights help clarify how suggestion can distort human memory in everyday situations and legal settings.</p>
<p>To understand the new study, it helps to distinguish between false beliefs and false memories. A false belief occurs when a person is confident that a specific event happened to them, even if they cannot visualize it. A false memory goes a step further and involves vivid, sensory details of an event that never actually took place, making it feel like a genuine recollection.</p>
<p>While memory is generally reliable, it is not perfect. It is reconstructive, meaning it tends to be malleable and prone to errors. When people are exposed to suggestive questions or misleading information, they can sometimes adopt false beliefs or false memories.</p>
<p>The new study was authored by Mara Georgiana Moldoveanu of Maastricht University, Babeș-Bolyai University and KU Leuven; Ahmad Shahvaroughi of KU Leuven; Ivan Mangiulli of KU Leuven and the University of Bari Aldo Moro; Javad Hatami of the University of Tehran; and Henry Otgaar of Maastricht University and KU Leuven.</p>
<p>The researchers sought to address a specific gap in memory research regarding event frequency. Past work suggested that telling someone an event happened repeatedly did not significantly change their likelihood of forming a false memory compared to telling them it happened only once. But no previous study had looked at how this suggested frequency interacts with the plausibility of the event itself.</p>
<p>Understanding this interaction has practical importance for legal and therapeutic environments. In real life, witnesses or victims are sometimes suggestively interviewed about repeated past abuses or highly implausible situations, such as ritualistic events. The researchers wanted to see if false memories for repeated or unlikely events could be reliably generated in a laboratory setting to better understand these real-world scenarios.</p>
<p>“Suggestion-based false beliefs and memories elicited are not just lab curiosities; real-world cases of false accusations or wrongful convictions sometimes involve suggestive therapy or interviews in which single, but also repeated abuse is sometimes suggested,” the researchers told PsyPost.</p>
<p>“So far, we are still investigating whether false memories for repeated events can be reliably elicited in the laboratory. Abuse reports may also involve less plausible events, such as satanic rituals. Such real-life cases ultimately represent the actual inspiration for this study and line of research, which is really aimed to be practically relevant besides interesting.”</p>
<p>To explore this, the researchers used a technique known as a blind implantation paradigm. In traditional implantation paradigms, scientists often had to contact a participant’s parents to verify whether childhood events had actually occurred. The blind implantation method avoids this by relying entirely on the individual’s own initial answers to establish a baseline of what they believe to be true or false.</p>
<p>Initially, 855 participants from Western Europe, Romania, and Iran completed an online survey. In this first survey, participants read a list of 20 childhood events and indicated whether they had experienced them. Two of these items were critical events designed for the experiment.</p>
<p>One was a high-plausibility event, which was losing a toy as a child. The other was a low-plausibility event, which was almost drowning in the ocean. The scientists filtered the group to find people who stated they had never experienced the two critical events.</p>
<p>One week later, 103 of these eligible participants completed a follow-up survey. This final sample had an average age of 33.7 years, and 62.1 percent identified as women. In the second phase, the researchers presented each participant with a personalized list of five events.</p>
<p>Four were true events the participant had actually experienced, and one was the false critical event they had previously denied experiencing. The participants were randomly assigned to different groups where the false event was either highly plausible or less plausible.</p>
<p>Additionally, the researchers altered the suggested frequency of the false event. They told some participants the event happened once, while they told others it happened repeatedly during their childhood.</p>
<p>The final breakdown included 25 people in the highly plausible single group, 26 in the highly plausible repeated group, 30 in the low-plausibility single group, and 22 in the low-plausibility repeated group. The scientists asked the participants to rate their belief that the event occurred and their actual recollection of the event on an eight-point scale.</p>
<p>Next, the researchers instructed the participants to vividly imagine the event. The participants were asked to visualize the details, such as where it took place and who was there, and write down their thoughts. After this imagination exercise, the participants rated their belief and recollection a second time.</p>
<p>The data revealed a distinct interaction between event plausibility and suggested frequency for false beliefs. When the scientists suggested the event happened only once, the highly plausible event generated much higher false belief ratings than the low-plausibility event. Up to 52 percent of people in the highly plausible, single-occurrence group developed a false belief.</p>
<p>In comparison, only 10 percent of those in the low-plausibility, single-occurrence group formed a false belief. This suggests that plausibility plays a major role in shaping beliefs when an event is presented as an isolated incident. This difference vanished when the researchers suggested the event happened repeatedly.</p>
<p>In the repeated conditions, the plausibility of the event did not have a statistically significant effect on how strongly people believed it happened. The rate of false belief was 38.5 percent for the highly plausible repeated group and 22.7 percent for the low-plausibility repeated group. The lowest rate of false belief, at 9.1 percent, appeared before the imagination exercise in the group given the low-plausibility event that supposedly happened repeatedly.</p>
<p>The researchers suspect this interaction might relate to script theory, which involves a person’s general knowledge of how typical events unfold. When an event is described as happening repeatedly, it might activate existing knowledge about routine events, making it feel familiar regardless of its actual plausibility. On the other hand, if a person lacks vivid memories of a supposedly repeated event, they might reject the suggestion entirely.</p>
<p>When looking at false memories, which require actual sensory recollection, the overall rates were lower than those of false beliefs. Before the imagination exercise, false memory ratings did not differ significantly based on the plausibility of the event or the suggested frequency. False memory rates hovered between 9.1 percent and 16 percent across the different experimental groups.</p>
<p>After the participants were asked to imagine the event, an interaction emerged, but neither plausibility nor frequency alone showed a strong, independent effect on forming detailed false memories. These findings provide evidence that false beliefs are easier to implant when an event seems likely and is framed as an isolated incident.</p>
<p>The results highlight “that memory is not a recording video camera; it is generally reliable, but also malleable and prone to errors such as false beliefs (confidence that an event happened) and memories (vivid details that feel real),” the researchers said. “These could have real consequences in everyday life, but more so in legal contexts.”</p>
<p>“That does not mean we should distrust our memory and memories! It just means we have to consider context (e.g., suggestive questions) and factors such as event plausibility or suggested frequency when evaluating our memories, especially in court settings.”</p>
<p>“This study is a single investigation of the (interactive) effects of plausibility and suggested frequency on false beliefs and false memories,” the researchers added. “Are the obtained effects large enough for clear field implications? In false memory research, even a single detail change in testimony, such as believing the suspect had a red car, can have practical consequences, such as starting a criminal investigation.”</p>
<p>“We found practically relevant effects (e.g., for plausibility), but this meets only one criterion for field use, such as expert witness reports; generalizability (across ages, paradigms) and replicability (direct repeats) are still needed for confidence. Thus, the study advances the field, but requires cumulative evidence.”</p>
<p>As with any study, there are a few limitations to consider with this research. The final sample size of 103 participants was smaller than the researchers originally planned, which means the study might not have had enough statistical power to detect smaller effects. Some participants might also have genuinely experienced the critical events but forgotten them during the first survey, meaning they reported a true recovered memory rather than a newly implanted false one.</p>
<p>Future research will need to replicate these findings with larger groups to ensure they apply broadly across different populations. The scientists also plan to use events where the absolute truth is known to better separate false memories from forgotten true memories. Identifying the boundary conditions of false memory formation remains an ongoing project that requires cumulative evidence before being firmly applied in courtroom settings.</p>
<p>“This line of research is part of a broader scientific community studying false memories,” the researchers explained. “The main goals are to identify factors that influence false belief and false memory formation, and to develop ways to reduce false beliefs and memories and their potentially harmful consequences.”</p>
<p>“Another important aspect of this work is determining when the findings can be confidently applied in real-world settings, for example, in the courtroom. This specific project is also connected to research on self-deception, which may represent a form of motivated false belief. Several labs and researchers are investigating these timely topics.”</p>
<p>“For our work and that of our colleagues, you can visit our publications page here: <a href="https://celleuven.wixsite.com/home/publications" target="_blank" rel="noopener">https://celleuven.wixsite.com/home/publications</a>. You might also want to explore research from groups at universities such as the University of Chicago, Cornell University, University College Dublin, Amsterdam University, etc., that are doing work on (false) memory and belief.”</p>
<p>The study, “<a href="https://doi.org/10.1002/acp.70172" target="_blank" rel="noopener">The Effect of Plausibility and Suggested Event Frequency on the Implantation of False Beliefs and Memories</a>,” was authored by Mara Georgiana Moldoveanu, Ahmad Shahvaroughi, Ivan Mangiulli, Javad Hatami, and Henry Otgaar.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/terry-pratchetts-novels-held-clues-to-his-dementia-a-decade-before-diagnosis-new-study-suggests/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Terry Pratchett’s novels held clues to his dementia a decade before diagnosis, new study suggests</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 14th 2026, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>The earliest signs of dementia are rarely dramatic. They do not arrive as forgotten names or misplaced keys, but as changes so subtle they are almost impossible to notice: a slightly narrower vocabulary, less variation in description, a gentle flattening of language.</p>
<p>New <a href="https://www.mdpi.com/2076-3425/16/1/94">research</a> my colleagues and I conducted suggests that these changes may be detectable years before a formal diagnosis — and one of the clearest examples may lie hidden in the novels of Sir Terry Pratchett.</p>
<p>Pratchett is remembered as one of Britain’s most imaginative writers, the creator of the Discworld series and a master of satire whose work combined humour with sharp moral insight. Following his diagnosis of <a href="https://www.alzheimers.org.uk/about-dementia/types-dementia/posterior-cortical-atrophy">posterior cortical atrophy</a>, a rare form of Alzheimer’s disease, he became a powerful advocate for dementia research and awareness. Less well known is that the early effects of the disease may already have been present in his writing long before he knew he was ill.</p>
<p>Dementia is often described as a condition of memory loss, but this is only part of the story. In its earliest stages, dementia can affect attention, perception and language before memory problems become obvious. These early changes are difficult to detect because they are gradual and easily mistaken for stress, ageing or normal variation in behaviour.</p>
<p>Language, however, offers a unique window into cognitive change. The words we choose, the variety of our vocabulary and the way we structure description are tightly linked to brain function. Even small shifts in language use may reflect underlying neurological change.</p>
<p>In our recent study, we analysed the language used across Terry Pratchett’s Discworld novels, examining how his writing evolved over time. We focused on “lexical diversity” — a measure of how varied an author’s word choices are — and paid particular attention to adjectives, the descriptive words that give prose its texture, colour and emotional depth.</p>
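<p>The article does not specify the authors’ tooling, but lexical diversity is commonly operationalized as a type–token ratio: the number of distinct word forms divided by the total number of words. A minimal sketch of that idea, with made-up example sentences (the study itself analysed adjectives specifically, which additionally requires part-of-speech tagging, e.g. with spaCy):</p>

```python
import re

def type_token_ratio(text: str) -> float:
    """Lexical diversity as distinct words / total words.

    A crude proxy for the study's measure; the authors focused on
    adjective diversity, which needs a part-of-speech tagger.
    """
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    return len(set(tokens)) / len(tokens)

# Hypothetical "early" vs "late" prose: repeated words lower the ratio.
early = "the turtle moves slowly across a vast, star-flecked, luminous sky"
late = "the turtle moves slowly across the big sky and the big sea"
print(round(type_token_ratio(early), 2))  # 1.0
print(round(type_token_ratio(late), 2))   # 0.75
```

<p>One caveat worth noting: the raw type–token ratio is sensitive to text length, so corpus studies typically use length-corrected variants (such as MTLD or moving-average TTR) when comparing whole novels.</p>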
<p>Across Pratchett’s later novels, there was a clear and statistically significant decline in the diversity of adjectives he used. The richness of descriptive language gradually narrowed. This was not something a reader would necessarily notice, nor did it reflect a sudden deterioration in quality. Instead, it was a subtle, progressive change detectable only through detailed linguistic analysis.</p>
<p>Crucially, the first significant drop appeared in <em>The Last Continent</em>, published almost ten years before Pratchett received his formal diagnosis. This suggests that the “preclinical phase” of dementia — the period during which disease-related changes are already occurring in the brain — may have begun many years earlier, without obvious outward symptoms.</p>
<p>This finding has implications that extend far beyond literary analysis. Dementia is known to have a long preclinical phase, during which opportunities for early intervention are greatest. Yet identifying people during this window remains one of the biggest challenges in dementia care.</p>
<p>Linguistic analysis is not a diagnostic tool in itself, and it would not work equally well for everyone. Factors such as education, profession, writing habits and linguistic background all influence how people use language. But as part of a broader approach — alongside cognitive tests, brain imaging and biological markers — language analysis could help detect early risk in a non-invasive and cost-effective way.</p>
<p>Importantly, language data already exists. People generate vast amounts of written material through emails, reports, messages and online communication. With appropriate safeguards for privacy and consent, subtle changes in writing style could one day help flag early cognitive decline long before daily functioning is affected.</p>
<h2>Why early detection matters</h2>
<p>Early detection matters more than ever. In recent years, new drugs for Alzheimer’s disease have emerged that aim to slow disease progression rather than simply manage symptoms. Drugs such as <a href="https://www.nejm.org/doi/full/10.1056/NEJMoa2212948">lecanemab</a> and <a href="https://jamanetwork.com/journals/jama/fullarticle/2807533">donanemab</a> target amyloid proteins that accumulate in the brain and are thought to play an important role in the disease. Clinical trials suggest these treatments would be most effective when given early, before significant neuronal damage has occurred.</p>
<p>Identifying people during the preclinical phase would give them and their families more time to plan, access support and consider interventions that may help slow progression. These may include lifestyle changes, cognitive stimulation and, increasingly, disease-slowing drugs.</p>
<p>More than a decade after his death, Terry Pratchett continues to contribute to our understanding of dementia. His novels remain deeply loved, but hidden within them is another legacy: evidence that dementia may leave its mark long before it announces itself. Paying closer attention to language — even language we think we know well — could help transform how we detect, understand and ultimately treat this devastating condition.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img decoding="async" src="https://counter.theconversation.com/content/273777/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1"><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/terry-pratchetts-novels-may-have-held-clues-to-his-dementia-a-decade-before-diagnosis-our-new-study-suggests-273777">original article</a>.</em></p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/women-who-are-open-to-sugar-arrangements-tend-to-show-deeper-psychological-vulnerabilities/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Women who are open to “sugar arrangements” tend to show deeper psychological vulnerabilities</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 14th 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in the <em>Archives of Sexual Behavior</em> suggests that young women who are open to “sugar relationships” tend to experience deeper psychological vulnerabilities, such as difficulties with emotional coping and relationship skills. The research provides evidence that an acceptance of trading intimacy for material benefits is often linked to negative childhood experiences that shape how a person views themselves and others.</p>
<p>Sugar relationships involve an arrangement where companionship or sexual intimacy is exchanged for resources like money or gifts. Public discussions about these arrangements tend to focus heavily on the financial or ethical aspects of the exchange. The authors of the new study wanted to look beyond the surface to understand the underlying emotional and cognitive patterns that make someone receptive to this type of dating.</p>
<p>“Research on sugar relationships and other forms of sexual–economic exchange has grown rapidly in recent years. Many studies have reported that women involved in these relationships tend to show higher levels of emotional insecurity, relational difficulties, or vulnerabilities in personality functioning,” said study author <a href="https://www.meskonorbert.com/" target="_blank" rel="noopener">Norbert Meskó</a>, a professor at the University of Pécs.</p>
<p>“But an important question has remained largely unanswered: are these psychological characteristics consequences of these relationships, or could they already be present beforehand? Our study approached the issue from a different angle. Instead of focusing only on women who are already involved in sugar relationships, we examined women’s openness to such relationships.”</p>
<p>“We tested a model suggesting that openness to sugar relationships might be associated with early relational experiences, emotion regulation patterns, and personality functioning. In other words, we wanted to explore whether some of the psychological patterns observed in previous studies might partly precede involvement in these relationships rather than simply result from them.”</p>
<p>The researchers gathered data from 500 young Hungarian women between the ages of 18 and 35. This group was specifically chosen to represent the broader population of Hungary in terms of education level, geographic region, and the type of community where they lived. Participants completed a series of validated online questionnaires in December 2024.</p>
<p>The surveys measured the participants’ general openness to sugar relationships, rather than whether they had actually engaged in one. The scientists chose this approach to capture a broad psychological attitude that exists independently of actual behavior. This method helps reveal the mental mechanisms that shape receptivity to transactional intimacy in the general population.</p>
<p>The scientists also assessed the presence of early maladaptive schemas in the participants. These schemas are deeply ingrained negative beliefs about oneself and the world, which typically develop in childhood when basic emotional needs are neglected. People with these schemas often harbor intense fears of abandonment, emotional deprivation, or social rejection.</p>
<p>Additionally, the researchers measured personality functioning. This concept refers to an individual’s capacity to maintain a clear sense of identity and build stable, mutually supportive relationships. A person with impaired personality functioning might struggle with setting goals, feeling empathy, or tolerating emotional closeness.</p>
<p>Finally, the surveys evaluated cognitive emotion regulation, which refers to the specific mental habits people use to handle stress and negative feelings. Some strategies are adaptive, like looking for solutions or finding a positive perspective. Other strategies are maladaptive, such as obsessing over a problem, expecting the absolute worst, or constantly blaming oneself.</p>
<p>The data revealed that women who reported higher openness to sugar relationships tended to show greater impairments in their general personality functioning. They also relied more heavily on unhelpful emotion regulation strategies to manage their distress. Healthy emotional coping strategies showed no link to an acceptance of sugar dating at all.</p>
<p>The researchers found that early maladaptive schemas indirectly influenced attitudes toward sugar relationships. Women with stronger negative childhood schemas were more likely to struggle with self-identity and emotional regulation as adults. These present-day struggles then predicted a greater willingness to consider transactional dating arrangements.</p>
<p>“What was particularly interesting was how consistently different psychological domains—early relational experiences, emotion regulation, and personality functioning—contributed to the same model,” Meskó told PsyPost. “While earlier studies had examined some of these variables separately, seeing them operate together within a single statistical framework strengthened the interpretation that openness to sugar relationships may reflect a broader psychological pattern rather than a single isolated factor.”</p>
<p>The researchers noted that individuals who lack effective emotional coping strategies are often more likely to adopt external behaviors to manage their internal distress. Because cognitive distortions rooted in early childhood amplify emotional instability, these individuals might prioritize short-term relief or a sense of control. In this context, the financial or material rewards of a sugar relationship might serve as a coping mechanism.</p>
<p>This pattern suggests that sugar relationships might appeal to some individuals because the clear, negotiated boundaries offer a sense of safety. For people who find deep emotional intimacy confusing or overwhelming, an exchange-based arrangement might feel easier to manage. The structured nature of transactional dating may provide an alternative way to experience connection without the emotional risks of a traditional romantic partnership.</p>
<p>“One takeaway is that attitudes toward unconventional relationship forms may reflect broader psychological and developmental experiences,” Meskó explained. “Our findings suggest that openness to sugar relationships is associated with earlier relational experiences, differences in how people manage emotions, and aspects of personality functioning.”</p>
<p>“These factors do not determine relationship choices, but they may influence how individuals perceive and evaluate different types of relationships. Human relationship decisions rarely emerge in isolation. They are often shaped by a person’s life history, emotional patterns, and social environment.”</p>
<p>A primary limitation of this research is its correlational design. The statistical links observed in the data show that these psychological factors are related to openness to sugar relationships, but they do not prove that one causes the other. Future work is required to establish a direct chain of cause and effect.</p>
<p>“Another potential misunderstanding would be to interpret the findings as applying to every individual,” Meskó noted. “Relationship choices are highly diverse, and people enter sugar relationships for many different reasons. Our results describe general tendencies in a sample, not deterministic pathways for individuals.”</p>
<p>“The effects we observed were moderate in size, which is typical in psychological research involving complex social behavior. Relationship attitudes are influenced by many interacting factors—cultural norms, economic conditions, personal values, and psychological characteristics.”</p>
<p>“Rather than identifying a single decisive predictor, our findings highlight a pattern of associations. The practical significance lies in improving our understanding of the psychological background of sexual–economic relationships, not in predicting individual behavior with certainty.”</p>
<p>Looking ahead, the scientists plan to conduct longitudinal studies to track individuals over an extended period. Observing people over the years could help verify whether these psychological patterns truly precede an interest in transactional dating. They also hope to pursue cross-cultural research to see how different social norms and economic conditions shape attitudes toward exchanging intimacy for material gain.</p>
<p>“For about six years now, our research group has been studying psychological openness to sexual–economic exchange,” Meskó explained. “One of our broader goals is to develop a more comprehensive psychological framework explaining why some individuals are more open to these types of relationships than others.”</p>
<p>“Public discussions about sugar relationships are often framed primarily in moral, cultural, or economic terms. Psychological research can add another important perspective by examining the developmental and emotional factors that may shape relationship preferences.”</p>
<p>“Understanding these processes does not mean endorsing or condemning particular relationship forms,” Meskó concluded. “Rather, it allows us to approach complex human behaviors with greater nuance. As our findings suggest, openness to certain relationship forms may sometimes reflect deeper psychological and life-history factors that deserve careful and empathetic examination.”</p>
<p>The study, “Openness to ‘Sugar Relationships’ Reflects Personality and Emotional Vulnerabilities in a Representative Sample of Hungarian Women,” was authored by Norbert Meskó, Béla Birkás, and András N. Zsidó.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/ashwagandha-shows-promise-as-a-treatment-for-depression-in-new-rat-study/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Ashwagandha shows promise as a treatment for depression in new rat study</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 13th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Adolescent rats exposed to chronic stress showed fewer signs of depression after receiving the herbal supplement Ashwagandha, according to new research published in <em><a href="https://doi.org/10.1007/s00213-025-06844-5" target="_blank">Psychopharmacology</a></em>. The study found that the herb not only eased behavioural symptoms but also reduced inflammation and cell damage in the brain—effects that in some cases surpassed those of the antidepressant sertraline.</p>
<p>Depression during adolescence is becoming increasingly common, yet treatment options remain limited and often come with side effects. Scientists have long known that chronic stress can disrupt brain function by increasing inflammation, damaging nerve cells, and lowering levels of proteins that support healthy brain activity. These biological changes are thought to contribute to the emotional and cognitive symptoms seen in depression.</p>
<p>Ashwagandha, a plant widely used in traditional Indian medicine, has gained attention for its stress-relieving properties. Previous studies in adults and animal models suggest it may help regulate the body’s stress response and protect brain cells. However, until now, no research had examined whether Ashwagandha could help younger individuals, whose developing brains may respond differently to both stress and treatment.</p>
<p>Led by Gul Sahika Gokdemir from Mardin Artuklu University, Turkey, researchers worked with 28 adolescent male rats. Most of the rats were exposed to a 17-day protocol of unpredictable mild stressors—such as wet bedding, overnight food deprivation, and brief restraint—designed to mimic the unpredictable pressures that can contribute to depression. </p>
<p>To properly assess the treatments, some of these stressed rats were left untreated to serve as a depressed baseline, while others received daily oral doses of either sertraline, a commonly prescribed antidepressant, or Ashwagandha. A control group experienced no stress at all.</p>
<p>The team then assessed the animals using standard behavioural tests. Untreated stressed rats showed classic signs of depression: they drank less sweetened water (a measure of reduced pleasure), spent more time immobile during a forced swim test (a sign of despair-like behaviour), and displayed more anxiety-like responses in a maze test. Both sertraline and Ashwagandha improved the treated rats’ performance on the pleasure and despair measures, suggesting strong antidepressant-like effects. Anxiety levels improved slightly but not significantly.</p>
<p>Beyond behaviour, the researchers examined the rats’ brain tissue. Stress sharply increased levels of inflammatory molecules (like TNF-α) and proteins associated with cell death (Bax and Caspase-3). It also reduced levels of BDNF, a protein essential for healthy brain function, and lowered the number of supportive glial cells, known as astrocytes.</p>
<p>Ashwagandha reduced the cell-death markers and inflammation much more effectively than sertraline, returning them to levels similar to the healthy controls, and successfully restored glial cell levels. However, the researchers noted that while Ashwagandha profoundly protected against cell death, its ability to restore the depleted BDNF levels was only borderline. Additionally, the herb prevented the weight loss typically seen in stressed animals, a protective effect that sertraline did not match.</p>
<p>Microscopic examination of the brain revealed that stressed rats showed swelling and structural disruption in the fronto-parietal cortex.</p>
<p>“We focused on the fronto-parietal cortex due to its role in cognitive and emotional processes. This region is involved in attention regulation, decision-making, and emotional control processes and is frequently impaired during depression,” Gokdemir and team noted.</p>
<p>These abnormalities were noticeably reduced in the Ashwagandha-treated group, whose brain tissue more closely resembled that of the healthy controls.</p>
<p>While the findings are promising, the authors caution that the study was conducted only in male adolescent rats, and results may differ in females or humans due to hormonal differences. They also note that the brain region examined was broad, and future work should look at specific subregions to better understand exactly how Ashwagandha exerts its neuroprotective effects.</p>
<p>The study, “<a href="https://doi.org/10.1007/s00213-025-06844-5" target="_blank">Antidepressant-like effects of Ashwagandha (Withania somnifera) on chronic unpredictable mild stress-induced depression in adolescent rats</a>,” was authored by Gul Sahika Gokdemir, Ugur Seker, Nazan Baksi, Mukadder Baylan, Berjan Demirtaş, and Mehmet Tahir Gokdemir.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/early-exposure-to-a-high-fat-diet-alters-how-the-adult-brain-reacts-to-junk-food/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Early exposure to a high-fat diet alters how the adult brain reacts to junk food</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 13th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Exposure to a diet heavy in fats and sugars during early development primes the brain to overreact to unhealthy foods in adulthood. This combination leads to high levels of inflammation and reduced adaptability within the brain’s main memory center. These molecular changes suggest that early nutritional environments have long-lasting effects on cognitive health, according to a recent study published in <em><a href="https://doi.org/10.1080/1028415x.2025.2600516" target="_blank">Nutritional Neuroscience</a></em>.</p>
<p>The physical makeup of the brain is not set in stone at birth. It constantly changes and adapts in response to life experiences. This feature of the nervous system is known as neural plasticity.</p>
<p>Neural plasticity allows humans and animals to form new memories, learn new skills, and recover from physical injuries. To function properly, the brain relies on specific proteins that act as fertilizer for neural connections.</p>
<p>One of these vital proteins is a growth factor that helps neurons survive and communicate. When the brain is healthy, these growth factors bind to specific receptors on the outside of brain cells. This continuous chemical dialogue allows the nervous system to adapt to new environments.</p>
<p>This binding process triggers a cascade of signals that lock in new memories. If this signaling system breaks down, the brain loses its ability to maintain healthy synapses.</p>
<p>Synapses are the tiny gaps where neurons pass chemical messages to one another. Diet plays a major role in maintaining this delicate cellular environment over a lifetime. Foods consumed on a daily basis provide the raw materials for these chemical exchanges.</p>
<p>Diets high in fats and sugars, often called Western diets, can trigger an immune response in the body and the brain. When immune cells in the brain detect tissue stress from poor nutrition, they release inflammatory messenger proteins.</p>
<p>Chronic inflammation in the nervous system is a known driver of brain aging. It can also lead to neurodegeneration, a condition where brain cells slowly lose function and die.</p>
<p>Biologists suspect that poor nutrition during pregnancy and nursing might permanently alter how a newborn regulates these inflammatory and growth pathways. This underlying concept is called metabolic programming.</p>
<p>Metabolic programming means that the nutritional environment a fetus experiences can rewrite its biological software. This sets a physiological baseline for how its body will react to food later in life.</p>
<p>Researchers wanted to know exactly how early exposure to a Western diet shapes the brain’s long-term vulnerability to future dietary insults. They designed an experiment to see if a brief taste of junk food in adulthood would trigger different molecular reactions in animals exposed to poor diets before birth.</p>
<p>The research team was led by Rhowena J.B. Matos, a postdoctoral scholar at the Federal University of Bahia and a professor at the Federal University of Recôncavo of Bahia. Matos collaborated with a team of researchers specializing in physical education, physiotherapy, and nutrition.</p>
<p>To answer their research question, the team studied albino laboratory rats. They divided pregnant and nursing rats into two distinct groups to control their nutritional intake. This animal model allows scientists to track biological changes across an entire lifespan in a compressed timeframe.</p>
<p>One group received a standard, balanced laboratory diet. The other group was fed a custom Western diet designed to mimic typical human junk food. This Western diet was heavily modified with ingredients from the local market. It included lard, butter, sugar, and barbecue-flavored fries to increase the fat and carbohydrate content.</p>
<p>Once the rat pups were weaned, they were all placed on the exact same standard, healthy diet. They ate this balanced food for over six months, reaching full adulthood without any further exposure to high-fat foods.</p>
<p>When the rats reached 195 days of age, the researchers introduced a new variable. They took half of the rats from the standard early-life diet and half from the Western early-life diet to undergo a new feeding protocol.</p>
<p>These selected adult rats were allowed to eat the Western diet for just two hours a day for five consecutive days. The remaining rats continued their standard healthy diet without any interruption.</p>
<p>At the end of the five days, the scientists examined the animals’ blood and brain tissue. They focused specifically on the hippocampus, a seahorse-shaped structure deep in the brain. The tissue was dissected carefully to avoid unnecessary cellular damage during the extraction process.</p>
<p>The hippocampus is the primary brain region responsible for spatial learning and memory consolidation. The researchers measured gene expression within this specific brain tissue.</p>
<p>Gene expression is the process by which a cell reads the instructions in its DNA to build specific molecules. In this case, researchers looked for the genetic instructions used to build inflammatory markers and structural proteins.</p>
<p>The initial results showed that the early-life diet left a lasting metabolic imprint on the animals. Adult rats whose mothers ate the Western diet had higher blood glucose and total protein levels than the control group. These elevated levels persisted even after months of eating a healthy diet. In the brain, the baseline effects of the early Western diet were somewhat unexpected.</p>
<p>Before any adult dietary stimulation, these rats actually showed lower expression of several inflammatory genes in the hippocampus compared to the control group. The researchers suspect this initial suppression might be a compensatory adaptation.</p>
<p>The developing brain may have turned down its inflammatory pathways to protect itself from the stress of the maternal junk food diet. However, this apparent protection disappeared when the adult rats were briefly exposed to the Western diet again. The underlying vulnerability of the brain was finally exposed by the new environmental stressor.</p>
<p>The short five-day burst of junk food caused a massive spike in inflammatory gene expression in the rats with the early-life Western diet background. The production instructions for two major inflammatory messenger proteins more than doubled in these animals.</p>
<p>In contrast, the rats whose mothers ate a healthy diet did not experience this extreme inflammatory spike after the brief adult junk food exposure. The early-programmed rats also exhibited a sharp increase in blood cholesterol levels following the short junk food exposure.</p>
<p>However, the results for blood triglycerides and albumin levels were not statistically significant. This indicates that the dietary changes targeted specific metabolic pathways rather than causing an across-the-board increase in all measurable blood markers.</p>
<p>The team discovered unusual changes in the genes controlling brain adaptability as well. Following the brief adult exposure to the Western diet, the early-programmed rats showed increased expression of the main neural growth factor.</p>
<p>Yet, the genes responsible for building the receptors that actually receive this growth factor were sharply downregulated. Another gene involved in memory consolidation was also suppressed by about one third. This opposing reaction created a severe bottleneck in the brain’s cellular communication network.</p>
<p>This means that while the brain was trying to pump out more growth factor, the receiving cells were essentially shutting their doors. The signaling pathway required for healthy neural plasticity became fundamentally disconnected.</p>
<p>The researchers think this broken signaling pathway could lead to serious cognitive deficits. They suggest that these molecular changes might limit how the hippocampus builds and retains memories over time.</p>
<p>The authors noted a few limitations to their current experimental design. The study only examined male rats, which means the results cannot account for potential hormonal differences in females.</p>
<p>Estrogen variations in female animals can heavily influence neural plasticity in the hippocampus. Future studies will need to include both sexes to provide a more complete picture of dietary programming.</p>
<p>Additionally, measuring gene expression only reveals the brain’s blueprint for making proteins. It does not measure the final amount of protein that is actually produced and utilized by the cells. The researchers plan to track exact protein levels in future experiments. This step is necessary to confirm that these cellular pathways are entirely disrupted in the physical brain tissue.</p>
<p>The current study also did not measure the animals’ actual cognitive abilities. Upcoming research will need to incorporate practical behavioral tests. Watching how the animals navigate physical challenges will provide a broader view of brain health. Scientists could use water mazes or object recognition tasks to evaluate the animals in real time. These tests would verify if these molecular changes truly cause memory and learning failures in living subjects.</p>
<p>The study, “<a href="https://doi.org/10.1080/1028415x.2025.2600516" target="_blank">Western diet during gestation and lactation alters hippocampal gene expression and responds to acute dietary stimulation in adulthood</a>,” was authored by Rhowena J.B. Matos, Odair J.F. Lima, Juliana S. Ribeiro, Taynara R.L. Silva, Mireia C.M. Conceição, Mírian C.M.M. David, Tercya L.A. Silva, Elizabeth do Nascimento, and Jairza M.B. Medeiros.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/how-sexual-orientation-stereotypes-keep-men-out-of-early-childhood-education/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">How sexual orientation stereotypes keep men out of early childhood education</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 13th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in the <em><a href="https://doi.org/10.1111/jasp.70051" target="_blank" rel="noopener">Journal of Applied Social Psychology</a></em> suggests that young men hold distorted views about the level of interest other men have in early childhood education and care careers. The findings provide evidence that sexual orientation stereotypes and a misunderstanding of peer beliefs continue to reinforce the lack of men in caregiving roles.</p>
<p>“Worldwide, men are extremely underrepresented in early childhood education and care. This is problematic for many reasons, but one of the main issues is that it reinforces the idea of caring as ‘women’s work’ (i.e., something that men cannot, and should not, do) and thus hinders efforts towards gender equality,” said study author <a href="https://www.ost.ch/de/person/serena-haines-11827" target="_blank" rel="noopener">Serena Haines</a>, a postdoctoral researcher at the Competence Centre for Mental Health at the OST—Eastern Switzerland University of Applied Sciences.</p>
<p>“Greater diversity in early childhood educators and carers offers children a broader range of social experiences and modes of learning, which can provide cognitive and emotional benefits. It also shows children from a young age that caring and empathy are not gender-specific qualities.”</p>
<p>“The lack of men working in child care is usually investigated from the perspective of people working in the field; very little research has covered how men perceive the profession from the outside—as a prospective career for themselves or other men. This was the gap our project addressed.”</p>
<p>Specifically, the researchers wanted to explore a concept called pluralistic ignorance. Pluralistic ignorance occurs when individuals misperceive the actual beliefs or norms of their own group. This misunderstanding often causes people to hide their true preferences to conform to a social rule that does not actually exist.</p>
<p>The team designed this study to see if young men misperceive the actual career interests of their peers based on whether those peers are gay or straight. To investigate this, the researchers recruited young men living in the United States between the ages of 18 and 30. After excluding incomplete responses, the final sample consisted of 334 men.</p>
<p>This sample included 174 gay men and 160 straight men. Participants were randomly assigned to answer questions about one of three specific target groups. They were asked to consider either themselves, gay men in general, or straight men in general.</p>
<p>First, participants rated how interested the assigned group would be in working in childcare on a scale from zero to 100. Next, the participants answered two open-ended questions to explore their reasoning. They were asked to list up to five factors that would make the assigned group less interested in childcare work, which the researchers classified as barriers.</p>
<p>Then, they listed up to five factors that would make the group more interested, which were classified as motivators. The scientists then coded these responses into twelve distinct categories for analysis.</p>
<p>The researchers found that both gay and straight men overestimated the interest of gay men in childcare careers. When straight men rated straight men, their estimates matched the actual low interest reported by straight participants. However, gay men significantly overestimated the interest of their own group, demonstrating pluralistic ignorance.</p>
<p>When looking at the open-ended responses, participants listed significantly more barriers than motivators. Across all groups, the most common barriers were practical concerns like low salaries and poor working conditions. The most common motivators were positive interactions with children and the potential for a better salary.</p>
<p>To understand how easily men could think of negatives versus positives, the researchers calculated a difference score for each participant. They subtracted the total number of motivators from the total number of barriers generated. They found that barriers were much easier for participants to call to mind, especially when they were asked to think about straight men as a group.</p>
<p>The data also provides evidence of deep-seated sexual orientation stereotyping. Participants tended to believe that straight men were primarily deterred by gender stereotypes and the desire to appear traditionally masculine. On the other hand, participants believed that gay men were primarily deterred by the fear of negative evaluations, prejudice, and social suspicion.</p>
<p>Similar stereotyping emerged when participants listed motivators. Participants assumed that gay men were motivated by a natural desire to nurture or by difficulties in having biological children. At the same time, participants assumed that straight men were motivated mostly by structural improvements like better pay.</p>
<p>The researchers also noticed distinct patterns of pluralistic ignorance in how men viewed these barriers and motivators. For instance, straight men believed that the need to appear masculine discouraged other straight men much more than it discouraged themselves. Gay men believed that the desire to challenge gender roles motivated other gay men more than it motivated themselves.</p>
<p>Both groups were more likely to cite practical barriers for themselves but assumed stigma and prejudice were bigger barriers for their peers. Originally, the scientists predicted that gay men would report lower personal interest in childcare because of the historical and false cultural association between homosexuality and pedophilia. The self-reported data did not support this hypothesis, as gay men did not list this fear as a personal barrier, though they did assume it was a barrier for other gay men.</p>
<p>The scientists suggest that interventions are needed to diversify the early childhood workforce. Beyond improving working conditions, they recommend challenging the gendered and sexualized meanings attached to caregiving. Men’s career decisions are shaped not only by financial concerns but also by the threat of social judgment.</p>
<p>“Reflecting the reality that men are underrepresented in the field, men in our study were not particularly interested in working in child care,” Haines told PsyPost. “But, the reasons they gave for this lack of interest differed depending on whether they were talking about themselves or other men. Men tended to list practical barriers for themselves (like it being difficult to work with children), but stigma-related barriers for other men (like being treated with suspicion). This suggests that men likely misperceive what other men actually think about working in child care.”</p>
<p>“Men—regardless of their own sexual orientation—tended to think that gay men were more interested in child care work. When listing reasons for why men might be interested in working in childcare, men assumed that gay men would be motivated by being more nurturing and caring, while straight men would do it if it meant they could get better pay or benefits. This suggests that sexual orientation stereotypes influence how men think about other men’s interest in childcare work.”</p>
<p>“Overall, our findings suggest that stereotypes and inaccurate beliefs about social norms shape how men think about themselves and other men working in child care.”</p>
<p>While the study offers detailed insights, the researchers note a few limitations to keep in mind. The sample included only young men from the United States, meaning the results might not apply to men in different cultural environments. Cultural contexts heavily influence how masculinity and career norms are perceived.</p>
<p>Additionally, the researchers relied on single-item measures and open-ended questions to capture complex attitudes. Future research could use more extensive surveys with multiple items to measure interest and barriers more precisely. It might also be useful to have participants rank the importance of each barrier rather than just listing them.</p>
<p>Moving forward, the scientists suggest investigating these themes in younger populations, such as high school students who are just beginning to explore career options. They also plan to study how new ideas of masculinity might help young men transition into adulthood.</p>
<p>“I’m currently working on <a href="https://www.ost.ch/en/details/projects/masc-sg-masculinities-and-systems-of-care-in-st-gallen-new69709c0c52daa957116636" target="_blank" rel="noopener">a project with young men in the east of Switzerland</a> to better understand what masculinity means to them, and to what extent caring masculinities—masculinities that deemphasize dominance and aggression and emphasize caring—factor into their relationships with others,” Haines said. “The goal is to build up a picture of how young men see themselves and find ways to better support their transition into adulthood.”</p>
<p>The study, “<a href="https://doi.org/10.1111/jasp.70051" target="_blank" rel="noopener">Motivations and Barriers to Men’s Interest in Childcare: The Role of Norm Perception and Sexual Orientation Stereotyping</a>,” was authored by Serena Haines, Peter Hegarty, Christa Nater, Sylvie Graf, and Sabine Sczesny.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/your-personality-and-upbringing-predict-if-you-will-lean-toward-science-or-faith/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Your personality and upbringing predict if you will lean toward science or faith</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 13th 2026, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>When navigating life’s biggest questions, adults often lean on scientific evidence, religious faith, or a mixture of both to understand the world around them. A survey of American adults reveals that specific childhood experiences and distinct personality traits predict whether a person will eventually view reality through a scientific lens or a religious one. These results, published in the journal <em><a href="https://doi.org/10.1080/2153599X.2024.2363768" target="_blank">Religion, Brain & Behavior</a></em>, help explain how early household environments shape our lifelong philosophical frameworks.</p>
<p>Humans possess a deep psychological need to comprehend themselves and their place in the universe. To achieve this, people construct worldviews, which the researchers describe as “assumptions about physical and social reality that may have powerful effects on cognition and behavior.” Religion and science serve as two primary methods for making sense of everyday life.</p>
<p>Sense-making is the psychological process of giving meaning to life experiences. When someone experiences a major life event, they need a mental framework to process why it happened. Religion provides a metaphysical lens rooted in faith, spirituality, and belief in the divine to address moral values and existential mysteries.</p>
<p>Science offers a different approach, using systematic observation, logical reasoning, and physical evidence to explain the natural environment. While the two systems use distinct methods, both give people a way to interpret the events happening around them.</p>
<p>Researchers wanted to understand exactly how background factors guide people toward these two different frameworks. Previous psychological studies have focused heavily on general religious behaviors, such as how often someone attends a worship service. Few researchers have explored religion and science as parallel, overarching lenses for making sense of reality.</p>
<p>To fill this gap, a team of psychologists at the University of Connecticut designed an investigation to look at the developmental roots of these worldviews. Crystal L. Park led the study alongside her colleagues Adam B. David, Jeffrey D. Burke, and Lisa Annunziato. They suspected that both the environment a person grows up in and their inherent personality traits play a role in shaping their ultimate worldview.</p>
<p>Park and her team recruited 300 adults from across the United States. The researchers used an online platform to ensure the participants represented a diverse cross-section of the country regarding age, gender, and race. Participants completed a survey that took about half an hour to finish.</p>
<p>The survey asked respondents to reflect on their childhood environments and the behaviors of the people who raised them. It included the Science and Faith Mindsets Scale, which asks people how much they agree with statements about trusting science or a deity to solve humanity’s major problems. The researchers then used mathematical models to look for patterns linking background factors to the participants’ current reliance on religion and science.</p>
<p>The survey also included standardized psychological tests to evaluate the participants’ current personality traits. The tests measured a framework known as the Big Five personality traits. These traits include extraversion, agreeableness, openness to experience, conscientiousness, and emotional stability.</p>
<p>The researchers also measured a person’s preference for authoritarianism, which favors strict obedience to authority at the expense of personal freedom. They evaluated critical thinking skills as well. This assessment involved solving logic puzzles where the intuitive answer is usually wrong, requiring the participant to pause and reflect.</p>
<p>The results showed that childhood exposure to specific behaviors shaped adult beliefs in profound ways. When parents or caregivers actively demonstrated their faith through tangible actions, their children grew up to rely heavily on religion to make sense of the world. Psychologists call these actions credibility-enhancing displays.</p>
<p>Credibility-enhancing displays include visible commitments like doing religious charity work or attending volunteer events. Merely growing up in a household that claimed to value a religious tradition did not predict an adult’s reliance on religion. The tangible religious actions of the caregivers were the deciding factor in whether a child adopted a lifelong religious lens.</p>
<p>At the same time, high levels of these visible religious behaviors during childhood predicted a lower reliance on science in adulthood. A different pattern emerged for scientific worldviews. When caregivers provided direct opportunities for science learning during childhood, those children grew up to rely more on science.</p>
<p>These childhood science opportunities included activities like taking kids to museums or showing interest in their scientific questions. Encouraging science learning during childhood did not reduce a person’s adult reliance on religion. This detail suggests that a strong scientific worldview requires deliberate childhood engagement with science, but it does not crowd out religious faith.</p>
<p>Parents who want to encourage scientific curiosity need not worry that museum trips will erode their child’s religious worldview. The two frameworks are not mutually exclusive, and many people integrate both into their daily lives. Even though people can hold both views, the researchers found an inverse relationship between the two frameworks in their overall sample.</p>
<p>Higher reliance on one system was generally associated with a lower reliance on the other. This suggests that while individuals can blend these perspectives, they usually lean toward one preferred method of sense-making. Beyond childhood environments, the researchers found that innate personality traits influence worldviews.</p>
<p>Agreeableness, which describes a person who is cooperative, empathetic, and considerate of others, predicted a stronger reliance on both religion and science. This was the only personality trait that pushed people toward both frameworks at the same time. The researchers suspect that making sense of the world through a specified system of beliefs appeals to people who prioritize group harmony.</p>
<p>Highly agreeable people might naturally gravitate toward structured belief systems that emphasize collective understanding. This focus on the needs of others fits well with both organized religion and the collaborative nature of science. Other personality traits predicted reliance on just one of the two worldviews.</p>
<p>Authoritarianism predicted a strong reliance on religion, aligning with previous studies linking traditional obedience to religious fundamentalism. In contrast, openness to new experiences predicted a stronger reliance on a scientific worldview. This aligns with past work showing that openness correlates with the deliberate cognitive processes necessary for scientific thinking.</p>
<p>The researchers also uncovered some unexpected relationships between personality and scientific thinking. People with lower levels of extraversion tended to rely more on science to make sense of the world. Lower levels of emotional stability also predicted a higher reliance on science.</p>
<p>The researchers measured emotional stability as the relative absence of chronic negative emotions, such as feeling nervous, sad, tense, or irritable. They had not anticipated this result linking lower emotional stability with reliance on science, as previous literature generally associates scientific achievement with different emotional profiles.</p>
<p>The critical thinking tests did not yield any firm conclusions. The results from the logic puzzles were not statistically significant for predicting either worldview. Demographics played a small role as well, with older individuals and women being slightly more likely to rely on religion.</p>
<p>Like all scientific investigations, this project has limitations that provide context for the results. The study relied on a single survey taken at one point in time, meaning the researchers cannot definitively prove that childhood experiences cause specific adult worldviews. The data only shows a mathematical association between these different factors.</p>
<p>Additionally, the participants had to rely on their memories to report their childhood experiences. Adult memories can be flawed, and a person’s current belief system might color how they remember their parents’ actions from decades ago. A person who currently dislikes religion might selectively remember their caregivers’ religious behaviors differently than someone who embraces their faith.</p>
<p>The study also did not ask participants to specify their exact religious denominations. Future research could explore whether these patterns hold true across specific faith traditions, such as Christianity, Islam, or Judaism. Investigating these distinct groups might reveal subtle differences in how credibility-enhancing displays influence children in different cultural contexts.</p>
<p>Psychologists will need to conduct longitudinal studies to track people over many years, starting in childhood and continuing into adulthood. Tracking children over time would eliminate the problem of faulty memory and provide a clearer picture of how worldviews evolve. Expanding this line of inquiry will help experts understand how early environments shape the way we navigate the universe.</p>
<p>The study, “<a href="https://doi.org/10.1080/2153599X.2024.2363768" target="_blank">Childhood experiences and personal traits as predictors of reliance on science and on religion to make sense of the world: results of a national US study</a>,” was authored by Crystal L. Park, Adam B. David, Jeffrey D. Burke, and Lisa Annunziato.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href='https://blogtrottr.com/unsubscribe/565/DY9DKf'>unsubscribe from this feed</a></small></p>