<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-map-the-hidden-architecture-of-the-brains-default-mode-network/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists map the hidden architecture of the brain’s default mode network</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 5th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://www.nature.com/articles/s41593-024-01868-0" target="_blank" rel="noopener">Nature Neuroscience</a></em> sheds light on the structural foundations of the brain’s default mode network, a system of regions long associated with internally focused thought, memory, and self-reflection. Using postmortem brain tissue and advanced neuroimaging, researchers found that this network is composed of distinct anatomical types of brain tissue, each with different roles in processing information. The findings help explain why the default mode network is involved in such a wide variety of mental states—from introspection to decision-making—and suggest that its structure enables a unique balance of communication across the brain.</p>
<p>The default mode network, or DMN, is one of the most widely studied yet poorly understood brain systems in neuroscience. It was originally discovered through brain scans showing that certain regions become less active when a person focuses on an external task, like solving a math problem. But over time, researchers noticed these same regions were also active during a wide range of cognitive activities, such as daydreaming, remembering the past, imagining the future, and even making complex decisions. This unexpected versatility raised fundamental questions about what the DMN does and how it manages to participate in such seemingly contradictory mental functions.</p>
<p>One key to solving this puzzle, the researchers hypothesized, lies in the network’s anatomy. Much of the past research on the DMN has used functional MRI to track patterns of activity, but less attention has been paid to the underlying microstructure that might shape those functions. The authors of the new study believed that a deeper understanding of the DMN’s cellular and anatomical features could clarify how it supports such a diverse array of mental processes.</p>
<p>“The default mode network has a fascinating history in neuroscience. It was first identified as a group of brain regions that become less active when people engage in a specific task,” explained study author Casey Paquola, the head of <a href="https://multiscale-neurodevelopment.github.io/" target="_blank" rel="noopener">the Multiscale Neurodevelopment Lab</a> at the Institute of Neuroscience and Medicine (INM-7) at the Helmholtz Association’s Research Center Jülich.</p>
<p>“But over time, researchers noticed that these same regions were actually activated during a wide variety of tasks — from recognizing faces to making decisions. This led to a lot of diverse theories about what the DMN is and what its role is in cognition. So, despite being discussed in tens of thousands of studies since 2001, it remained enigmatic.”</p>
<p>“Notably, the vast majority of those studies used fMRI to study the DMN. As I work at the intersection of neuroanatomy and neuroimaging, I thought it would be a novel angle to investigate this functional entity through the lens of neuroanatomy. We wanted to test whether the hypotheses based on functional MRI were supported or rejected based on the architecture of the DMN.”</p>
<p>To do this, the research team combined two powerful tools: postmortem brain histology and in vivo neuroimaging. The histological data came from a high-resolution 3D reconstruction of a human brain donated after death, allowing for precise mapping of cell densities and tissue types across the cortex. The team analyzed nearly 7,400 thinly sliced brain sections, stained to reveal the shapes and layers of cells, and reconstructed them into a 3D model known as “BigBrain.” These anatomical maps were then compared with functional brain networks defined by resting-state MRI scans in living participants.</p>
<p>The researchers focused on how different parts of the DMN varied in their cytoarchitecture—the arrangement and characteristics of cells across different layers of the cortex. They identified several types of cortical microstructure within the DMN, ranging from areas specialized for processing sensory information to regions associated with memory and internal thought. This showed that the DMN is not uniform but rather includes a mix of cell types, each potentially suited for different tasks.</p>
<p>Using data-driven modeling, the team found a spectrum—or axis—of cytoarchitectural variation across the DMN. Some regions had highly layered structures with dense mid-level cell populations, while others had flatter, less differentiated profiles. These differences aligned with known anatomical types, such as eulaminate regions that process external information and agranular areas often linked to self-generated thought and emotion.</p>
<p>The researchers then examined how these anatomical differences relate to the DMN’s connectivity. By analyzing diffusion MRI data from healthy adults, they assessed how efficiently information could travel along structural pathways linking different brain regions. They found that parts of the DMN with highly layered cytoarchitecture were more strongly connected to other brain regions, particularly those involved in sensory processing. These regions appeared to serve as “receivers,” efficiently gathering input from across the brain.</p>
<p>In contrast, areas with flatter cytoarchitecture were relatively isolated from sensory input, suggesting that they form an “insulated core” within the DMN. These more internally focused regions, like the anterior cingulate cortex, may support self-referential processing or maintain mental representations that are less dependent on immediate sensory information.</p>
<p>To further understand the flow of information within the DMN, the researchers used a modeling technique called regression dynamic causal modeling. This method estimates how activity in one brain region influences another, providing a measure of functional input and output. The results confirmed that “receiver” regions in the DMN received strong input from a variety of other brain systems, while the more insulated areas were relatively unaffected by outside signals.</p>
<p>Perhaps most strikingly, the researchers found that the DMN differs from other brain networks in how it sends information back out. While many networks tend to favor certain types of connections, the DMN distributed its output evenly across all levels of the brain’s processing hierarchy—from low-level sensory areas to high-level association regions. This suggests that the DMN plays a unique integrative role, capable of influencing thought and behavior at multiple levels.</p>
<p>To verify that these patterns weren’t just artifacts of group-level analyses, the team conducted a replication study using ultra-high field 7-Tesla MRI in individual participants. This allowed them to map microstructural variation and connectivity within each person’s brain. The results were consistent with the earlier findings, supporting the idea that the DMN’s unique structure and connectivity patterns are present at the level of individual brains.</p>
<p>“Given the DMN is largely expanded in humans relative to other mammals, more so than any other functional network, an interesting take-away from this study is that this unique network of brain regions is capable of enriching our interpretation of the world, by coloring our world view with other types of information, from autobiographical memory or social perspectives,” Paquola told PsyPost. “In other words, this network operates in a very different way to standard sensory processing, and that may help humans to have a richer understanding of the world around us.”</p>
<p>But as with all research, there are limitations. The detailed anatomical mapping was based on a single postmortem brain, and while replication was attempted with high-resolution imaging in living subjects, further validation across more diverse samples is needed. The study also focused on healthy adults, leaving open questions about how the DMN might develop during childhood or change in mental illness.</p>
<p>Future research will aim to understand how the DMN’s anatomy evolves over time and how it interacts with cognitive development and psychiatric symptoms. For example, if certain subregions mature more slowly or become disrupted in disorders such as depression or schizophrenia, this could help explain why these conditions involve changes in self-perception or thought patterns.</p>
<p>“I’m very interested in how the brain develops, that is how it changes from infancy to adulthood,” Paquola said. “Moving forward, we’d like to understand more about how the maturation of the DMN coincides with cognitive maturation and changes in mental health.”</p>
<p>The study, “<a href="https://doi.org/10.1038/s41593-024-01868-0" target="_blank" rel="noopener">The architecture of the human default mode network explored through cytoarchitecture, wiring and signal flow</a>,” was authored by Casey Paquola, Margaret Garber, Stefan Frässle, Jessica Royer, Yigu Zhou, Shahin Tavakol, Raul Rodriguez-Cruces, Donna Gift Cabalo, Sofie Valk, Simon B. Eickhoff, Daniel S. Margulies, Alan Evans, Katrin Amunts, Elizabeth Jefferies, Jonathan Smallwood, and Boris C. Bernhardt.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/21-year-old-man-dies-after-jabbing-pencil-into-his-brain-during-psilocybin-trip/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">21-year-old man dies after jabbing pencil into his brain during psilocybin trip</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 5th 2025, 09:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>In a newly reported case, a 21-year-old Hispanic man in Texas died after stabbing himself in both eyes with a pencil during a bad psychedelic experience at home. The disturbing event, published in the <em><a href="https://doi.org/10.1016/j.ajoc.2025.102359" target="_blank">American Journal of Ophthalmology Case Reports</a></em>, provides a rare but sobering account of how the effects of psilocybin—the active compound in “magic mushrooms”—can, under certain conditions, result in tragic outcomes. The patient suffered severe brain damage after a pencil entered his skull through his eye socket, ultimately leading to his death despite intensive surgical care.</p>
<p>While psilocybin is being actively explored for its potential to treat mental health conditions like depression and post-traumatic stress disorder, the report underscores the need for greater awareness of potential psychiatric side effects, particularly when the substance is taken in unsupervised settings.</p>
<p>Psilocybin is a naturally occurring psychedelic compound found in certain species of mushrooms. Once ingested, it is converted into psilocin, which alters perception, mood, and thought patterns by stimulating serotonin receptors in the brain. Although it has a long history of traditional use, psilocybin has reemerged in recent years as a subject of modern medical interest due to its possible benefits in treating mood disorders.</p>
<p>Use of <a href="https://www.psypost.org/psilocybin-use-has-surged-in-the-united-states-since-2019/" target="_blank">psilocybin has surged in the United States</a> since 2019, following changes in public policy and a wave of scientific studies showing promising results. A recent analysis of five national datasets found that lifetime use of psilocybin among American adults rose from 10% in 2019 to over 12% in 2023. Past-year use jumped even more sharply, with a 44% increase among young adults and a 188% increase among those over 30. Despite growing interest in its therapeutic uses, psilocybin remains a federally controlled substance, and adverse events—though rare—can occur, particularly when consumed without medical guidance.</p>
<p>The young man described in the case report had a medical history that included attention-deficit/hyperactivity disorder and hypertension, which he managed with daily medications. He had no known history of depression, self-harm, or serious psychiatric illness. On the day of the incident, he returned home from work and consumed an unknown amount of psilocybin mushrooms.</p>
<p>Soon afterward, his behavior changed drastically. According to family members, he became agitated and paranoid—classic signs of a “bad trip.” Despite attempts by his stepfather and brother to calm him down, he broke a glass vase and began injuring himself. Then, in a horrific act, he placed a pencil upright on his desk and repeatedly drove his head onto it, penetrating both of his eye sockets. The pencil lodged deeply in the left eye and continued into his brainstem.</p>
<p>Emergency responders sedated and intubated him before transporting him to a hospital for advanced care. Initial brain scans showed that the pencil had punctured the pons, a region at the base of the brain vital for breathing and motor control. Imaging also revealed bleeding in the brain, skull fractures, and signs of vascular damage. The right eye had suffered a blowout fracture, while the pencil had traveled through the left orbit and into the brainstem, causing a pontine hemorrhage and disrupting blood flow.</p>
<p>Over the next few days, a team of neurosurgeons and ophthalmologists worked to stabilize the patient and remove the foreign object. They used advanced imaging and surgical techniques, including vascular embolization and careful dissection, to extract the pencil. Despite these efforts, the patient’s condition worsened. He experienced ongoing brain swelling, additional bleeding, and increasing intracranial pressure. Eventually, he lost all brainstem reflexes and was declared brain dead five days after the injury.</p>
<p>While such a violent and tragic outcome is extremely rare, it is not entirely without precedent. Medical literature has documented other cases of self-inflicted injuries during psychotic episodes triggered by hallucinogens. In another recent case report, a 37-year-old man suffering from depression and alcohol abuse <a href="https://www.psypost.org/man-amputates-penis-with-an-axe-after-consuming-psilocybin-mushrooms/" target="_blank">severed his penis with an axe</a> after consuming psilocybin.</p>
<p>Most psilocybin-related adverse outcomes do not lead to death or severe injury. Psilocybin is generally considered to have a low risk of addiction and physical toxicity, but “bad trips” can lead to anxiety, confusion, hallucinations, and psychosis-like symptoms. One <a href="https://doi.org/10.1177/0269881116652578" target="_blank">survey of over 1,900 users</a> found that while the majority of “bad trips” resolved without medical intervention, 2.6% of respondents reported physically aggressive behavior, 2.7% sought emergency care, and 11% acknowledged putting themselves or others at risk during the experience.</p>
<p>Although some might be tempted to draw broad conclusions from this single case, it is important to understand the nature and role of case reports in medicine. Case reports are detailed descriptions of unusual or novel clinical events involving one patient. They do not establish cause and effect or generalize to larger populations. However, they are valuable for identifying potential safety concerns, generating hypotheses for further research, and alerting clinicians to rare but serious risks. In this instance, the report highlights the potential dangers of unsupervised psychedelic use, particularly in individuals with unknown or latent vulnerabilities to psychiatric symptoms.</p>
<p>The patient’s history did not include any known mental illness or previous self-injurious behavior. However, the report notes that his biological father died by suicide, suggesting a possible hereditary vulnerability. This is consistent with existing concerns in the scientific literature that psychedelic substances may pose higher risks for individuals with a personal or family history of psychiatric illness.</p>
<p>More broadly, the case emerges at a time when public and scientific interest in psychedelics is at an all-time high. While early clinical trials have shown that psilocybin may help treat major depression and post-traumatic stress disorder, these results have come from structured settings with professional oversight, careful screening, and psychological support before, during, and after the experience. In contrast, recreational use—or self-medication without clinical guidance—can involve unknown doses, uncontrolled environments, and little preparation or safety planning.</p>
<p>The report, “<a href="https://doi.org/10.1016/j.ajoc.2025.102359" target="_blank">Self-Inflicted Transorbital Intracranial Foreign Body Following Ingestion of Hallucinogenic Psilocybin Mushrooms</a>,” was authored by Abigail M. Blanton, Pooja Parikh, Scott Zhou, Mohamed Mohamed, Rafael L. Ufret-Vincenty, and Ronald Mancini.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/major-study-points-to-evolved-psychology-behind-support-for-strongmen/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Major study points to evolved psychology behind support for strongmen</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 5th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1016/j.evolhumbehav.2025.106674" target="_blank" rel="noopener">Evolution and Human Behavior</a></em> suggests that people around the world are more likely to favor dominant, authoritarian leaders during times of intergroup conflict. Drawing on data from 25 countries, the researchers found consistent evidence that both perceived and actual conflict are linked to increased preferences for leaders with dominant traits. These findings support the idea that humans may be equipped with a psychological system that evolved to prioritize strong leadership when faced with external threats.</p>
<p>The study aimed to explore whether support for dominant leaders is a universal human tendency that becomes stronger in response to conflict. Across history, powerful figures—many with authoritarian traits—have often gained popular support during wartime or periods of social unrest. Yet, research has also shown that voters usually prefer leaders who are warm and competent. This raises the question: why do dominant leaders still rise to power so frequently, even when they may not represent voters’ default preferences?</p>
<p>One explanation is rooted in evolutionary psychology. Human ancestors often faced dangerous intergroup conflicts, such as attacks from rival tribes. In these contexts, following a physically dominant and aggressive leader may have increased group survival. The researchers behind this study, led by <a href="https://www.professormarkvanvugt.com/" target="_blank" rel="noopener">Mark van Vugt</a> (a professor at the Vrije Universiteit Amsterdam) and <a href="https://ps.au.dk/en/contact/staff/show/person/ll@ps.au.dk" target="_blank" rel="noopener">Lasse Laustsen</a> (an associate professor at Aarhus University), proposed that modern humans retain this instinct.</p>
<p>“Both of us have held a strong and long-lived interest in understanding why citizens and followers across societies come to prefer seemingly dominant, authoritarian and strong leaders over the alternatives,” van Vugt and Laustsen told PsyPost.</p>
<p>“Because we are both trained in evolutionary psychology, we both worked on projects trying to answer this question based on evolutionary models of followership and leadership. A common finding across our (and others’) work is that the more followers tend to perceive society as conflict-ridden, the more they turn to dominant, strong and authoritarian leaders.</p>
<p>“Thus, when we met at a workshop co-organized by Christopher von Rueden (University of Richmond) and Mark in 2017, we decided to test the universality of this relationship leveraging our professional networks to collect data across all continents and across as many countries as possible.</p>
<p>“Importantly, if—as we argue in our article and previous work—’the intergroup conflict – dominant leader nexus’ is rooted in evolved psychological systems, then we should expect humans more or less everywhere to display increased preferences for dominant leaders when assigned to conflict situations (compared to no conflict situations).”</p>
<p>The research team conducted a large-scale, multi-country investigation involving 5,008 participants from a diverse set of nations including the United States, China, Kenya, Russia, and Chile. Participants were recruited from student populations, convenience samples, and nationally representative groups, depending on the country. Surveys were conducted online between October 2019 and November 2020.</p>
<p>The researchers designed a set of four tests to examine how intergroup conflict shapes leadership preferences. Participants were randomly assigned to one of three experimental conditions: a war scenario, a peace scenario, or a neutral control condition. Those in the war group were told to imagine their country was under threat, while those in the peace group were asked to imagine a calm and friendly international situation. Participants were then shown pairs of faces, one subtly altered to appear more dominant, and asked which person they would prefer to lead their country.</p>
<p>In the first test, participants in the war condition were significantly more likely to choose the dominant-looking faces as leaders. Across all countries, 54% of participants in the war condition preferred the dominant face, compared to 46% in the control condition and 42% in the peace condition. In other words, perceived conflict increased support for dominant-looking leaders, while peace reduced it. This pattern was consistent in 19 of the 25 countries, suggesting a broad cross-cultural effect.</p>
<p>“We were both surprised to see the high consistency of our experimental results (i.e. that subjects assigned to the war condition displayed stronger preferences for dominant leaders than subjects in the control or peace conditions) across countries,” van Vugt and Laustsen told PsyPost. “Yet, at the same time we were also surprised to see that the countries where this pattern was not supported were Nigeria and Russia. We can only speculate about the reasons why these countries depart from the overall pattern.”</p>
<p>The second test looked at participants’ explicit preferences for leadership traits, including dominance, warmth, and competence. Participants in the war condition were more likely to say they wanted a dominant leader, and less likely to prioritize warmth. Preferences for competence, however, remained stable regardless of the scenario. This supports the idea that conflict specifically increases desire for dominance, not just any leadership trait.</p>
<p>The third test focused on individual differences. People who scored higher on measures of right-wing authoritarianism and social dominance orientation—psychological scales that assess how dangerous or competitive someone believes the world to be—were more likely to prefer dominant leaders. These tendencies remained significant even after controlling for age, gender, income, and education.</p>
<p>The fourth and final test looked at country-level factors. In countries with a history of armed conflict or high military spending, people expressed stronger average preferences for dominant leadership. For example, countries that had participated in more intergroup wars or spent more per capita on their military showed higher support for dominant traits in leaders. Although these correlations were smaller than those found in the other tests, they offered additional support for the theory.</p>
<p>Taken together, the results suggest that humans may have an evolved tendency to favor dominant leaders in response to perceived threat. This “followership psychology” likely developed over millennia when strong, forceful leaders were better able to protect groups from external enemies. Although such instincts may have been useful in small-scale societies, they may not serve us as well in the context of modern nation-states and complex international diplomacy.</p>
<p>“We see our results as important for theoretical reasons, but also for understanding ongoing conflicts around the world and rising preferences for dominant and strong leaders,” van Vugt and Laustsen explained. “Theoretically, our results add yet another piece of evidence that humans reason about leadership and followership based on evolved psychological systems tightly linked to perceptions of conflict. This is important for understanding how humans facing real war (e.g., the ongoing war in Ukraine) or threats of future conflicts and attacks (e.g. the Chinese threat to Taiwan) reason about leadership.”</p>
<p>“In particular, our findings are important if one wishes to de-escalate conflicts. One take on ‘the intergroup conflict – dominant leader nexus’ is that it creates a vicious circle in which dominant leaders are preferred due to rising conflict, but these same dominant leaders are likely to further intensify conflicts through their aggressive and dominant tactics and behaviors. Consequently, understanding that breaking this nexus is going to be hard, as it likely rests on evolved psychological systems and intuitions, constitutes a main take-away from our article.”</p>
<p>While the findings offer strong support for the conflict hypothesis, there are some limitations. The majority of participants were university-educated and recruited online, which could limit generalizability to broader populations. And while the face-based tasks were designed to subtly manipulate perceptions of dominance, there is always a risk that participants guessed the study’s purpose.</p>
<p>“As with most social science survey experiments, readers of our article should keep in mind that results rest on self-reported preferences by participants who may not have any concrete experience with war or conflict (yet, we also want to stress that, given the wide variety of sampled countries, war and conflict experience is probably higher in our study than in most previous work on the topic),” the researchers noted.</p>
<p>“However, in another project based on a sample of Ukrainian individuals in the first months after the Russian invasion, we find that Ukrainians thinking about the ongoing war display stronger preferences for dominant leaders than Ukrainians thinking about a peaceful future (see this link for a short blog-post about the study: <a href="https://www.psychologytoday.com/gb/blog/naturally-selected/202205/war-and-the-preference-strong-leader" target="_blank" rel="noopener">https://www.psychologytoday.com/gb/blog/naturally-selected/202205/war-and-the-preference-strong-leader</a>). That is, the results reported in our current article replicate patterns obtained from individuals facing actual war, providing further credence to the key message that preferences for dominant leaders are tightly connected to perceptions and experiences of intergroup conflict and war.”</p>
<p>The researchers are now exploring several follow-up questions. “We are pursuing different directions in future projects. Some of our work ties impressions of leader dominance to various kinds of behavior and opinion statements from leaders. For instance, one project tests if undemocratic behavior affects impressions of dominance, which—based on preliminary results—seems to be the case.”</p>
<p>“Another project investigates if ‘the intergroup conflict – dominant leader nexus’ is already in place among pre-school children or, if it is not, at what age conflict becomes linked to preferences for dominant leaders. Finally, other projects investigate if other types of contexts have also molded human leader preferences, giving rise to preferences for other character traits in leaders when societies face other kinds of situations and scenarios.”</p>
<p>The authors also noted that the logistical demands of this study were immense, requiring collaboration with researchers across dozens of countries and enduring delays due to the COVID-19 pandemic. Despite the challenges, the team believes the effort was worthwhile. By uncovering a consistent global pattern in how people respond to conflict, the study sheds light on the psychological roots of political behavior and helps explain why strongman leaders continue to appeal to so many—especially in turbulent times.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.evolhumbehav.2025.106674" target="_blank" rel="noopener">Cross-cultural evidence that intergroup conflict heightens preferences for dominant leaders: A 25-country study</a>,” was authored by Lasse Laustsen, Xiaotian Sheng, M. Ghufran Ahmad, Laith Al-Shawaf, Benjamin Banai, Irena Pavela Banai, Michael Barlev, Nicolas Bastardoz, Alexander Bor, Joey T. Cheng, Anna Chmielińska, Alexandra Cook, Kyriaki Fousiani, Zachary H. Garfield, Maliki Ghossainy, Shang E. Ha, Tingting Ji, Benedict C. Jones, Michal Kandrik, Catherine Chiugo Kanu, Douglas T. Kenrick, Tobias L. Kordsmeyer, Cristhian A. Martínez, Honorata Mazepus, Jiaqing O, Ike Ernest Onyishi, Boguslaw Pawlowski, Lars Penke, Michael Bang Petersen, Richard Ronay, Daniel Sznycer, Gonzalo Palomo-Vélez, Christopher R. von Rueden, Israel Waismel-Manor, Adi Wiezel, and Mark van Vugt.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/your-brains-insulation-might-become-emergency-energy-during-a-marathon/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Your brain’s insulation might become emergency energy during a marathon</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 5th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://www.nature.com/articles/s42255-025-01244-7" target="_blank">Nature Metabolism</a></em> suggests that intense endurance exercise, such as running a marathon, may cause a temporary reduction in myelin content in the brain. Using advanced magnetic resonance imaging (MRI) techniques, researchers observed that myelin—a fatty substance that insulates nerve fibers and supports brain function—was significantly reduced shortly after marathon running, but returned to normal levels within two months. These findings point to a previously unrecognized form of brain plasticity, where myelin may serve as an energy reserve under extreme metabolic conditions.</p>
<p>The motivation for the study came from a growing body of evidence showing that myelin, which makes up the white matter in the brain, is not only important for nerve conduction but may also play a role in brain metabolism. Myelin is composed mainly of lipids, and recent animal studies have suggested that in times of energy shortage, the body might tap into these lipid-rich structures to support brain function. The researchers wanted to test whether this could also happen in humans undergoing intense physical exertion, such as marathon runners who often experience significant depletion of their body’s primary energy sources during a race.</p>
<p>“I am a runner and have been studying myelin biology for three decades. In seeking a framework to explore the notion that brain myelin serves as an energy source, I considered the metabolic stress of marathon running as a potential way to test this hypothesis,” explained study author <a href="https://www.neurobiologylab.org/" target="_blank" rel="noopener">Carlos Matute</a>, a professor at the University of the Basque Country and CIBERNED-Instituto Carlos III.</p>
<p>To investigate this possibility, the research team recruited 10 experienced marathon runners between the ages of 45 and 73. These volunteers, who were in good health and did not receive any financial compensation, participated in various marathon events including city and mountain races. Each runner underwent a series of MRI scans: once within 48 hours before their race, again 24 to 48 hours after, and for a subset of participants, at two weeks and two months after completing the marathon.</p>
<p><figure aria-describedby="caption-attachment-227716" class="wp-caption aligncenter"><a href="https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration.jpg"><img fetchpriority="high" decoding="async" class="size-large wp-image-227716" src="https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration-1024x512.jpg" alt="" width="1024" height="512" srcset="https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration-1024x512.jpg 1024w, https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration-300x150.jpg 300w, https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration-768x384.jpg 768w, https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration-360x180.jpg 360w, https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration-750x375.jpg 750w, https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration-1140x570.jpg 1140w, https://www.psypost.org/wp-content/uploads/2025/06/parts-of-a-neuron-illustration.jpg 1500w" sizes="(max-width: 1024px) 100vw, 1024px"></a><figcaption class="wp-caption-text">[Adobe Stock]</figcaption></figure>The researchers used a sophisticated imaging approach called multicomponent relaxometry MRI, which allowed them to generate maps of the brain’s myelin water fraction (MWF). This metric is considered a reliable, though indirect, indicator of how much myelin is present in different brain regions. The technique captures the proportion of water molecules trapped between the layers of myelin sheaths, which reflects myelin’s presence and integrity.</p>
<p>Initial scans taken before the race showed consistent MWF patterns across participants, with normal individual variability. After the race, however, the MRI data revealed a significant reduction in MWF in multiple white matter regions of the brain, particularly in areas linked to motor coordination, sensory processing, and emotional regulation. Some of the most affected regions included the corticospinal tract, pontine crossing tract, and cerebellar peduncles—tracts involved in movement and balance.</p>
<p>The MWF dropped by as much as 28% in certain tracts within two days of completing the marathon. Notably, the reductions were observed in both hemispheres of the brain but did not occur uniformly across all regions. The changes were especially pronounced in highly myelinated white matter areas, while most grey matter regions remained unaffected, likely due to their lower baseline myelin content and the limited sensitivity of the imaging method in those regions.</p>
<p>Importantly, the researchers ruled out several alternative explanations for the MWF changes. Dehydration, for instance, could alter water distribution in the brain and skew imaging results, but no significant changes in brain volume or regional water content were detected. Other potential confounding factors, such as brain swelling (edema), iron level fluctuations, and MRI signal orientation effects, were also considered unlikely contributors based on previous studies and the design of this research.</p>
<p>Two weeks after the race, follow-up scans showed that myelin levels were beginning to rebound but had not yet returned to pre-race levels. By the two-month mark, however, MWF values had fully recovered in all previously affected brain regions. This reversible pattern indicates that the reduction in MWF was not a sign of long-term damage but rather a temporary adaptation—what the authors describe as a form of “metabolic myelin plasticity.”</p>
<p>“The main limitation was technical—I initially believed it would be difficult to detect changes in myelin content,” Matute told PsyPost. “So it was surprising to observe that myelin levels can rapidly decrease within just 3 to 4 hours, the typical duration of a marathon, and then recover within a few weeks. This finding highlights a key point: the brain is more dynamic and plastic than previously thought. Another unexpected discovery was that these changes occur across a wide age range; the runners in our study were between 45 and 73 years old.”</p>
<p>The findings support the “idea that the brain harbors a substantial fat depot—myelin—which could potentially be used to fuel its activity,” Matute said. During a marathon, the body rapidly depletes its carbohydrate stores and turns to fat for energy. The brain may follow a similar strategy, mobilizing lipid reserves stored in the myelin sheath to sustain its function. Animal research has shown that glial cells—the support cells that produce myelin—can metabolize fatty acids through a process known as β-oxidation to generate energy during periods of glucose shortage. This metabolic flexibility may help protect nerve fibers and maintain communication within the brain under extreme physical stress.</p>
<p>Although the observed reduction in myelin was modest and transient, the implications are significant. If myelin lipids can be used as an emergency energy source, this may shed light on how the brain copes with metabolic challenges beyond exercise, including malnutrition or neurological diseases characterized by impaired energy balance. In fact, studies have shown that myelin integrity is affected in conditions like anorexia nervosa and neurodegenerative diseases, raising questions about whether similar metabolic mechanisms are involved.</p>
<p>There are some limitations to the study. The sample size was small, with only 10 participants, and most follow-up data were limited to subsets of this group. The researchers also could not directly measure myelin at the cellular level, as no non-invasive tools currently exist to do so with complete accuracy. MWF, while reliable, is considered a semiquantitative measure and can be influenced by factors unrelated to myelin degradation, such as minor changes in water distribution or tissue composition.</p>
<p>Future research will need to confirm these findings in larger and more diverse populations. The authors also plan to investigate whether the temporary loss of myelin observed after a marathon has any short-term effects on brain function, cognition, or mood. These follow-up studies could help determine whether changes in myelin content translate into measurable differences in how the brain performs after intense physical activity.</p>
<p>“The next steps involve investigating whether changes in brain function and cognition accompany the observed alterations in myelin,” Matute explained. “In the long term, our goal is to uncover the cellular and molecular mechanisms that mediate myelin consumption as an energy source for the brain—and its subsequent recovery. Gaining insight into these questions may aid in the development of therapies for patients with demyelinating diseases such as multiple sclerosis, as well as for addressing the impact of age-related myelin decline on brain function and cognition.”</p>
<p>The study, “<a href="https://doi.org/10.1038/s42255-025-01244-7" target="_blank">Reversible reduction in brain myelin content upon marathon running</a>,” was authored by Pedro Ramos-Cabrer, Alberto Cabrera-Zubizarreta, Daniel Padro, Mario Matute-González, Alfredo Rodríguez-Antigüedad, and Carlos Matute.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/developmental-prosopagnosia-even-mild-face-blindness-can-severely-disrupt-daily-life/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Developmental prosopagnosia: Even ‘mild’ face blindness can severely disrupt daily life</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 4th 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Have you ever been ignored by someone you knew when you bumped into them in the street or at an event? If so, you probably thought they were being rude. But they might have face blindness – a condition officially known as <a href="https://www.nhs.uk/conditions/face-blindness/">developmental prosopagnosia</a>.</p>
<p>In a <a href="https://plos.io/3Y6m9Rg">new study</a> my colleagues and I conducted, 29 adults with face blindness revealed the daily challenges they face. Ten of the participants said they could not reliably recognise immediate family members, and 12 could not recognise their closest friends in out-of-context or unexpected encounters. Yet many felt it was socially difficult to admit these struggles.</p>
<p>One of the participants didn’t recognise her husband of 30 years when he unexpectedly came to pick her up from the airport. Another described how “when I am off work for a week and come back it’s really hard to figure out who is who”.</p>
<p>Although public awareness of face blindness is low, there is a high chance that you already know someone with face recognition difficulties. Around <a href="https://doi.org/10.1080/02643290903343149">one in 50 people</a> have developmental prosopagnosia, a lifelong condition that causes severe face recognition difficulties despite otherwise normal vision, IQ and memory.</p>
<p>Researchers usually describe not being able to recognise close friends and family as a “severe” form of prosopagnosia, but our new study – conducted with a colleague at Dartmouth College in the US – shows that even people classified as having “mild” prosopagnosia can have serious difficulties in daily life. This suggests that prosopagnosia diagnosis should consider real-life experiences, not just lab tests.</p>
<p>Most face-blind participants who took part in the research had tried various strategies to recognise people, such as keeping detailed notes, or even spreadsheets, with descriptions and cues about people they had met, or mentally associating a name with a personally distinctive feature. However, these methods required huge mental effort and often didn’t work.</p>
<p>However, participants admitted their strategies were often “exhausting” and were particularly difficult to use at work when they were busy, concentrating on a task, or because colleagues wore uniforms or similar work clothing.</p>
<p>Some prosopagnosics said they used unusual ways to recognise others – one, for example, by smell. Another said that worrying about a face distracted them, so they found it more helpful to look at people from behind to work out who they were.</p>
<p>Prosopagnosics told researchers how their condition caused them considerable difficulties at school, at work and in everyday social situations. Two-thirds of the prosopagnosics said they could recognise fewer than ten familiar faces. <a href="https://pubmed.ncbi.nlm.nih.gov/30305434/">Previous research</a> suggests most adults recognise around 5,000 faces, so this difference is huge.</p>
<p>A widespread worry among people with face blindness was being misjudged as rude, uncaring, or even “a bit dim” by others who didn’t understand the condition. This concern often led to social anxiety and reduced self-confidence in social situations.</p>
<p>A common coping strategy was to avoid social gatherings or to deliberately keep social circles small to limit the number of faces people had to try and learn. But these strategies sometimes had a downside.</p>
<p>Looking back on their lives, some people felt that their face recognition difficulties had left them socially isolated, or with “poorly developed” social skills because they hadn’t mixed much with others while growing up.</p>
<p>Prosopagnosics were asked what they thought future research into face blindness should focus on. Their top priority was greater awareness and understanding – that the condition exists and how it affects people. They thought this was particularly important for employers, schools and medical staff – but also for the general public.</p>
<p>The research found that many simple things could make life much easier for people with face recognition difficulties. Providing large name badges at events and conferences is a simple but helpful adjustment.</p>
<p>Participants said they found it a huge relief when meetings started with a round of introductions, the chair always addressed people by name, or they were given seating plans. Hot desking causes problems, so keeping a regular seating plan in a workplace or classroom can help face-blind people learn who usually sits where.</p>
<p>If you are meeting a face-blind friend, sending a quick message beforehand to let them know what you are wearing and exactly where you are sitting can also help.</p>
<h2>A form of neurodivergence</h2>
<p>My colleagues and I believe that developmental prosopagnosia should be considered a type of neurodivergence. This term describes someone whose brain works differently from what is considered typical. It usually includes people with autism, ADHD, dyslexia and dyspraxia.</p>
<p>Recognising face blindness as a form of neurodivergence isn’t just about awareness, it’s about dignity, inclusion and making everyday life easier for thousands of people.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img decoding="async" src="https://counter.theconversation.com/content/254644/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1"><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/even-mild-face-blindness-can-cause-serious-difficulties-in-daily-life-new-study-254644">original article</a>.</em></p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/ugly-bystanders-boost-beauty-study-finds-background-faces-shape-personality-judgments/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Ugly bystanders boost beauty: Study finds background faces shape personality judgments</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 4th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research published in <a href="https://doi.org/10.1007/s12144-025-07401-1" target="_blank">Current Psychology</a> suggests that people form more positive impressions of a face when it appears alongside others, and that the emotional impact of surrounding faces plays an important role in shaping these impressions. The study found that even moderately attractive faces were judged more favorably when paired with less attractive ones, a phenomenon linked to emotional brain responses measured through electrical activity.</p>
<p>Researchers conducted the study to investigate how people evaluate personality traits from faces in social situations where more than one face is visible. Most previous research has focused on isolated face judgments, often finding that people associate facial attractiveness with positive characteristics like kindness, confidence, or intelligence. But real-life situations rarely involve judging faces in isolation. The authors wanted to understand whether the attractiveness of nearby faces might affect how a person is perceived, especially in cases where one face stands out from the others.</p>
<p>The research team used a neuroscience method called event-related potentials (ERP) to measure brain responses during trait inference. Specifically, they examined how participants responded to pairs of faces with differing attractiveness levels and how that affected their judgments about the target face. Participants were shown pairs of female faces—one moderately attractive and one either highly or lowly attractive—followed by a word describing a personality trait. They were then asked to judge whether the word matched the personality of the indicated target face. The brain’s response to the word was recorded, particularly looking at components related to emotional processing and semantic conflict.</p>
<p>A total of 47 college students (aged 18–24) participated in the study, all of whom were right-handed, had normal or corrected vision, and reported no history of neurological or psychiatric disorders. One participant was excluded due to excessive noise in the EEG data. Each participant completed 80 trials in which they viewed pairs of faces and then judged whether a trait word was a good fit for one of the faces. The researchers focused on “medium attractiveness” faces as targets and paired them either with highly attractive or lowly attractive background faces. Importantly, the target face was always of medium attractiveness, while the paired face varied in attractiveness.</p>
<p>The study used a total of 160 female face photographs, pre-rated for attractiveness by an independent group. Each photo was edited to remove external features like hair and accessories, ensuring that judgments were based on facial structure alone. The trait words used in the study consisted of five positive traits (such as “kind” and “friendly”) and five negative traits (such as “rigid” and “indifferent”). Participants responded by indicating whether they believed the trait word described the face.</p>
<p>The results showed that participants were generally more likely to associate target faces with positive traits than with negative ones. Across all conditions, 66% of positive trait words were judged as matching the target face, compared to only 29% of negative trait words. But a more interesting pattern emerged when the background face’s attractiveness was taken into account. The difference in positive versus negative judgments was significantly larger when the background face was unattractive compared to when it was attractive. In other words, participants were more likely to view the moderately attractive target face as especially positive when it was paired with a less attractive face.</p>
<p>The researchers also looked at reaction times. Participants responded faster when the background face was low in attractiveness compared to when it was high. This suggests that the emotional impact of the pairing may have made trait judgments easier in some contexts. To explore this further, the study examined specific ERP components—brain signals that reflect how people process information.</p>
<p>One key component, the N400, is linked to detecting mismatches between expectations and actual input. When a word doesn’t semantically fit with a previous stimulus—like an unexpected trait description following a face—the N400 amplitude becomes more negative. In this study, the N400 was significantly larger for negative trait words than for positive ones, indicating greater conflict when participants had to associate a negative word with a target face. This suggests that faces were generally being interpreted as more consistent with positive traits, especially when paired with other faces.</p>
<p>Another component of interest was the late positive potential (LPP), which reflects emotional engagement. Here, the researchers found a significant difference: low attractiveness background faces elicited a stronger emotional response, as reflected in larger LPP amplitudes. This heightened emotional response may explain why trait inferences about target faces became more positive in those contexts. Notably, early-stage components related to face detection and attention (N170 and EPN) did not differ significantly between conditions, suggesting that the emotional influence emerged later in the processing sequence.</p>
<p>The findings point to a dynamic interplay between visual context, emotional response, and social evaluation. When people see multiple faces, their judgments about one face are shaped not just by that face’s features, but also by how it compares to the others and the emotional reactions those comparisons evoke. Interestingly, less attractive background faces had a stronger influence than highly attractive ones, possibly due to the greater emotional contrast they created. This pattern supports the idea that emotional processing plays a central role in how people evaluate others, even when they are not explicitly aware of doing so.</p>
<p>The study offers a new angle on the well-known “beauty is good” stereotype by showing that judgments can be context-dependent and influenced by emotional cues from surrounding faces. While attractive people are still generally seen more favorably, this favorability can extend to others when the context involves unflattering comparisons. Emotional responses to background faces may amplify or soften how target faces are perceived, especially when the observer is processing multiple faces simultaneously.</p>
<p>One limitation of the study is that it only used female faces. This was done to control for possible gender effects and to align with previous studies, but it means the results may not generalize to judgments of male faces or to cross-gender comparisons. Also, because all participants were heterosexual young adults, future studies would need to examine whether these findings hold in more diverse populations. Another limitation is the use of highly standardized facial images. While this helped control variables, it may limit the ecological validity of the results, as real-life impressions involve more complex facial cues such as hairstyle, expression, and gaze direction.</p>
<p>The researchers also note that they did not include a neutral baseline condition where only the target face was shown. Including such a comparison in future research could help determine whether high- and low-attractiveness background faces enhance or suppress positive trait inferences. Additionally, while this study focused on specific ERP components, other brain signals—such as the P300 or P2, which are involved in attention and detection—could also play a role in processing multiple faces.</p>
<p>The study, “<a href="https://doi.org/10.1007/s12144-025-07401-1" target="_blank">Getting close to beauty makes you better: the influence of background facial attractiveness on trait inferences</a>,” was authored by Shangfeng Han, Chen Hu, Wenxiu Su, Yuhong Sun, Junlong Huang, Shimin Fu, and Yuejia Luo.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-uncover-key-role-of-thyroid-hormones-in-fear-memory-formation/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists uncover key role of thyroid hormones in fear memory formation</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 4th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://www.nature.com/articles/s41380-024-02679-2" target="_blank">Molecular Psychiatry</a></em> suggests that the thyroid hormone system in the brain may be a powerful driver of how fear memories are formed. Thyroid hormone signaling in the amygdala—the part of the brain involved in processing emotions—was not only activated by fear learning, but also necessary for storing fear memories. Boosting thyroid hormone activity strengthened fear memories, while blocking it impaired them. These results may help uncover new treatment pathways for trauma-related disorders such as post-traumatic stress disorder (PTSD).</p>
<p>The amygdala is known to be essential for learning to associate danger with a particular stimulus—such as a tone paired with a shock in laboratory settings. This process, known as Pavlovian fear conditioning, has long been used in animal research to study the brain’s response to threat. Meanwhile, the thyroid hormone system has traditionally been associated with metabolism and early brain development. But it is increasingly being linked to mood, anxiety, and memory. Still, researchers have had limited understanding of how thyroid hormones influence the adult brain’s ability to store emotionally significant memories—especially in brain regions like the amygdala.</p>
<p>Thyroid hormones such as triiodothyronine (T3) interact with specific receptors in the brain called thyroid hormone receptors (TRs). These receptors act as transcriptional regulators: when they bind to T3, they turn on genes that help regulate brain plasticity—the brain’s ability to adapt based on experience. In their unbound state, these receptors suppress gene activity. This dual function makes them a promising target for exploring how hormones can shape emotional learning at the molecular level.</p>
<p>To investigate this, the research team focused on mice undergoing fear conditioning. The study used a widely accepted experimental procedure: mice were exposed to a tone followed by a mild foot shock, creating an association between the two. Researchers examined whether genes involved in the thyroid hormone pathway were activated in the amygdala following this learning event. They also tested whether directly manipulating thyroid hormone activity in the amygdala would influence fear memory formation.</p>
<p>The study involved several sets of experiments. In one, the researchers surgically implanted tiny cannulas into the amygdala of adult male mice. These cannulas allowed them to precisely deliver thyroid hormone T3 or a TR antagonist—either before or after the mice experienced tone-shock pairings. The goal was to observe whether adding or blocking hormone signaling in this brain region would enhance or impair the consolidation of fear memories, which is the process by which short-term memories become stable over time.</p>
<p>They found that administering T3 directly into the amygdala led to stronger fear memories, as measured by increased freezing behavior when the tone was later played. In contrast, blocking thyroid receptors with an antagonist drug weakened fear memory. These changes were not due to differences in how mice learned during the training session itself, but rather reflected how well they retained the memory afterward.</p>
<p>In additional experiments, the researchers created a hypothyroid state in mice by feeding them a low-iodine diet combined with a chemical that blocks thyroid hormone production. As expected, these mice showed lower levels of thyroid hormones in the blood. They also had impaired fear memory. Remarkably, the researchers were able to reverse this memory deficit by injecting T3 into the amygdala, confirming that local hormone signaling in this specific brain region was enough to rescue the behavioral impairment caused by systemic hormone deficiency.</p>
<p>The team also examined which specific genes in the amygdala were regulated by thyroid hormones during fear learning. Using a technique called quantitative PCR, they identified several genes whose activity levels changed following fear conditioning. These included both genes that are usually activated by T3 (such as Dio2 and Reln) and genes that are typically suppressed (such as Trh and Aldh1a3). These transcriptional changes were consistent with the idea that thyroid hormones help coordinate the brain’s molecular response to threat, priming it for storing emotional memories.</p>
<p>The researchers used a type of imaging called RNAscope to validate that some of these genes were being turned on in the amygdala itself, not in nearby regions. They also confirmed that these changes happened within hours of fear learning, suggesting a direct link between hormone signaling and the early phases of memory formation.</p>
<p>Beyond memory, the researchers also tested how amygdala thyroid hormone activity influenced anxiety-related behaviors. In a separate open field test, mice given T3 in the amygdala spent less time exploring the center of the arena and showed less movement overall. These behaviors suggest an increase in anxiety-like responses, aligning with the idea that thyroid hormones can influence emotional arousal beyond memory alone.</p>
<p>Overall, the study provides strong evidence that the thyroid hormone system is a dynamic and influential player in the adult brain’s emotional memory network. While past research has linked thyroid dysfunction to mood disorders like depression and anxiety, this study offers a direct mechanism by which thyroid hormones affect how emotionally significant events are encoded and remembered.</p>
<p>The findings also raise intriguing possibilities for clinical research. Thyroid hormone levels are not routinely measured in patients with PTSD or trauma-related symptoms, despite known associations between stress and thyroid function. If future research supports these findings in humans, thyroid hormone regulation could become a target for novel therapies aimed at preventing or modifying fear-related memory formation.</p>
<p>But the study has limitations. All of the experiments were conducted in male mice, leaving open questions about whether the same results would apply to females. The researchers also focused on relatively short time frames, mainly testing memory 24 hours after learning. It remains unclear how long the observed effects of T3 or its antagonists last, or how these hormone pathways interact with other known regulators of memory such as serotonin, dopamine, or stress hormones.</p>
<p>Future work could explore whether the same hormone pathways are involved in more chronic or generalized anxiety, and whether targeting these pathways could improve treatment outcomes in individuals with trauma-related disorders. Researchers may also investigate how early life stress interacts with thyroid signaling in the brain, potentially helping to explain why some individuals are more vulnerable to the long-term effects of trauma than others.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41380-024-02679-2" target="_blank">Evidence for thyroid hormone regulation of amygdala dependent fear-relevant memory and plasticity</a>,” was authored by Stephanie A. Maddox, Olga Y. Ponomareva, Cole E. Zaleski, Michelle X. Chen, Kristen R. Vella, Anthony N. Hollenberg, Claudia Klengel, and Kerry J. Ressler.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-study-reveals-four-psychological-profiles-of-gamers-linked-to-mental-health-and-attachment-styles/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New study reveals four psychological profiles of gamers linked to mental health and attachment styles</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jun 4th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em>Addictive Behaviors</em> has identified four distinct psychological profiles of video game players, each shaped by differences in emotional regulation, attachment style, and mental health. Based on data from over 5,000 gamers worldwide, the research found that the “dysregulated” and “relational” profiles were associated with greater psychological distress and insecure attachment patterns, while the “avoidant” and “engaged” profiles were linked to emotional stability and healthier relationships with gaming.</p>
<p>Researchers affiliated with ISPA – Instituto Universitário and the APPsyCI Applied Psychology Research Center conducted the study to better understand the complex factors that shape gaming behavior. While video games are a popular form of entertainment for people of all ages, concerns have grown about the potential for excessive or problematic gaming, especially in younger individuals. </p>
<p>Mental health challenges, substance use, attachment difficulties, and social environments can all contribute to how individuals engage with video games. However, many studies to date have focused on narrow aspects of gaming, such as time spent playing or clinical symptoms, without considering the broader psychological and social context.</p>
<p>“This study was born from a shared interest in understanding the psychological complexity of gaming behaviors, particularly in the context of Gaming Disorder, but we also wanted to know about non-problematic gaming behaviors,” said study authors Cátia Martins Castro (a PhD candidate in psychology) and David Dias Neto (an associate professor of psychology).</p>
<p>“One of the primary motivations was the desire to integrate the most relevant psychological factors — emotional regulation and motivation to play videogames — into profiles, and then comprehend the various dimensions, such as attachment styles, mental health, and gaming characteristics related to these profiles, that could one day support clinicians in their practice.”</p>
<p>“From a personal perspective (Cátia), this need was strongly felt in my clinical work, where it was often difficult to translate research into practical tools for assessing and supporting my clients who were gamers with problematic use.”</p>
<p>“We were also committed to ensuring inclusivity, both in terms of the types of games played and the diversity of the gaming population,” the researchers explained. “These are, to our knowledge, the first psychological profiles of gamers to include non-binary participants, to be generalizable across all game genres, and to be drawn from an international sample spanning 112 countries. This breadth gives the profiles substantial potential for clinical and cultural relevance.”</p>
<p>Specifically, their aim was to explore how mental health indicators, social and relational dynamics, and gaming-specific behaviors cluster together to form meaningful patterns among gamers. They focused particularly on how emotional regulation and motivations for gaming vary across profiles and how these profiles relate to risk factors for problematic gaming.</p>
<p>The study drew on responses from 5,255 individuals aged 16 to 69, with a mean age of approximately 25. Participants came from 112 countries and identified as men (about 50%), women (43%), or non-binary (9%). Data were collected through an online questionnaire shared on social media and gaming platforms. Participants answered questions about their gaming habits, emotional experiences, social connections, substance use, and attachment patterns.</p>
<p>To identify the psychological profiles, the researchers used a person-centered statistical approach that groups individuals based on similarities across multiple variables. They measured emotional regulation using a standardized scale assessing impulse control, emotional awareness, and the ability to act in goal-directed ways despite distress. They also assessed gaming motivations, such as playing for escapism, identity, or social connection. In addition, they examined attachment styles—patterns of relating to others developed early in life—as well as self-reported mental health symptoms and substance use.</p>
<p>The analysis revealed four distinct gamer profiles:</p>
<p><strong>Avoidant profile:</strong> Individuals in this group were generally older and reported low levels of psychological distress. They showed secure attachment patterns and a preference for offline social interactions. Their gaming motivations centered around personal exploration, autonomy, and recreation rather than social connection. This profile was also associated with lower levels of substance use and minimal use of social media for gaming-related communication.</p>
<p><strong>Engaged profile:</strong> This was the largest group and included gamers with good emotional regulation and secure attachment. Like the avoidant group, they showed low psychological distress and little substance use. However, they were more socially integrated into gaming communities and used platforms such as Discord, Twitch, and Instagram to connect with others. Smartphone gaming was common in this group. While they did not often play online games with others, they maintained strong offline social networks.</p>
<p><strong>Relational profile:</strong> Members of this group showed emotional regulation difficulties and higher levels of attachment avoidance. They were more likely to play games for social connection and identity reinforcement but had fewer offline social interactions. Although they experienced some functional impairments and showed higher risk behaviors such as hallucinogen use, they did not report high levels of overt psychological distress. Their gaming preferences leaned toward immersive, socially driven experiences, and they frequently used platforms like Steam and Twitch.</p>
<p><strong>Dysregulated profile:</strong> This group was made up of younger gamers who reported the highest levels of emotional distress and showed difficulties in all areas of emotional regulation. They had both attachment anxiety and avoidance, indicating significant interpersonal difficulties. They were more likely to use tobacco and energy drinks and showed signs of behavioral dysregulation, including risk to self or others. These gamers also spent more time gaming alone or with online friends and frequently used multiple social media platforms to engage with gaming communities. This group had the highest risk of developing gaming disorder.</p>
<p>The researchers found that these profiles were significantly shaped by age, emotional regulation, mental health symptoms, and patterns of online and offline interaction. The dysregulated profile stood out for its combination of psychological vulnerabilities and intense engagement with online gaming and social platforms, which the researchers suggest may reflect a maladaptive coping strategy. In contrast, the avoidant and engaged profiles appeared to reflect more balanced and recreational use of gaming, with strong offline support networks and lower distress.</p>
<p>“One important takeaway is that non-problematic gaming can be associated with individuals who maintain healthy emotional bonds and social relationships,” Castro and Neto told PsyPost. “In contrast, problematic gaming can emerge when individuals experience relational difficulties, whether through anxious attachment (fearing abandonment) or avoidant patterns (distancing from intimacy).”</p>
<p>“For some, games may become the preferred medium of interaction, especially when face-to-face connections feel threatening or overwhelming. In these cases, the gaming environment may offer structure, predictability, and a sense of control, but it may also reinforce avoidance and deepen isolation. This highlights the need for supportive and nuanced approaches.”</p>
<p>While the relational profile shared some risk factors with the dysregulated group, such as emotional difficulties and insecure attachment, it did not show the same level of psychological symptoms or substance use. The researchers interpret this group as selectively engaged with gaming in a way that offers social connection, possibly compensating for offline challenges without necessarily crossing into dysfunction.</p>
<p>The researchers were also surprised to find that in the avoidant profile, “players preferred to play alone (e.g., single-player games) yet showed secure attachment. It was also surprising the way that the dysregulated profile was associated with relational difficulties, particularly with both anxious and avoidant insecure attachment styles, as well as with some substance use.”</p>
<p>The study has several strengths, including its large and diverse sample and the use of well-validated psychological measures. However, the researchers caution that the findings are based on cross-sectional data, which limits their ability to draw conclusions about cause and effect. For example, it is unclear whether emotional dysregulation leads to problematic gaming, or whether excessive gaming worsens emotional difficulties. Longitudinal studies are needed to track how these profiles develop and change over time.</p>
<p>The researchers also note that the self-report nature of the study could introduce bias, as participants may underreport distress or overestimate their social engagement. Additionally, the study did not analyze specific gaming genres or content, which could influence motivations and psychological impacts.</p>
<p>“As with many psychological studies, our data are self-reported and based on a convenience sample, which can introduce bias and limit generalisability,” Castro and Neto noted. “Additionally, the cross-sectional nature of the data constrains causal interpretation. That said, although the size and diversity of the sample are considerable, we see this work as a first step in a broader line of research.”</p>
<p>Their long-term goal is to provide tools for clinicians and policymakers to recognize diverse gaming behaviors and offer tailored interventions. They also hope the findings can contribute to more inclusive strategies for promoting healthy gaming habits.</p>
<p>“We are currently finalizing a longitudinal study that follows these profiles over time, which will allow us to draw stronger inferences and examine how gaming behaviors evolve in relation to psychological variables,” Castro and Neto said. “Our long-term goal is to support clinicians with evidence-based tools that can inform assessment and intervention. This includes developing profile-based guidance for tailored approaches. We also hope that this work can be the beginning of studies that help shape policy and public health strategies, respecting the complexity of gaming behavior with inclusivity (e.g., by including non-binary individuals).”</p>
<p>“We would like to thank all participants who contributed to this study, and to ISPA and the APPsyCI Research Center. Working together has been a deeply enriching process, combining research and clinical thinking: Cátia, a PhD candidate who also works as a clinical psychologist, and David, a PhD supervisor and experienced researcher who also works as a clinical psychologist and psychotherapist. We hope this study adds value to both the scientific community and the people it aims to support.”</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>