<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-uncover-dozens-of-genetic-traits-that-depend-on-which-parent-you-inherit-them-from/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists uncover dozens of genetic traits that depend on which parent you inherit them from</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 20th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1038/s41586-025-09357-5" target="_blank">Nature</a></em> provides evidence that some genetic traits are shaped not just by which variant a person inherits, but by which parent they inherit it from. By analyzing genetic data from over 100,000 individuals in the UK Biobank, researchers identified more than 30 instances in which the effects of a genetic variant depended on whether it came from the mother or the father. These “parent-of-origin effects” were especially common for traits related to growth and metabolism, including height, fat distribution, and risk for type 2 diabetes.</p>
<p>The study was motivated by a longstanding hypothesis in evolutionary biology known as the parental conflict theory, which proposes that mothers and fathers may have different evolutionary incentives when it comes to how much biological investment is made in their offspring.</p>
<p>For example, paternally inherited genes may favor greater offspring growth to enhance survival and reproductive success, while maternally inherited genes may promote more conservative energy use to preserve resources for future pregnancies. This evolutionary tension is thought to give rise to “genomic imprinting,” in which certain genes are expressed only when inherited from a specific parent.</p>
<p>While earlier studies have uncovered a handful of parent-specific effects, many were constrained by the requirement to have genetic information from both parents. This limitation sharply reduced sample sizes and made it difficult to detect more subtle but widespread effects across the genome. To overcome this, the researchers developed a new method that can infer the parental origin of a gene without direct access to parental genomes.</p>
<p>“We still remember how fascinated we were when we first read the <a href="https://doi.org/10.1038/nature08625" target="_blank">groundbreaking study by Kong and colleagues</a> from Iceland in 2009, which had shown that parent-of-origin effects (POEs) exist on complex traits and diseases,” explained study authors Zoltán Kutalik, an associate professor, and Robin Hofmeister, a postdoctoral researcher.</p>
<p>“However, those studies were limited by small sample sizes due to the requirement of parental genomes or genealogy being available. In 2014, our group (<a href="https://wp.unil.ch/sgg/" target="_blank">the Statistical Genetics Group</a> at the University of Lausanne and Unisanté) developed a new approach to indirectly infer parent-of-origin effects without parents, by examining phenotypic variance in people with various constellations of genetic variants.”</p>
<p>“Then in 2022, Robin Hofmeister, during his PhD with Olivier Delaneau, came up with the idea to leverage shared DNA segments between relatives to infer ‘surrogate parents,’ which allowed the identification of the parental origin of certain DNA variations. When Robin joined the Statistical Genetics Group, we expanded on this approach by incorporating new biobank data and massively improving the method.”</p>
<p>The researchers combined several strategies—such as tracking shared DNA segments between siblings, analyzing sex chromosomes, and using mitochondrial DNA—to estimate whether a given allele came from the mother or the father. These methods enabled them to assign parental origin to genetic variants for over 109,000 white British participants in the UK Biobank, increasing the usable sample size more than fourfold compared to earlier efforts.</p>
<p>They also applied their approach to two other large cohorts: 85,050 individuals from the Estonian Biobank and 42,346 children from the Norwegian Mother, Father and Child Cohort Study. These replication datasets helped validate their findings and assess how these effects might manifest across development.</p>
<p>Their analysis uncovered more than 30 parent-of-origin effects across a range of complex traits. In many instances, the same genetic variant had opposite effects depending on its parental origin. For example, one variant on chromosome 7 increased triglyceride levels when inherited from the father but decreased them when inherited from the mother. These opposite-direction effects—called “bipolar” effects—were especially common for traits related to energy regulation, such as fat percentage, glucose levels, and cholesterol.</p>
<p>One striking example was found in a well-known region on chromosome 11 that includes the <em>IGF2</em> gene, which is involved in growth. For one variant in this region, the paternal copy was associated with shorter height, while the maternal copy had no significant effect. Another nearby variant exhibited a maternal-specific influence on standing height and also affected traits such as fat-free mass and basal metabolic rate. These findings indicate that parent-specific genetic influences may help shape body size and composition across the lifespan.</p>
<p>In another case, a variant on chromosome 11 was associated with type 2 diabetes in a parent-specific manner: the paternal version increased risk, while the maternal version was protective. While this had been suggested in earlier research, the new study provided the strongest confirmation yet.</p>
<p>The presence of multiple bipolar effects in traits related to energy use and growth lends support to the parental conflict hypothesis. Across the identified cases, paternal alleles tended to promote growth and energy expenditure, while maternal alleles were more likely to limit these traits. This pattern was especially apparent for traits such as birth weight, height, and metabolic biomarkers, echoing predictions from evolutionary theory.</p>
<p>To ensure these findings were robust and not the result of random variation or imbalanced datasets, the researchers introduced a new statistical framework that formally tests whether maternal and paternal effects differ significantly. Their use of rigorous thresholds and independent replication helped provide greater confidence in the results.</p>
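<p>The core comparison described above can be illustrated with a minimal sketch (this is not the authors' code, and the variable names and effect sizes are invented for demonstration): regress a trait on separate maternal and paternal allele dosages, then test whether the two coefficients differ with a z-test.</p>

```python
# Illustrative sketch of a parent-of-origin test on simulated data.
# We regress a phenotype on separate maternal and paternal allele
# dosages, then compare the two coefficients with a simple z-test.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
maternal = rng.binomial(1, 0.3, n).astype(float)  # allele from mother (0/1)
paternal = rng.binomial(1, 0.3, n).astype(float)  # allele from father (0/1)

# Simulated "bipolar" effect: the same allele raises the trait when
# inherited paternally and lowers it when inherited maternally.
y = 0.15 * paternal - 0.15 * maternal + rng.normal(0, 1, n)

# Ordinary least squares with an intercept plus both dosages.
X = np.column_stack([np.ones(n), maternal, paternal])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)

# z-test for a difference between the maternal and paternal effects,
# accounting for the covariance of the two estimates.
diff = beta[1] - beta[2]
se_diff = np.sqrt(cov[1, 1] + cov[2, 2] - 2 * cov[1, 2])
z = diff / se_diff
print(f"maternal={beta[1]:+.3f}  paternal={beta[2]:+.3f}  z={z:.1f}")
```

<p>A large |z| indicates that the maternal and paternal effects genuinely diverge, which is the signature of a parent-of-origin effect rather than an ordinary additive one.</p>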
<p>By analyzing longitudinal data from the Norwegian cohort, the team was also able to track how these effects emerge over time. In one case, a variant associated with height showed parent-of-origin effects from infancy into early childhood. Another variant linked to body mass index showed a maternal-specific effect during infancy that reversed direction in adulthood, suggesting that the impact of parental origin may shift over developmental stages.</p>
<p>“What is key in our findings is that it is not only the DNA sequence that we inherit from our parents, but small attached chemical structures (so-called methylation marks or imprints), which also have an impact on human characteristics,” Kutalik and Hofmeister told PsyPost. “Because of this, we have to consider not only the actual genomic sequence that we inherited, but also from which parent we inherited it, as the effects can differ substantially.”</p>
<p>“Our study identified 30 such examples where the parent of origin of DNA variants matters, and 19 of these had conflicting effects: when inherited from one parent, a variant may predispose us to a cardiometabolic disease, but when the same DNA sequence is inherited from the other parent, it can be protective against that same disease,” they explained. “What surprised us most was how often we observed opposing parental effects—suggesting that this is more the rule than the exception.”</p>
<p>“This last observation was the most surprising: how often we observe opposing parental effects, which is consistent with the parental conflict hypothesis—whereby mothers aim to preserve resources for their own survival and future reproduction, while fathers may prioritize enhancing the fitness of the current offspring, even at the cost of maternal resources.”</p>
<p>Despite the scale and innovation of the study, there are limitations. The researchers focused on individuals of white European ancestry, which limits generalizability to more diverse populations. The traits examined were mainly physical and metabolic; psychiatric and behavioral traits were not analyzed in detail, though the authors note that their methods could be applied in future studies.</p>
<p>Another limitation is that their approach identifies only the transmitted allele, making it difficult to disentangle true genetic effects from potential environmental influences tied to parenting. For instance, a maternal allele linked to lower body mass index might reflect a biological effect, a behavioral influence, or a combination of both.</p>
<p>The team now plans to explore the molecular mechanisms behind these effects. Many of the observed patterns suggest differences in gene expression depending on the parental source, possibly due to regulatory elements that respond to imprinting marks. Future work may involve transcriptomic and epigenetic studies to understand these mechanisms more deeply.</p>
<p>They also hope to extend this line of research to psychiatric traits, which are harder to study due to complex environmental influences and smaller datasets. As access to diverse and multi-generational biobanks grows, the possibility of uncovering parent-specific effects on cognition, emotion, and mental health may soon become more feasible.</p>
<p>“Collaboration was really essential—not only to boost sample size but to assess whether these effects are generalizable across populations, thanks to the Estonian Biobank, and to track whether these patterns emerge in early life, thanks to the Mother, Father and Child Cohort Study,” Kutalik and Hofmeister added.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41586-025-09357-5" target="_blank">Parent-of-origin effects on complex traits in up to 236,781 individuals</a>,” was authored by Robin J. Hofmeister, Théo Cavinato, Roya Karimi, Adriaan van der Graaf, Fanny-Dhelia Pajuste, Jaanika Kronberg, Nele Taba, the Estonian Biobank research team, Reedik Mägi, Marc Vaudel, Simone Rubinacci, Stefan Johansson, Lili Milani, Olivier Delaneau, and Zoltán Kutalik.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/green-tea-antioxidant-and-vitamin-b3-show-promise-for-treating-alzheimers-related-cellular-decline/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Green tea antioxidant and vitamin B3 show promise for treating Alzheimer’s-related cellular decline</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 20th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Researchers at the University of California, Irvine have identified a promising method to counteract key biological features of brain aging and Alzheimer’s disease using naturally occurring compounds. In a study published in <em><a href="https://doi.org/10.1007/s11357-025-01786-4" target="_blank">GeroScience</a></em>, the scientists found that combining nicotinamide—a form of vitamin B3—and a green tea antioxidant known as EGCG can restore levels of a key energy molecule in neurons and stimulate a cellular cleanup process that helps remove toxic protein build-up.</p>
<p>The rationale behind this research centers on a specific energy molecule called guanosine triphosphate, or GTP. GTP plays a central role in powering essential neuronal processes, including the transport and clearance of damaged proteins. Aging is known to cause a general decline in cellular energy levels, including GTP, but most research has focused on another molecule, ATP. </p>
<p>The researchers suspected that GTP might be a previously underappreciated factor in Alzheimer’s disease, especially since GTP powers autophagy—a process by which cells break down and recycle damaged components, including toxic proteins like amyloid-beta. With this in mind, they set out to investigate whether restoring GTP levels in aged neurons could improve autophagy and reduce protein accumulation.</p>
<p>“The strongest risk factor for Alzheimer’s that everyone knows is age. As we age, we have less energy. We developed a technique to measure a particular kind of energy in brain cells called GTP,” said study author <a href="https://brewer.eng.uci.edu/" target="_blank">Gregory J. Brewer</a>, a professor of biomedical engineering at the University of California, Irvine.</p>
<p>“We saw in mouse neurons that GTP levels were lower in old age. This led us to try to raise GTP levels with an energy precursor molecule that’s very safe, nicotinamide. At the same time, as our bodies age, we build up damaged DNA, lipids and proteins from oxidation (like rust of iron). This is worsened in Alzheimer’s. So I wondered if a wildly safe and known antioxidant compound found in green tea called EGCG would help with the oxidation problem.”</p>
<p>To test their hypothesis, the researchers conducted a series of experiments using cultured neurons taken from a well-established mouse model of Alzheimer’s disease known as the 3xTg-AD mouse. These mice carry human genes associated with Alzheimer’s and develop hallmark features of the disease, including intracellular amyloid-beta aggregates. The researchers also used neurons from healthy non-transgenic mice as controls. Neurons were isolated from mice at three age ranges—young (2–6 months), middle-aged (8–11 months), and old (17–28 months)—to track how GTP levels change over time and across genotypes.</p>
<p>The scientists used a specialized biosensor, GEVAL, that allowed them to measure free and bound GTP inside living neurons. Their results showed that GTP levels declined with age in both healthy and Alzheimer’s-model neurons, but the decline was steeper and occurred earlier in the Alzheimer’s neurons. In non-Alzheimer’s neurons, free GTP levels rose slightly in middle age before falling in old age. In contrast, Alzheimer’s neurons exhibited a significant loss of GTP by middle age, and levels remained low into old age.</p>
<p>In healthy young neurons, much of the GTP was found in the mitochondria—the cell’s energy factories—suggesting that this is where it is produced and used most actively. However, in old neurons, particularly those from Alzheimer’s mice, mitochondrial GTP levels fell sharply. At the same time, GTP became trapped in abnormal vesicle-like structures, hinting at a breakdown in the cell’s ability to use GTP for normal processes like autophagy and endocytosis.</p>
<p>To determine whether this energy shortfall could be reversed, the researchers treated old neurons with a combination of nicotinamide and EGCG. Nicotinamide boosts levels of NAD+, a key molecule involved in energy metabolism, while EGCG activates Nrf2, a transcription factor that regulates antioxidant defenses and helps maintain redox balance in cells. Together, these compounds aim to support both the production of cellular energy and the control of oxidative stress.</p>
<p>After just 16 hours of treatment, the researchers observed a restoration of GTP levels in aged neurons, bringing them close to levels seen in young cells. This increase was associated with a drop in the number and size of GTP-bound vesicles, suggesting improved cellular function. Importantly, the treatment also led to enhanced activity of key GTP-dependent proteins involved in vesicle trafficking, including Rab7 and Arl8b, which are essential for transporting toxic waste to the cell’s lysosomes for disposal.</p>
<p>The restoration of GTP and vesicular activity had additional downstream effects. In Alzheimer’s-model neurons, the treatment led to a reduction in intracellular amyloid-beta aggregates. It also reduced markers of oxidative protein damage, such as tyrosine nitration, which tend to increase with age and neurodegeneration. These findings indicate that improving GTP levels not only revives cellular housekeeping functions but may also lower the burden of toxic proteins that contribute to Alzheimer’s pathology.</p>
<p>“I was surprised how well the combination of nicotinamide and EGCG worked to clear an important protein in Alzheimer’s called amyloid and to lower oxidized proteins,” Brewer told PsyPost.</p>
<p>To confirm that the treatment was activating the intended cellular pathways, the researchers tracked the location of Nrf2, which is normally kept in the cytoplasm. After treatment, Nrf2 rapidly moved into the nucleus—an indication that it was being activated—and increased the expression of antioxidant genes such as NQO1. This response was swift, peaking within 30 minutes, suggesting that the compounds were having a rapid and coordinated effect on the cells’ stress-response systems.</p>
<p>The study also explored how GTP was used during autophagy. When the researchers blocked autophagy with a compound called bafilomycin, they observed a buildup of free GTP, indicating that autophagy normally consumes GTP. In contrast, stimulating autophagy with rapamycin led to GTP depletion, particularly in neurons from healthy mice. This further supports the idea that impaired GTP availability in aging may hinder autophagy and contribute to protein buildup.</p>
<p>The researchers also found that the Alzheimer’s neurons had greater accumulations of Rab7 and Arl8b—proteins that mark vesicles in the process of autophagy. This accumulation suggests a backlog in the system, possibly due to insufficient GTP to complete the recycling process. Treatment with nicotinamide and EGCG reduced these accumulations, suggesting that improving energy supply helped clear the logjam.</p>
<p>Another notable finding was that treatment improved overall neuronal viability in old Alzheimer’s neurons by about 22 percent. This increase in survival suggests that restoring GTP levels may not only clear harmful proteins but also support the overall health of aging brain cells.</p>
<p>“These compounds are available in your vitamin section of your grocery store as supplements, but other studies indicate that taking these supplements orally doesn’t work because they get inactivated in the blood,” Brewer said. “Therefore, new ways are needed to get them to the brain more directly.”</p>
<p>Despite the promising results, the researchers emphasize that their study was conducted in mouse neurons <em>in vitro</em>. This setup allows for tight experimental control but does not fully replicate the complexity of a living brain, which includes glial cells, blood vessels, immune responses, and other factors that influence disease progression. Additional studies in live animals and human cells are needed to confirm these findings and explore their potential clinical applications.</p>
<p>“These studies were done in mouse neurons in a dish,” Brewer noted. “They need to be confirmed in human neurons and in randomized, placebo-controlled, blinded trials. Also, these drugs have been given orally in human trials of Alzheimer’s and not succeeded because they were so quickly inactivated in the blood.”</p>
<p>The study, “<a href="https://doi.org/10.1007/s11357-025-01786-4" target="_blank">Treatment of age-related decreases in GTP levels restores endocytosis and autophagy</a>,” was authored by R. A. Santana, J. M. McWhirt, and G. J. Brewer.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/people-high-in-psychopathy-and-low-in-cognitive-ability-are-the-most-politically-active-online-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">People high in psychopathy and low in cognitive ability are the most politically active online, study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 20th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>People with certain dark personality traits may be especially drawn to online political participation — particularly if they also experience fear of missing out (FoMO) and have lower cognitive ability. A new cross-national study published in <em><a href="https://doi.org/10.1057/s41599-025-05195-y" target="_blank">Humanities and Social Sciences Communications</a></em> examined how several traits interact to predict digital political engagement across eight countries. The observed patterns suggest that emotion-driven traits, rather than deliberative thinking, may play a key role in motivating online political behavior.</p>
<p>Most research on personality and political participation has focused on broad traits like the Big Five. But less attention has been paid to how darker traits—like psychopathy and narcissism—might shape who participates in politics, especially in the digital realm. Previous findings have shown that individuals high in these traits often seek power and attention, both of which can be satisfied through political engagement. Yet, studies have mainly focused on offline behavior and Western populations.</p>
<p>The rise of social media has opened new channels for political action. It has also created environments that reward impulsivity, emotional expression, and the desire for attention—traits closely tied to psychopathy, narcissism, and FoMO. At the same time, people vary in their ability to critically evaluate information, raising the question of whether cognitive ability moderates how personality traits translate into political participation. The researchers aimed to explore how these psychological and cognitive factors interact to shape online political behavior across cultures.</p>
<p>“I’m particularly interested in how citizens engage with political information on social media platforms, especially in ways that translate into online political activities such as discussion and participation,” explained study author Saifuddin Ahmed, an assistant professor at Nanyang Technological University and director of <a href="https://www.ntu.edu.sg/incube/about-us/our-labs/smpe" target="_blank">the Social Media and Political Engagement (SMAPE) Lab</a>.</p>
<p>“When analysing such behaviour, it becomes essential to ask not only how individuals are participating but also who is participating. This is because participation is far from uniform; differences in personality traits and cognitive styles are likely to play a significant role. Therefore, in this paper, we aim to examine how personality differences distinguish those who are more active from those who are less active in online political activities.”</p>
<p>The researchers conducted a large-scale survey in June 2022 using data from over 8,000 participants in eight countries: the United States, China, Singapore, Indonesia, Malaysia, the Philippines, Thailand, and Vietnam. They used quota sampling to match national distributions of age and gender, and the survey was administered in local languages where appropriate. Participants completed validated questionnaires assessing psychopathy, narcissism, fear of missing out (FoMO), and cognitive ability.</p>
<p>Online political participation was measured by asking how often participants engaged in six different online political activities (such as commenting on political posts or sharing political content) over the past year. Psychopathy and narcissism were assessed using a short version of the Dark Triad scale. FoMO was measured through a series of questions asking about anxiety related to missing out on others’ experiences. Cognitive ability was assessed using the Wordsum test, a vocabulary-based measure often used as a proxy for general intelligence.</p>
<p>To better isolate the effects of personality and cognitive traits, the researchers also controlled for age, gender, education, income, political interest, and both traditional and social media news consumption.</p>
<p>Across all eight countries, higher levels of psychopathy and FoMO were consistently linked to greater online political participation. In contrast, narcissism was only associated with political participation in three countries: the United States, the Philippines, and Thailand. This suggests that psychopathy and FoMO may be more universal predictors of online political activity, while the influence of narcissism could be shaped by cultural norms.</p>
<p>Cognitive ability, on the other hand, showed a consistent negative relationship with online political participation. In all countries studied, individuals with higher cognitive scores were less likely to engage in online political activities. This pattern was especially strong in Singapore and Malaysia.</p>
<p>The researchers also examined how cognitive ability influenced the relationships between personality traits and political participation. Among individuals with lower cognitive ability, the link between psychopathy and political engagement was stronger in five countries: the United States, China, Singapore, Malaysia, and the Philippines. A similar pattern was found for FoMO in several countries. In these contexts, people who scored high in psychopathy or FoMO and low in cognitive ability were the most active in online politics.</p>
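<p>The moderation pattern described above can be sketched with a short example (simulated data; the variable names and effect sizes are our own inventions, not the study's): a regression with a psychopathy × cognitive-ability interaction term, where a negative interaction means the psychopathy-participation link is stronger at low cognitive ability.</p>

```python
# Hypothetical sketch of a moderation analysis on simulated data:
# does cognitive ability dampen the psychopathy-participation link?
import numpy as np

rng = np.random.default_rng(1)
n = 8_000
psychopathy = rng.normal(0, 1, n)
cognition = rng.normal(0, 1, n)

# Simulated pattern: psychopathy predicts participation, more strongly
# at low cognitive ability (a negative interaction).
participation = (0.30 * psychopathy - 0.20 * cognition
                 - 0.15 * psychopathy * cognition + rng.normal(0, 1, n))

# OLS with intercept, both main effects, and their product term.
X = np.column_stack([np.ones(n), psychopathy, cognition,
                     psychopathy * cognition])
beta, *_ = np.linalg.lstsq(X, participation, rcond=None)

# Simple slopes: the effect of psychopathy one SD below vs. one SD
# above the mean of cognitive ability.
slope_low = beta[1] + beta[3] * (-1)
slope_high = beta[1] + beta[3] * (+1)
print(f"interaction={beta[3]:+.3f}  "
      f"slope(low cog)={slope_low:.2f}  slope(high cog)={slope_high:.2f}")
```

<p>The steeper simple slope at low cognitive ability corresponds to the paper's finding that people high in psychopathy and low in cognitive ability were the most politically active online.</p>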
<p>The one notable exception was China. In that country, psychopathy was more strongly associated with political engagement among individuals with higher cognitive ability. This indicates that the dynamics between personality, cognition, and political behavior can differ across national or cultural settings.</p>
<p>“The most important takeaway from this study is that psychopathy and fear of missing out (FoMO) are strong and consistent predictors of online political participation across countries,” Ahmed told PsyPost. “Moreover, the combination of high psychopathy and low cognitive ability appears to drive the highest levels of engagement, highlighting that those most active in online political spaces are often motivated less by civic-mindedness and more by psychological and cognitive factors. This underscores the importance of considering personality-cognition interactions when understanding who participates in digital political discourse and why.”</p>
<p>The results suggest that emotional and impulsive traits may drive some people to become more involved in online politics, particularly if they lack strong critical thinking skills. People high in psychopathy may be attracted to the combative, attention-seeking nature of digital political discourse. Those high in FoMO may engage politically online to avoid feeling left out, even if they lack a deep interest in political issues.</p>
<p>Meanwhile, individuals with higher cognitive ability may be more cautious or selective in their online political engagement. They may be better at evaluating the quality of information, recognizing misinformation, or assessing the risks of political expression in online spaces. This could explain why cognitive ability weakens the effect of dark personality traits on political participation in most countries.</p>
<p>In collectivist societies like China, Singapore, Malaysia, and Vietnam, narcissism was not significantly linked to political participation. This may reflect cultural norms that discourage overt self-promotion or public displays of individual importance. In contrast, more individualistic societies like the United States and the Philippines may provide a more fertile ground for narcissists to seek attention through political activity.</p>
<p>The findings raise questions about the nature and quality of online political participation. If people who are more impulsive, emotionally driven, or self-focused are also more politically active online, what does this mean for democratic discourse in digital spaces? Previous research suggests that individuals high in psychopathy and narcissism are more likely to spread misinformation, engage in online harassment, and promote extreme views. This study adds to the concern that these traits may disproportionately influence online political conversations.</p>
<p>The strong role of FoMO also suggests that some political participation may be driven less by civic interest and more by anxiety about being left out. While this could broaden participation and attract younger people into political discourse, it also raises the risk that such engagement is superficial, reactive, or vulnerable to manipulation.</p>
<p>The consistent negative association between cognitive ability and online political activity may seem counterintuitive, especially given that political engagement is often considered a sign of informed citizenship. But in digital spaces, where barriers to participation are low and emotionally charged content spreads quickly, participation may not always reflect informed deliberation. Instead, it may be driven by impulsivity, sensation-seeking, or a desire for visibility.</p>
<p>“The findings align with existing knowledge on the relationship between psychopathy and narcissism in offline political participation,” Ahmed said. “However, the consistency of fear of missing out as a predictor across contexts, as well as the moderating effect of cognitive ability across contexts, were surprising outcomes.”</p>
<p>This study offers a comprehensive look at how personality and cognitive traits shape online political behavior, but it is not without limitations. The measures of psychopathy and narcissism were based on self-report, which can be subject to bias. The Wordsum test, while widely used, captures only one dimension of cognitive ability. The study also did not distinguish between different types of online political participation—such as constructive engagement versus spreading misinformation.</p>
<p>“Readers should be mindful of the limitations inherent in the analytical approach and the measures employed,” Ahmed noted. “Although every effort was made to validate these measures, all survey-based tools, not only this study, carry certain constraints. Even so, the findings remain valuable in shedding light on how personality and cognitive factors shape patterns of political engagement online.”</p>
<p>Future research could examine how specific subtypes of narcissism and psychopathy relate to different forms of political engagement. Longitudinal or experimental designs could help clarify whether these traits predict changes in behavior over time or whether online participation feeds back into personality development. Cultural differences also warrant closer study, particularly regarding how social norms and media environments influence the psychological roots of political behavior.</p>
<p>“We are now exploring the effects of dark triads in other forms of political engagements,” Ahmed added. “The findings raise important questions about the nature and quality of online political participation, as well as what might be required to foster a more balanced and inclusive form of civic engagement.” </p>
<p>“If certain personality traits and cognitive factors disproportionately drive participation, there is a risk that online political spaces may overrepresent voices shaped by these traits, potentially influencing the tone, content, and polarization of discourse. This has implications not only for understanding who participates, but also for how democratic dialogue is shaped in digital environments.”</p>
<p>The study, “<a href="https://doi.org/10.1057/s41599-025-05195-y" target="_blank">Dark personalities in the digital arena: how psychopathy and narcissism shape online political participation</a>,” was authored by Saifuddin Ahmed and Muhammad Masood.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-common-painkiller-triggered-hallucinations-mistaken-for-schizophrenia/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A common painkiller triggered hallucinations mistaken for schizophrenia</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 19th 2025, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent case report published in the medical journal <em><a href="https://www.cureus.com/articles/290598-opioid-induced-hallucinations-a-case-report#!/" target="_blank" rel="noopener">Cureus</a></em> describes how a man was mistakenly diagnosed with schizophrenia after developing hallucinations. The episodes, both visual and auditory, began shortly after he increased his prescribed dose of Norco, a combination of hydrocodone and acetaminophen, to manage chronic back pain. According to the patient’s account, the hallucinations stopped entirely when he discontinued the medication, raising questions about how often opioid-induced side effects are misinterpreted as signs of a serious psychiatric disorder.</p>
<p>This case provides a window into how the effects of prescription opioids can sometimes mimic primary psychotic disorders. It also illustrates how clinical context—such as medication history, timing of symptom onset, and prior psychiatric status—can help distinguish between drug-induced symptoms and conditions like schizophrenia.</p>
<p>The patient, a 67-year-old African American man, had a wide range of serious medical conditions. His history included congestive heart failure, coronary artery disease, hypertension, peripheral artery disease, chronic hepatitis C (complicated by hepatic coma), gastroesophageal reflux disease, chronic midline back pain, and spinal stenosis at the L4-L5 level. He was also a chronic tobacco user and reported very occasional cannabis use—only one or two hits every couple of months. His daily medications included baclofen, pantoprazole, metoprolol, gabapentin, and trazodone.</p>
<p>He had no known psychiatric history and no family history of mental illness or dementia. That changed at age 63, when he was hospitalized for a seizure. Roughly 20 days after the seizure, he began experiencing visual hallucinations—he reported seeing people trying to attack him and animals that weren’t present. He also exhibited paranoia. These symptoms led to his diagnosis with schizophrenia, unspecified, and he was admitted to an inpatient psychiatric facility for stabilization.</p>
<p>He responded well to treatment with Seroquel (quetiapine), an antipsychotic. A dose of 100 mg at bedtime helped reduce his symptoms and also aided his sleep. After two years of stability, the dose was lowered to 50 mg per night. He remained psychiatrically stable for a time and had no ongoing hallucinations.</p>
<p>That changed when he presented again to the psychiatry clinic with a new wave of symptoms. He described feeling like someone was following him and hearing voices that were constantly in the background. These voices made derogatory comments and seemed intent on putting him down. While distressing, they never instructed him to harm himself or others. He also experienced visual hallucinations, including seeing small worms crawling across the roof of his house. He explicitly denied using marijuana or any recreational substances during this time.</p>
<p>He did, however, report that his chronic back pain had worsened. In response, he increased his intake of Norco—an opioid painkiller composed of hydrocodone and acetaminophen—to four tablets per day, which was still within the prescribed range. He noticed a clear correlation: the more Norco he took, the more vivid and intense the hallucinations became.</p>
<p>Recognizing the possibility of a medication-related issue, he decided on his own to stop taking Norco. After discontinuing the opioid, his hallucinations completely resolved. He has not experienced any further episodes of paranoia, auditory hallucinations, or visual hallucinations since stopping the medication.</p>
<p>This sequence of events—hallucinations emerging only after increasing his opioid dosage and disappearing after stopping the drug—suggests that the symptoms were not a recurrence of schizophrenia but rather an adverse effect of the opioid medication. Notably, the patient had been psychiatrically stable on a low dose of Seroquel, and there had been no other changes to his medication regimen or lifestyle that might have explained the sudden onset of hallucinations.</p>
<p>Yet the symptoms were serious enough to warrant concern, highlighting how difficult it can be to tease apart psychiatric illness from medication effects, especially in medically complex patients.</p>
<p>It is important to note that case reports describe the experience of a single individual. This format lacks the statistical power to establish causality or generalize findings across populations. Case reports do not control for confounding variables, nor can they definitively rule out alternative explanations.</p>
<p>Still, case reports serve an essential role in medicine by drawing attention to rare or underrecognized phenomena. They can prompt further research, guide clinical awareness, and help prevent misdiagnosis. In this case, the authors suggest that greater attention to a patient’s medication history—including the use of opioids—could help clinicians avoid mistakenly labeling drug-induced symptoms as chronic psychiatric conditions.</p>
<p>Although not commonly discussed, hallucinations <a href="https://doi.org/10.1213/ANE.0000000000001417" target="_blank" rel="noopener">are a documented side effect of various opioid medications</a>. Previous studies have reported that up to 6% of patients using fentanyl for postoperative pain experienced hallucinations. These may be underreported due to stigma, fear of being perceived as mentally ill, or because the hallucinations are mild and transient.</p>
<p>Opioids, especially when taken at higher doses or over long periods, can affect the brain’s dopamine system, which plays a role in perception and reward. Dopamine dysregulation has been implicated in both opioid-induced hallucinations and in conditions such as schizophrenia. This shared mechanism may partly explain the overlap in symptoms, even though the underlying causes differ.</p>
<p>Research also indicates that different opioids may have different risks. Morphine has been the most commonly reported opioid linked to hallucinations, but others—including hydromorphone, tramadol, methadone, and buprenorphine—have also been implicated. The neurotoxic effects may arise from the opioid itself or from its metabolites, which can vary in how they interact with brain receptors.</p>
<p>In some cases, changing the type of opioid—a process known as opioid rotation—can eliminate hallucinations while still managing pain. Other strategies include dose reduction or the addition of non-opioid pain treatments. When hallucinations do occur, they can sometimes be managed with antipsychotic medications, though this may not be ideal if the underlying problem is a reversible drug effect.</p>
<p>The authors of the case study point out that schizophrenia is typically a disorder of early adulthood, with most diagnoses occurring before the age of 40. A sudden onset of hallucinations and paranoia in a man in his mid-60s, especially without a family history of mental illness, should raise suspicion about alternative causes.</p>
<p>The patient’s first experience of hallucinations followed a seizure, which may have triggered postictal psychosis—a known but temporary condition. His more recent hallucinations appeared only after increasing his Norco dose and resolved completely after stopping it. These details suggest the need for caution when diagnosing a lifelong psychiatric disorder based on symptoms that may have other explanations.</p>
<p>This case also touches on the broader issue of polypharmacy in older adults, who often take multiple medications for various chronic conditions. Interactions between drugs, along with age-related changes in metabolism and brain function, can increase the risk of adverse effects, including neuropsychiatric symptoms.</p>
<p>The report, “<a href="https://www.cureus.com/articles/290598-opioid-induced-hallucinations-a-case-report#!/" target="_blank" rel="noopener">Opioid-Induced Hallucinations: A Case Report</a>,” was authored by Arvind Dhanabalan, Sall Saveen, Christina Singh, Ramona Ramasamy, and Keerthiga Raveendran.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/stronger-amygdala-control-network-connectivity-predicts-impulsive-choices-in-older-adolescents/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Stronger amygdala-control network connectivity predicts impulsive choices in older adolescents</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 19th 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1002/hbm.70270" target="_blank" rel="noopener">Human Brain Mapping</a></em> sheds light on the brain mechanisms that influence how adolescents weigh immediate versus delayed rewards. The research focused on dynamic brain connectivity and found that stronger functional links between the left amygdala and the brain’s cognitive control network predicted a greater preference for immediate rewards—but only in older adolescents faced with large monetary decisions.</p>
<p>This pattern did not appear in younger adolescents or young adults. The findings provide evidence that unique patterns of brain function during late adolescence may drive heightened reward-seeking tendencies, especially when the stakes are high.</p>
<p>Delay discounting refers to how much a person devalues a reward depending on how long they have to wait for it. For example, someone might prefer receiving $100 today over $150 in six months. This behavior is often used as a measure of impulsivity or self-control. The lower a person’s ability to wait, the steeper their delay discounting curve. Researchers use this measure to examine decision-making, especially in contexts like addiction, risk-taking, and adolescence, a period known for rapid changes in the brain’s reward and control systems.</p>
<p>Adolescents are known to be more sensitive to rewards compared to children and adults. Brain imaging studies have shown that regions like the amygdala and ventral striatum become more responsive to potential rewards during adolescence. At the same time, the regions responsible for executive control—mainly within the frontoparietal network—are still maturing.</p>
<p>This imbalance, where reward systems are fully online but self-regulatory systems are still developing, has been proposed as one explanation for the increase in risky or impulsive behavior during this period of life. The current study set out to examine how the communication between these two systems—the emotional salience system (amygdala) and cognitive control network—might help predict individual differences in delay discounting behavior across development.</p>
<p>“Adolescence is a time of major changes at both behavioral and brain levels,” said study author <a href="https://www.boystownresearch.org/researchers-and-staff/gaelle-doucet" target="_blank" rel="noopener">Gaelle Doucet</a>, director of the Brain Architecture, Imaging and Cognition Lab at Boys Town National Research Hospital.</p>
<p>“Increased risk taking and reward sensitivity are among these changes; however, their neural origins remain unclear. In this context, we wanted to investigate the neural activity of specific brain regions involved in emotional regulation and executive function, that typically play a role in these cognitive functions and how their involvement may change throughout adolescence and early adulthood. Such findings could help us understand better why some teenagers take more risk than others.”</p>
<p>The research team used data from 448 participants aged 10 to 21 who were part of the Human Connectome Project – Development. Participants were divided into three age groups: younger adolescents (10–13 years), older adolescents (14–17 years), and young adults (18–21 years).</p>
<p>Each participant completed a delay discounting task, in which they chose between a smaller amount of money available immediately and a larger amount available after a delay. This was done across two reward sizes—$200 and $40,000—with six different time delays ranging from one month to ten years.</p>
<p>The choices allowed researchers to calculate each individual’s “area under the curve” (AUC), a common metric in delay discounting research. A lower AUC indicates steeper discounting, meaning a stronger preference for immediate rewards.</p>
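The AUC metric described above is simple to compute: treat a participant's indifference points (the fraction of the delayed reward they would accept immediately) as a curve over normalized delay, then integrate with the trapezoidal rule. A minimal sketch in Python follows; the delay values mirror the task's one-month-to-ten-year range, but the indifference values are hypothetical illustrations, not data from the study.

```python
import numpy as np

# Hypothetical indifference points: the fraction of the delayed reward
# at which a participant is indifferent between the immediate and the
# delayed option, one value per delay. (Illustrative numbers only.)
delays_months = np.array([1.0, 6.0, 12.0, 36.0, 60.0, 120.0])  # 1 month to 10 years
indifference = np.array([0.95, 0.80, 0.65, 0.45, 0.35, 0.25])

# Normalize delays to [0, 1] and prepend the zero-delay point, where the
# reward keeps its full value (1.0).
x = np.concatenate(([0.0], delays_months / delays_months.max()))
y = np.concatenate(([1.0], indifference))

# Trapezoidal-rule area under the discounting curve. AUC near 1 means
# patient choices; a lower AUC means steeper discounting, i.e. a
# stronger preference for immediate rewards.
auc = float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))
```

With these illustrative values the AUC works out to about 0.42, i.e. moderately steep discounting.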
<p>To examine brain function, resting-state fMRI data were collected and analyzed using a sliding window approach. This method breaks up the brain scan into short time segments and calculates functional connectivity for each one. The researchers focused on connectivity between the amygdala and 52 regions of the cognitive control network.</p>
<p>Importantly, the researchers used dynamic functional connectivity (dFC), a method that captures how connections between brain regions fluctuate over time, rather than assuming they are stable throughout a scan. This allowed them to ask not just whether two regions are connected, but how that connection varies—and whether that variability matters for behavior.</p>
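The sliding-window idea can be sketched in a few lines of Python. The snippet below is an illustration under assumed parameters (200 time points, a 40-sample window, a 5-sample step, and synthetic signals standing in for the amygdala and one control-network region), not the study's actual pipeline: it computes a Pearson correlation inside each window, yielding a connectivity time course whose mean and variability can then be related to behavior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic resting-state time series for two regions (stand-ins for the
# left amygdala and one cognitive-control-network region). The control
# signal partly tracks the amygdala signal, so windows should show
# mostly positive correlations.
n_timepoints, window, step = 200, 40, 5
amygdala = rng.standard_normal(n_timepoints)
control_region = 0.4 * amygdala + rng.standard_normal(n_timepoints)

# Sliding-window connectivity: Pearson correlation within each window.
dfc = np.array([
    np.corrcoef(amygdala[s:s + window], control_region[s:s + window])[0, 1]
    for s in range(0, n_timepoints - window + 1, step)
])

# Summaries commonly used in dFC work: the average connection strength
# and how much it fluctuates across windows.
mean_strength, variability = float(dfc.mean()), float(dfc.std())
```

The key contrast with static functional connectivity is that `dfc` is a whole time course of correlations rather than a single number, so its variability across windows becomes an additional measure of how the amygdala-control coupling changes over the scan.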
<p>Overall, participants in all age groups showed a stronger preference for immediate rewards in the $200 condition compared to the $40,000 condition. But age made a difference. Younger adolescents were significantly more likely than older adolescents and young adults to favor immediate rewards in the high-value condition. This supports previous work suggesting that sensitivity to reward peaks early in adolescence and gradually declines.</p>
<p>However, when it came to predicting individual differences in delay discounting behavior from brain connectivity, a more specific pattern emerged. Only among older adolescents (aged 14–17) did dynamic functional connectivity between the left amygdala and the cognitive control network significantly predict behavior on the task. Specifically, those with stronger connectivity were more likely to choose the immediate reward in the $40,000 condition.</p>
<p>No such association was found for the right amygdala. Nor did the connectivity metrics predict behavior in younger adolescents or young adults. This suggests that during a particular developmental window—late adolescence—the dynamic interaction between emotion-processing and cognitive control systems may play a larger role in shaping reward-based decision-making, especially for large potential gains.</p>
<p>“Our findings confirmed a difference in sensitivity to large monetary reward between early and late adolescence, with younger adolescents preferring smaller but immediate reward compared to older adolescents or adults,” Doucet told PsyPost. “We further revealed that older adolescents aged 14 to 17 years old showed a unique relationship between reward sensitivity and brain activity in specific regions related to emotion regulation and executive function. This relationship was not present in younger adolescents or young adults. This suggests that reward preference in older adolescents may be linked to a particular configuration of these brain regions which seem to go away as adolescents grow up.”</p>
<p>The finding that only the left amygdala was involved aligns with prior research indicating that it may play a stronger role than the right amygdala in tasks that involve executive function, decision-making, and reading emotional information. The left amygdala also tends to mature earlier and may be more sensitive to cognitive influences than its right-hemisphere counterpart.</p>
<p>Additionally, the most predictive connections in the study involved the medial and lateral prefrontal cortex, which are central components of the brain’s cognitive control system and have been repeatedly implicated in goal-directed behavior, self-control, and subjective valuation. A smaller number of connections also involved the inferior parietal lobule, which has been linked to numerical reasoning and mental computation—skills that may come into play during decisions involving tradeoffs over time.</p>
<p>One possible explanation for the age-specific pattern lies in the development of dopamine systems. During adolescence, dopamine receptor availability and connectivity within the prefrontal cortex change rapidly. Some researchers have proposed that these changes temporarily lead to a mismatch between elevated dopamine levels and still-developing executive control circuits, making this a particularly sensitive period for reward-seeking and impulsivity.</p>
<p>Older adolescents may be at a “tipping point,” where their prefrontal regions are becoming more connected and dynamic, but not yet fully mature. This could create a situation where brain systems involved in evaluating rewards and exerting control are especially active—but not always in ways that support long-term decision-making.</p>
<p>“It is important to keep in mind that even typical adolescence is associated with higher risk taking which is thought to be a normal part of development,” Doucet said.</p>
<p>As with all research, there are some limitations to consider. The study focused solely on the amygdala and did not include other important reward-related regions like the nucleus accumbens. Although the amygdala plays a key role in emotional salience and reward evaluation, a broader network of subcortical structures is also likely involved in delay discounting behavior.</p>
<p>Future studies could examine other types of rewards as well as real-world risk-taking behaviors. “While we used a test that estimated sensitivity to monetary reward, it will be important to test responses to other types of reward (social acceptance, substance use),” Doucet explained. “One of the long-term goals is to understand the mechanisms behind risk taking and higher reward sensitivity in adolescents that may lead to more dangerous behaviors. This knowledge could lead to a better insight on why some young people will turn to alcohol and drug use.”</p>
<p>The study, “<a href="https://doi.org/10.1002/hbm.70270" target="_blank" rel="noopener">Dynamic Functional Connectivity Between Amygdala and Cognitive Control Network Predicts Delay Discounting in Older Adolescents</a>,” was authored by Attakias T. Mertens, Callum Goldsmith, Derek J. Pavelka, Jacob J. Oleson, and Gaelle E. Doucet.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/ancient-laws-and-modern-minds-agree-on-what-body-parts-matter-most/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Ancient laws and modern minds agree on what body parts matter most</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 19th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><div class="theconversation-article-body">
<p>The Bible’s lex talionis – “<a href="https://www.bible.com/bible/1/EXO.21.24-27">Eye for eye, tooth for tooth</a>, hand for hand, foot for foot” (Exodus 21:24-27) – has captured the human imagination for millennia. This idea of fairness has been a model for ensuring justice when bodily harm is inflicted.</p>
<p>Thanks to the work of <a href="https://utppublishing.com/doi/book/10.3138/9781442614833">linguists</a>, <a href="https://doi.org/10.1163/9789004466128_002">historians</a>, <a href="https://doi.org/10.1093/oxfordhb/9780197572528.013.25">archaeologists</a> and <a href="https://dx.doi.org/10.14288/1.0105993">anthropologists</a>, researchers know a lot about how different body parts are appraised in societies both small and large, from ancient times to the present day.</p>
<p>But where did such laws originate?</p>
<p>According to one school of thought, <a href="https://www.jstor.org/stable/41203513">laws are cultural constructions</a> – meaning they vary across cultures and historical periods, adapting to local customs and social practices. By this logic, laws about bodily damage would differ substantially between cultures.</p>
<p><a href="https://www.science.org/doi/10.1126/sciadv.ads3688">Our new study</a> explored a different possibility – that laws about bodily damage are rooted in something universal about <a href="https://search.worldcat.org/title/22860694">human nature</a>: shared intuitions about the value of body parts.</p>
<p>Do people across cultures and throughout history agree on which body parts are more or less valuable? Until now, no one had systematically tested whether body parts are valued similarly across space, time and levels of legal expertise – that is, among laypeople versus lawmakers.</p>
<p>We are <a href="https://www.sznycerlab.org/sznycer-lab/research">psychologists who study evaluative processes</a> and <a href="https://www.kremslab.com/">social interactions</a>. In previous research, we have identified regularities in how people evaluate different <a href="https://doi.org/10.1038/s41562-020-0827-8">wrongful actions</a>, <a href="https://doi.org/10.1073/pnas.1805016115">personal characteristics</a>, <a href="https://doi.org/10.1016/j.evolhumbehav.2020.02.003">friends</a> and <a href="https://doi.org/10.1016/j.evolhumbehav.2022.06.002">foods</a>. The body is perhaps a person’s most valuable asset, and in this study we analyzed how people value its different parts. We investigated links between intuitions about the value of body parts and laws about bodily damage.</p>
<h2>How critical is a body part or its function?</h2>
<p>We began with a simple observation: Different body parts and functions have different effects on the odds that a person will survive and thrive. Life without a toe is a nuisance. But life without a head is impossible. Might people intuitively understand that different body parts have different values?</p>
<p>Knowing the value of body parts gives you an edge. For example, if you or a loved one has suffered multiple injuries, you could treat the most valuable body part first, or allocate a greater share of limited resources to its treatment.</p>
<p>This knowledge could also play a role in negotiations when one person has injured another. When person A injures person B, B or B’s family can claim compensation from A or A’s family. This practice appears around the world: among the <a href="https://cart.sbl-site.org/books/061506P">Mesopotamians</a>, the <a href="https://press.princeton.edu/books/paperback/9780691607801/the-tang-code-volume-ii">Chinese during the Tang dynasty</a>, the <a href="https://doi.org/10.1073/pnas.2014759117">Enga of Papua New Guinea</a>, the <a href="https://global.oup.com/academic/product/the-nuer-9780195003222">Nuer of Sudan</a>, the <a href="https://search.worldcat.org/title/14240365">Montenegrins</a> and many others. The Anglo-Saxon word “<a href="https://en.wikipedia.org/wiki/Weregild">wergild</a>,” meaning “man price,” now designates in general the practice of paying for body parts.</p>
<p>But how much compensation is fair? Claiming too little leads to loss, while claiming too much risks retaliation. To walk the fine line between the two, victims would claim compensation in Goldilocks fashion: just right, based on the consensus value that victims, offenders and third parties in the community attach to the body part in question.</p>
<p>This Goldilocks principle is readily apparent in the exact proportionality of the lex talionis – “eye for eye, tooth for tooth.” Other legal codes dictate precise values of different body parts but do so in money or other goods. For example, the <a href="https://en.wikipedia.org/wiki/Code_of_Ur-Nammu">Code of Ur-Nammu</a>, written 4,100 years ago in ancient Nippur, present-day Iraq, states that a man must pay <a href="https://cart.sbl-site.org/books/061506P">40 shekels of silver</a> if he cuts off another man’s nose, but only 2 shekels if he knocks out another man’s tooth.</p>
<h2>Testing the idea across cultures and time</h2>
<p>If people have intuitive knowledge of the values of different body parts, might this knowledge underpin laws about bodily damage across cultures and historical eras?</p>
<p>To test this hypothesis, we conducted a study involving 614 people from the United States and India. The participants read descriptions of various body parts, such as “one arm,” “one foot,” “the nose,” “one eye” and “one molar tooth.” We chose these body parts because they were featured in legal codes from five different cultures and historical periods that we studied: the <a href="https://utppublishing.com/doi/book/10.3138/9781442614833">Law of Æthelberht</a> from Kent, England, in 600 C.E., the <a href="https://vsnr.org/editions/guta-saga-the-history-of-the-gotlanders/">Guta lag</a> from Gotland, Sweden, in 1220 C.E., and modern workers’ compensation laws from the <a href="https://iga.in.gov/laws/2021/ic/titles/22#22-3">United States</a>, <a href="https://www.law.go.kr/lsBylInfoPLinkR.do?bylCls=BE&lsNm=%EC%82%B0%EC%97%85%EC%9E%AC%ED%95%B4%EB%B3%B4%EC%83%81%EB%B3%B4%ED%97%98%EB%B2%95+%EC%8B%9C%ED%96%89%EB%A0%B9&bylNo=0006&bylBrNo=00">South Korea</a> and the <a href="https://natlex.ilo.org/dyn/natlex2/r/natlex/fe/details?p3_isn=11956">United Arab Emirates</a>.</p>
<p>Participants answered one question about each body part they were shown. We asked some how difficult it would be for them to function in daily life if they lost various body parts in an accident. Others we asked to imagine themselves as lawmakers and determine how much compensation an employee should receive if that person lost various body parts in a workplace accident. Still others we asked to estimate how angry another person would feel if the participant damaged various parts of the other’s body. While these questions differ, they all rely on assessing the value of different body parts.</p>
<p>To determine whether untutored intuitions underpin laws, we didn’t include people who had college training in medicine or law.</p>
<p>Then we analyzed whether the participants’ intuitions matched the compensations established by law.</p>
<p><a href="https://doi.org/10.1126/sciadv.ads3688">Our findings</a> were striking. The values placed on body parts by both laypeople and lawmakers were largely consistent. The more highly American laypeople tended to value a given body part, the more valuable this body part seemed also to Indian laypeople, to American, Korean and Emirati lawmakers, to King Æthelberht and to the authors of the Guta lag. For example, laypeople and lawmakers across cultures and over centuries generally agree that the index finger is more valuable than the ring finger, and that one eye is more valuable than one ear.</p>
<p>But do people value body parts accurately, in a way that corresponds with their actual functionality? There are some hints that, yes, they do. For example, laypeople and lawmakers regard the loss of a single part as less severe than the loss of multiples of that part. In addition, laypeople and lawmakers regard the loss of a part as less severe than the loss of the whole; the loss of a thumb is less severe than the loss of a hand, and the loss of a hand is less severe than the loss of an arm.</p>
<p>Additional evidence of accuracy can be gleaned from ancient laws. For example, linguist Lisi Oliver notes that in Barbarian Europe, “wounds that may cause permanent incapacitation or disability <a href="https://utppublishing.com/doi/book/10.3138/9781487547707">are fined higher</a> than those which may eventually heal.”</p>
<p>Although people generally agree in valuing some body parts more than others, some <a href="https://search.worldcat.org/title/1593043">sensible differences may arise</a>. For instance, sight would be more important for someone making a living as a hunter than as a shaman. The local environment and culture might also play a role. For example, upper body strength could be particularly important in violent areas, where one needs to defend oneself against attacks. These differences remain to be investigated.</p>
<h2>Morality and law, across time and space</h2>
<p>Much of what counts as moral or immoral, legal or illegal, varies from place to place. Drinking alcohol, eating meat and cousin marriage, for example, have been variously condemned or favored in different times and places.</p>
<p>But <a href="https://doi.org/10.1038/s41562-020-0827-8">recent research</a> has also shown that, in some domains, there is much <a href="https://theconversation.com/intuitions-about-justice-are-a-consistent-part-of-human-nature-across-cultures-and-millennia-190523">more moral and legal consensus</a> about what is wrong, across cultures and even throughout the millennia. Wrongdoing – arson, theft, fraud, trespassing and disorderly conduct – appears to engender a morality and related laws that are similar across times and places. Laws about bodily damage also seem to fit into this category of moral or legal universals.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img decoding="async" src="https://counter.theconversation.com/content/246760/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1"><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/an-eye-for-an-eye-people-agree-about-the-values-of-body-parts-across-cultures-and-eras-246760">original article</a>.</em></p>
</div></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/children-fall-for-a-surprisingly-simple-numerical-illusion-and-it-grows-stronger-with-age/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Children fall for a surprisingly simple numerical illusion — and it grows stronger with age</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 19th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1111/desc.70032" target="_blank">Developmental Science</a></em> suggests that both children and adults are susceptible to a visual illusion that makes connected dots appear less numerous than unconnected ones. The illusion, known as the “connectedness illusion,” appears as early as age 5 and tends to grow stronger into adulthood. Interestingly, the individuals who show the most accurate visual number sense are also the most vulnerable to the illusion.</p>
<p>The findings provide insight into how our brains approximate number and offer evidence that the human visual system tends to operate on bounded objects when estimating quantity. This feature may reflect how the system is optimized for interpreting the environment but also introduces distortions under specific conditions.</p>
<p>“We often think of mathematics as the height of the human intellect – something that’s only done by smart, educated adults. However, lots of evidence has emerged, suggesting that even newborn infants have a basic sense of number,” explained study author <a href="https://sampclarke.net/" target="_blank">Sam Clarke</a>, an assistant professor of philosophy at the University of Southern California.</p>
<p>“For instance, young infants will reliably discriminate two collections of dots when they differ in number by a suitably large ratio, and notice the surprising coincidence when two collections happen to match in number. This suggests that our basic grasp of number might be innately hardwired, rather than learnt!”</p>
<p>“Of course, this isn’t uncontroversial,” Clarke explained. “One concern has been that when children or infants identify that one collection of dots is larger than another, they might not be tracking or responding to their number but rather some confound in the displays.” </p>
<p>“For instance, the total surface area of the dots on the screen. In fact, it’s really hard to rule these kinds of confounds out, because when you control for one – e.g. by matching the total surface area of the dots in both collections while continuing to vary their number – this requires that other differences in the displays are exacerbated – e.g. that the spatial density of the displays will now be more different. So, if kids continue to discriminate the displays, it’s possible that this still has nothing to do with number – they are now simply responding to the newly exacerbated differences.”</p>
<p>“This raises the question: How could we know that it’s number that children are tracking and representing in these kinds of experiment?” Clarke continued. “There are many ways of approaching this question. Our approach takes inspiration from the philosopher, Gottlob Frege. More than a hundred years ago, Frege noted that numbers are special kinds of quantity in that they have a ‘second-order character’; i.e. that in order to identify the number of items in a collection, you need to first decide how they are being carved up and individuated.” </p>
<p>“For instance, a single pile of cards might be thought of as one item if it’s the number of decks that we’re interested in counting, four items if it’s the number of suits that we’re interested in, or 52 items if it’s number of individual cards. What Frege noticed is that no other quantities seem to be like this. If you want to know how much the pile of cards weighs, or its volume, it doesn’t matter whether you think of it as a single deck, four suits, or 52 cards – when you measure the pile or chuck it on the scales the answer will be the same, regardless.</p>
<p>“In <a href="https://www.cambridge.org/core/journals/behavioral-and-brain-sciences/article/abs/number-sense-represents-rational-numbers/42F60E7CE8B5DF3AE507FA6793C6C985" target="_blank">an article from 2021</a>, co-authored with Jacob Beck, we noted that when people discriminate collections or estimate their quantity in the ways that young children and infants appear to, they seem to be specifically in the business of tracking and representing quantities with this second-order character that Frege noted is unique to number.” </p>
<p>“We gave a few reasons for this, but some of the clearest evidence for this came from a visual illusion, known as ‘the connectedness illusion,’ which was discovered concurrently by Lixia He and colleagues and Steven Franconerri and colleagues back in 2009. In the connectedness illusion, collections of dots that are connected with thin lines – effectively turning pairs of dots into single dumbbell shaped items – have their number or quantity systematically underestimated, even when participants explicitly try to ignore the connections and focus only on the dots. Indeed, you can see this for yourself.”</p>
<figure aria-describedby="caption-attachment-228542" class="wp-caption aligncenter"><a href="https://www.psypost.org/wp-content/uploads/2025/08/image.png"><img fetchpriority="high" decoding="async" src="https://www.psypost.org/wp-content/uploads/2025/08/image.png" alt="" width="936" height="520" class="size-full wp-image-228542" srcset="https://www.psypost.org/wp-content/uploads/2025/08/image.png 936w, https://www.psypost.org/wp-content/uploads/2025/08/image-300x167.png 300w, https://www.psypost.org/wp-content/uploads/2025/08/image-768x427.png 768w, https://www.psypost.org/wp-content/uploads/2025/08/image-750x417.png 750w" sizes="(max-width: 936px) 100vw, 936px"></a><figcaption class="wp-caption-text">Ignore the lines! Which panel contains more dots? The correct answer is the right-hand collection. But that’s not how things look. This is known as the Connectedness Illusion (He et al. 2009; Franconeri et al. 2009). Because introducing small breaks in the lines significantly reduces the effect, and because a display with small breaks barely differs from the display with fully connected lines with respect to non-numerical confounds such as total surface area and total brightness, this illusion provides strong evidence that the approximate number system (ANS) tracks some kind of numerical quantity.</figcaption></figure>
<p>“To cut a long story short, this occurs because your visual system seems to take a stand on how items in the displays are to be carved up and individuated,” Clarke said. “In other words, your visual system seems to automatically try to keep track of the quantity of bounded objects in the display. Thus, by adding connecting lines (and thus more surface area to the collection) we reduce the apparent quantity of items in the display, because these connecting lines turn pairs of dots into single bounded objects.” </p>
<p>“This much isn’t inevitable – it could have been the case that your visual system would individuate the items differently, or not at all. Regardless, the fact that your visual system does take a stand on how the items should be carved up and individuated, coupled with the fact that this does seem to affect the perceived quantity of the collection in precisely the ways we would expect if it were tracking and representing a quantity with Frege’s second order character, seems to demonstrate that it really is some kind of number or numerical quantity that is being tracked and represented in these studies. After all, we know from Frege that non-numerical quantities just don’t work like this!</p>
<p>“With all of this in view, our study asked whether we might find evidence of the connectedness illusion, even in young children who have not learnt or been taught to count. Before our study, this is something that had not been tested.”</p>
<p>The researchers tested 43 children (aged 5 to 12) and 57 adults. Participants were recruited through a university child development lab and a university community, respectively. Children received gift cards for participating, while adults received course credit. All participants had normal or corrected-to-normal vision.</p>
<p>Participants completed an online task using visual arrays of blue dots. In some arrays, dots were connected into dumbbell-like pairs with straight lines. In others, the dots were either unconnected or arranged with lines that didn’t form connections. Across 450 trials, participants viewed two dot arrays side-by-side and were asked to judge which side had more dots, while being told to ignore the lines.</p>
<p>Before beginning, participants received simple, child-friendly instructions and practice trials. The researchers reminded them to ignore the connecting lines and to focus only on the number of blue dots. Each trial presented the arrays for a brief moment, followed by a response screen where participants selected the side they thought had more dots.</p>
<p>Clarke and his colleagues found that children as young as 5 years old showed signs of the connectedness illusion. They were more likely to judge an array of connected dots as having fewer items compared to an unconnected array, even when the total number of dots was the same. This suggests that the visual system at that age is already organizing visual input into discrete units—treating connected pairs as single entities.</p>
<p>The strength of the illusion increased with age. On average, children perceived unconnected arrays as about 3.4% more numerous than connected ones. In adults, this difference rose to about 7%. Statistical modeling confirmed that the illusion was significantly stronger in adults, and the trend held across the full sample—older participants tended to show stronger illusion effects.</p>
<p>“When we first designed the study, I hypothesized that the strength of the illusion might be stronger during early development,” Clarke told PsyPost. “In other words, I expected that young children might be more susceptible to the illusion.” </p>
<p>“I’d hypothesized as much, because I’d been struck by the fact that while the connectedness illusion is really robust in adults, and can be readily appreciated just by looking at a figure, connecting pairs of dots doesn’t come anywhere close to halving their perceived number, which is what we might expect if the visual system was simply carving the display up into its bounded items and then enumerating these. I wondered if this was because adult participants were being asked to ignore the lines entirely, and were thereby able to suppress the effect to some extent by trying to attend to the dots independently of the lines.</p>
<p>“Alas, my prediction was not borne out – we found precisely the opposite of what I’d suspected. The illusion gets stronger with age! Indeed, we found that – across all age groups – people with better numerical acuity were more susceptible to the illusion. This suggests that, rather than thinking of the connectedness illusion as an unfortunate quirk of our visual makeup, it actually reflects the system’s optimal functioning.”</p>
<p>The researchers also found that individuals with sharper number discrimination skills—those better at identifying which array had more dots—also tended to be more susceptible to the connectedness illusion. This association remained even after controlling for age. This pattern suggests that susceptibility to the illusion is not a sign of faulty processing, but instead reflects an aspect of optimal functioning in the visual number system.</p>
<p>The findings offer support for theories that the visual system processes number by first organizing visual input into discrete, bounded objects. The illusion occurs because connecting two dots makes the brain treat them as one object, reducing the perceived count. This supports a “direct” model of number perception, in which number is extracted from clearly defined objects, rather than being estimated from continuous properties like area or density.</p>
<p>The illusion also challenges alternative models that suggest number is inferred indirectly from visual features such as total surface area or spatial density. Arrays with connected dots have the same total area and density as arrays without connections, yet they are perceived differently. This discrepancy indicates that number perception may not simply rely on these features but instead depends on how the visual system segments the scene into countable units.</p>
<p>“Our main takeaway was that a connectedness illusion is found in all of the age groups we tested, right down to children as young as five years old, but that the strength of the illusion varied across development,” Clarke said. “Perhaps surprisingly, younger children were less susceptible to the illusion than adults.”</p>
<p>But the study, like all research, includes some caveats to consider. The experiment was conducted online, which limited control over participants’ screen sizes and viewing conditions. Although steps were taken to ensure stimulus consistency across devices, variations in display could have introduced noise into the data. </p>
<p>“Our experimental paradigm didn’t allow us to test really young babies – thus, the youngest children tested in our study were five years old,” Clarke noted. “This is because it had to be framed as a fun game, in which it could be explained to the participating children that they needed to try and ignore the connecting lines. So, while our results suggest that the connectedness illusion is a ubiquitous feature of our visual number sense, our results don’t enable us to confirm this.”</p>
<p>Future studies may investigate whether attention, working memory, or other cognitive factors contribute to individual differences in illusion strength. Researchers might also explore whether training can alter susceptibility to the illusion, or whether similar patterns appear in infants or in populations with neurodevelopmental differences.</p>
<p>“My broader interest is in understanding how children learn,” Clarke explained. “That might sound surprising given the emphasis on innate (unlearnt) numerical abilities in this study. But children need to have some innate mechanisms and abilities if, unlike rocks and many other things, they are to be capable of learning anything at all.” </p>
<p>“The hypothesis that I’m exploring is that infants are born with surprisingly rich innate numerical abilities, and that these enable them to learn in surprisingly sophisticated and targeted ways – something which researchers in AI might do well to take heed of. For instance, I’ve argued that they enable children to focus their attention on statistically surprising events and to keep track of how often grammatical rules are and aren’t violated when learning a language.”</p>
<p>The study, “<a href="https://doi.org/10.1111/desc.70032" target="_blank">Children’s Number Judgments Are Influenced by Connectedness</a>,” was authored by Sam Clarke, Chuyan Qu, Francesca Luzzi, and Elizabeth Brannon.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/healthy-diet-is-associated-with-better-cognitive-functioning-in-the-elderly/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Healthy diet is associated with better cognitive functioning in the elderly</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Aug 19th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A meta-analysis of studies exploring the links between diet quality and cognitive functioning in individuals aged 60 years and older revealed that those adhering to a healthy dietary pattern have 40% lower odds of suffering from cognitive dysfunction. The paper was published in <a href="https://doi.org/10.1016/j.gerinurse.2025.03.048"><em>Geriatric Nursing</em></a>.</p>
<p>As people grow old, their cognition starts to change. During normal aging, cognitive processing speed gradually slows, working memory capacity decreases, and multitasking becomes more difficult. Vocabulary and general knowledge tend to remain stable or even improve with age, while skills that require rapid problem-solving may decline.</p>
<p>However, some people begin experiencing cognitive dysfunction as they age. Cognitive dysfunction in the elderly goes beyond normal age-related changes and includes significant impairments in memory, attention, language, or executive functions. Such dysfunction can result from neurodegenerative diseases like Alzheimer’s disease, vascular issues, or other medical conditions. Early signs often include frequent forgetfulness, confusion, and difficulty completing familiar tasks. These changes usually develop subtly and progress over years, depending on the underlying cause.</p>
<p>Study author Haoting Pei and his colleagues aimed to integrate findings from existing studies on the association between dietary patterns and cognitive function in older adults. Previous research has identified a variety of lifestyle factors linked to cognitive functioning in late life, including physical activity, cognitive stimulation, social engagement, and continuous learning. Maintaining a healthy diet has also been highlighted, but the strength of the association has remained unclear. The authors wanted to examine this link more systematically.</p>
<p>They searched the scientific databases MEDLINE, Scopus, PubMed, and Web of Science for studies reporting on dietary patterns and cognitive function among older adults. Eligible studies included participants aged 60 years or older, specified the dietary patterns being assessed, and used valid outcome measures of cognitive function.</p>
<p>The search resulted in 15 independent studies with a combined sample of more than 62,500 participants. Taken together, these studies indicated that older adults adhering to a healthy dietary pattern had 40% lower odds of experiencing cognitive dysfunction compared to their peers with less healthy diets. Although the results were highly heterogeneous across studies, the researchers found that no single study disproportionately influenced the overall findings.</p>
<p>“Therefore, in daily life, older adults should be encouraged to have a balanced intake of vegetables, fruits, fish, and legumes at each meal. However, whether some dietary components such as dairy products have beneficial effects on cognitive function in older adults is still controversial, and further in-depth studies on them are needed in the future,” the study authors concluded.</p>
<p>A healthy dietary pattern in this study refers to diets shown in previous research to support overall health, such as the Mediterranean diet and the MIND diet. These patterns emphasize fruits, vegetables, whole grains, legumes, nuts, olive oil, moderate fish and poultry, and limited consumption of red meat, sweets, pastries, and fried foods.</p>
<p>The study provides evidence that healthy diets are linked to better cognitive outcomes in older adults. Still, the design of the included studies does not allow for definitive causal conclusions. While it is likely that healthy dietary patterns help protect cognitive health, it is also possible that older adults with stronger cognitive abilities are better able to access healthy foods and maintain beneficial dietary habits.</p>
<p>The paper, “<a href="https://doi.org/10.1016/j.gerinurse.2025.03.048">Association of dietary pattern and cognitive function in the elderly: A systematic review and meta-analysis</a>,” was authored by Haoting Pei, Sihan Liu, Longxin Li, and Min Zhou.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>