<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/narcissistic-personality-traits-appear-to-reduce-reproductive-success/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Narcissistic personality traits appear to reduce reproductive success</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 25th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A study conducted in Serbia found that individuals with higher levels of narcissism tend to have fewer children. These individuals also report stronger negative childbearing motivations—that is, reasons for not wanting children. This pattern was especially pronounced among those with higher levels of vulnerable narcissism. The study was published in <a href="https://link.springer.com/article/10.1007/s40806-024-00421-3"><em>Evolutionary Psychological Science</em></a>.</p>
<p>Narcissism is a personality trait characterized by self-centeredness, an inflated sense of self-importance, and difficulties with empathy. It is commonly divided into two main forms: grandiose narcissism and vulnerable narcissism. Grandiose narcissism involves overt self-confidence, dominance, entitlement, and a desire for admiration and power. People high in grandiose narcissism are often socially bold and charismatic, but they may also be exploitative and dismissive of others.</p>
<p>In contrast, vulnerable narcissism is marked by insecurity, sensitivity to criticism, social withdrawal, and fragile self-esteem. Individuals with this trait may appear modest or shy but often harbor internal feelings of superiority and resentment. While grandiose narcissists typically externalize blame and seek attention, vulnerable narcissists are more prone to anxiety and depression. Both forms share a core of self-centeredness but differ in how self-worth is maintained and how individuals relate to others.</p>
<p>Study authors Janko Međedović and Nikoleta Jovanov aimed to explore how narcissism relates to fertility, defined as the number of biological children an individual has. They also examined whether this relationship is mediated by childbearing motivations, romantic attachment styles, and relationship characteristics.</p>
<p>The sample included 953 adults residing in Serbia, 56% of whom were male. The average age of participants was approximately 35 years. The authors noted that participants were, on average, more highly educated than the general Serbian population. At the time of data collection, 59% of participants had no children, 18% had one, 19% had two, and 4% had three.</p>
<p>Participants completed measures assessing narcissism (using the Pathological Narcissism Inventory), childbearing motivations (via the Childbearing Motivations Scale), and romantic attachment (through the Experiences in Close Relationships Inventory). They also reported the number of biological children they had, their number of sexual partners, satisfaction with their romantic relationships, the duration of their longest relationship, and their age at first childbirth (or desired age if they had no children).</p>
<p>The results showed that having more children was associated with longer romantic relationships, slightly more sexual partners, greater relationship satisfaction, stronger positive childbearing motivations, and less negative motivation to avoid parenthood. Individuals who had more secure attachment styles also tended to have more children.</p>
<p>By contrast, individuals with higher levels of narcissism—both grandiose and vulnerable—had fewer children on average. These individuals also reported stronger negative motivations for avoiding parenthood. This association was especially strong among those high in vulnerable narcissism.</p>
<p>Higher narcissism was also linked to less secure romantic attachment. Individuals with greater levels of narcissism tended to have shorter romantic relationships, and in the case of vulnerable narcissism, slightly lower satisfaction with their romantic relationships.</p>
<p>The study sheds light on the links between narcissism and fertility. However, it should be noted that the majority of participants did not have any children, meaning that the variability of this indicator was very limited. Additionally, the design of the study does not allow any causal inferences to be derived from the results.</p>
<p>The paper, “<a href="https://doi.org/10.1007/s40806-024-00421-3">Explaining the Links Between Narcissism and Fertility: Are There Differences Between the Grandiose and Vulnerable Component?</a>,” was authored by Janko Međedović and Nikoleta Jovanov.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/sense-of-purpose-emerges-as-key-predictor-of-cognitive-functioning-in-older-adults/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Sense of purpose emerges as key predictor of cognitive functioning in older adults</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 25th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A large, decade-long study of older adults has found that those with higher levels of well-being—especially a strong sense of purpose—also tended to show better cognitive functioning and less cognitive decline over time. The study, published in <em><a href="https://journals.sagepub.com/doi/abs/10.1177/09567976251335578" target="_blank" rel="noopener">Psychological Science</a></em>, suggests that well-being and cognitive health are closely linked in later life, with each influencing the other in a dynamic and reciprocal relationship.</p>
<p>As the global population ages, the burden of cognitive decline and dementia is expected to increase dramatically. Efforts to identify modifiable factors that can help maintain cognitive function into older age have become a top priority. One area of growing interest is the role of psychological well-being. While earlier research has shown links between well-being and cognition, most studies could not determine whether changes in well-being led to changes in cognition or the reverse.</p>
<p>The current study, led by <a href="https://sites.google.com/view/gabriellepfund/" target="_blank" rel="noopener">Gabrielle Pfund</a> of Auburn University and colleagues from Rush University and Washington University in St. Louis, aimed to clarify these relationships by using advanced statistical techniques to track both well-being and cognitive function in the same individuals over time.</p>
<p>“I have been interested in the construct of sense of purpose since early in my graduate school experience,” said Pfund, an assistant professor of human development and family science. “As I continued throughout graduate school, I came across more and more research that highlighted the predictive power of purpose, particularly for healthy aging. Like many, I’ve personally experienced the pain of losing a loved one to dementia. With pharmacological interventions still nascent, establishing nonpharmacological opportunities to combat the development of dementia and slow cognitive decline is imperative.”</p>
<p>The research team analyzed data from 1,702 adults over the age of 65 who participated in two large studies based in the Chicago area: the Memory and Aging Project and the Minority Aging Research Study. Participants were racially diverse, with about three-quarters identifying as White and nearly a quarter identifying as Black.</p>
<p>Each participant completed annual assessments of their cognitive abilities and self-reported levels of well-being for up to 10 years. Cognitive function was measured using a comprehensive battery of 19 tests covering memory, processing speed, verbal ability, and spatial reasoning. Well-being was assessed using several measures, including the Psychological Well-Being Scale (which reflects eudaimonic well-being, such as autonomy and personal growth), a separate measure of sense of purpose, and the Satisfaction With Life Scale.</p>
<p>“This study focuses on the reciprocal relationship between cognitive function and three domains of well-being: (1) sense of purpose (the extent to which one feels they have personally meaningful goals and activities), (2) eudaimonic well-being (one’s sense of autonomy, personal growth, purpose, and connection to others), and (3) life satisfaction (one’s sense of contentment with their life),” Pfund explained.</p>
<p>To explore the long-term patterns between well-being and cognition, the researchers used bivariate latent growth curve models. These models showed that people who started out with higher levels of well-being also tended to have higher levels of cognitive function. More importantly, participants who experienced steeper declines in well-being also tended to show steeper cognitive declines. These findings held even after accounting for factors such as age and sex.</p>
<p>The researchers also applied a second type of statistical model, known as random-intercept cross-lagged panel models, which allowed them to examine how changes in one variable predicted changes in the other over time. These analyses revealed that changes in well-being predicted subsequent changes in cognition—and vice versa. In other words, having a better-than-usual year in terms of well-being was followed by a better-than-usual year in cognitive performance. Similarly, declines in cognition predicted future declines in well-being.</p>
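<p>The article does not reproduce these models, but the cross-lagged idea behind them is straightforward to illustrate. The Python sketch below is a toy simulation, with invented coefficients rather than the study’s estimates, of two variables that each predict the other at the next annual assessment:</p>
<pre><code># Toy simulation of a cross-lagged structure (illustrative only; the
# coefficients and noise levels are invented, not the study's estimates).
import numpy as np

rng = np.random.default_rng(0)
n_people, n_years = 1702, 10            # sample size and follow-up from the article

wb = np.zeros((n_people, n_years))      # well-being scores
cog = np.zeros((n_people, n_years))     # composite cognition scores
wb[:, 0] = rng.normal(0, 1, n_people)
cog[:, 0] = rng.normal(0, 1, n_people)

a, b = 0.7, 0.15                        # autoregressive and cross-lagged weights (hypothetical)
for t in range(1, n_years):
    wb[:, t] = a * wb[:, t - 1] + b * cog[:, t - 1] + rng.normal(0, 0.5, n_people)
    cog[:, t] = a * cog[:, t - 1] + b * wb[:, t - 1] + rng.normal(0, 0.5, n_people)

# With cross-lagged paths present, a better-than-usual year of well-being
# is followed by better-than-usual cognition (and vice versa):
print(np.corrcoef(wb[:, 3], cog[:, 4])[0, 1])
</code></pre>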
<p>“We found that as cognitive function declined, so did participants’ levels of well-being,” Pfund told PsyPost. “We also found that declines in well-being at one time point predicted declines in cognitive function at the next (and vice versa). These patterns remained when accounting for age, sex, race, APoE genotype (i.e., gene for Alzheimer’s risk), education, depressive symptoms, and neuroticism. This means that intervention efforts focused on the development, maintenance, and promotion of well-being could be a promising pathway to support healthy cognitive aging.”</p>
<p>The researchers found that these effects were not equally strong across all forms of well-being. Eudaimonic well-being and sense of purpose had a stronger and more consistent relationship with cognitive health than life satisfaction did. People with a strong sense of purpose were more likely to maintain their cognitive abilities, and cognitive declines had a more pronounced impact on this form of well-being than on general life satisfaction.</p>
<p>“One of my main interests in the current work was to see whether these different elements of well-being were all associated with cognitive decline equally,” Pfund said. “I expected the findings to be stronger for sense of purpose than life satisfaction, and that ended up being the case. This means this finding was not a surprise for me, but sometimes psychologists outside of the well-being research field consider these different domains of well-being fairly synonymous, so I think this finding will likely be a surprise to others.”</p>
<p>The study’s large sample size, long follow-up period, and racially diverse cohort strengthen the reliability of the findings. The researchers also took steps to control for many factors that might influence the results. However, as with any observational study, the results cannot prove causality. It is still possible that other unmeasured factors influence both well-being and cognition, or that some of the observed links are due to shared lifestyle or health-related variables.</p>
<p>The authors also note that their findings may not apply equally across all populations. For instance, participants were primarily drawn from an urban region with relatively high access to health care and community resources.</p>
<p>“Overall, the current study was quite well-suited to the current question given the large number of assessments and racially diverse sample,” Pfund said. “However, one question I have growing interest in is whether these findings would extend to more rural populations, who are at higher risk for cognitive decline, its risk factors (e.g., diabetes), and protective factors (e.g., access to medical care). Understanding whether these associations are consistent, stronger, or weaker in higher risk populations with less access to other medical support provides context for the generalizability of these findings and future intervention efforts.”</p>
<p>“My main goal is to understand daily processes and mechanisms that link these long-term relationships,” Pfund continued. “Why is sense of purpose associated with cognitive function? Is it because purposeful people have better social relationships, engage in more cognitive activities, are more physically active, or is there something unique about feeling purposeful? This approach is necessary for establishing what factors are most necessary to intervene upon in daily life to promote healthy cognitive aging.”</p>
<p>The study, “<a href="https://doi.org/10.1177/09567976251335578" target="_blank" rel="noopener">Bidirectional Relationships Between Well-Being and Cognitive Function</a>,” was authored by Gabrielle N. Pfund, Bryan D. James, and Emily C. Willroth.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/psilocybin-and-escitalopram-produce-antidepressant-effects-via-distinct-brain-mechanisms-study-suggests/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Psilocybin and escitalopram produce antidepressant effects via distinct brain mechanisms, study suggests</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 25th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in the <em><a href="https://psychiatryonline.org/doi/10.1176/appi.ajp.20230751" target="_blank" rel="noopener">American Journal of Psychiatry</a></em> sheds light on how psilocybin and the antidepressant escitalopram affect brain responses to emotional stimuli in people with major depressive disorder. While both treatments led to improvements in mood, only escitalopram was linked to a reduction in brain activity in response to emotional faces. Psilocybin, on the other hand, appeared to preserve or slightly enhance emotional responsiveness, especially to neutral expressions.</p>
<p>Psilocybin is a naturally occurring psychedelic compound found in certain mushrooms. In recent years, it has attracted growing attention as a potential treatment for depression, particularly for individuals who do not respond to standard antidepressants. Unlike traditional medications, which typically take several weeks to show results, psilocybin often leads to rapid and lasting improvements after just one or two sessions, especially when paired with psychological support.</p>
<p>Traditional antidepressants like escitalopram work by increasing levels of serotonin, a brain chemical involved in mood regulation. However, these drugs are often only moderately effective and may come with side effects such as emotional blunting and sexual dysfunction. Emotional blunting refers to a general reduction in emotional intensity or responsiveness and has been reported by nearly half of patients taking selective serotonin reuptake inhibitors (SSRIs).</p>
<p>The researchers behind this study wanted to find out whether these two treatments—psilocybin and escitalopram—have different effects on how the brain responds to emotional information, especially facial expressions, which are central to social and emotional communication.</p>
<p>“This is part of a long <a href="https://www.imperial.ac.uk/psychedelic-research-centre/" target="_blank" rel="noopener">program of research at Imperial College London</a>, which I’ve been very lucky and honored to be a part of, which is investigating the effects of psychedelic therapy. The inspiration for this particular study was to compare psychedelic therapy with a standard depression treatment (escitalopram) and investigate the possible mechanisms behind both,” explained study author <a href="https://scholar.google.com/citations?user=1qhqgiYAAAAJ&hl=en" target="_blank" rel="noopener">Matt Wall</a>, the director of translational MRI at <a href="https://www.perceptive.com/" target="_blank" rel="noopener">Perceptive Inc.</a> and an <a href="https://profiles.imperial.ac.uk/matthew.wall" target="_blank" rel="noopener">honorary senior lecturer</a> at Imperial College London.</p>
<p>To investigate this, the researchers conducted a randomized, double-blind clinical trial comparing psilocybin-assisted therapy with a standard course of escitalopram. The study included 46 individuals with moderate to severe depression. Twenty-five participants received two high-dose psilocybin sessions spaced three weeks apart, along with daily placebo capsules. Twenty-one others received daily escitalopram, beginning with 10 milligrams per day for the first three weeks and increasing to 20 milligrams for the final three weeks, along with two placebo psilocybin sessions involving a nonpsychoactive 1-milligram dose. All participants received the same psychological support throughout the trial.</p>
<p>Before starting treatment and again six weeks later, participants underwent functional MRI scans while viewing emotional facial expressions—fearful, happy, and neutral. This allowed the researchers to observe how the brain’s response to emotional information changed over time and differed between the two treatment groups.</p>
<p>The results showed clear differences between the groups. In the escitalopram group, brain activity in response to emotional faces decreased across a wide range of cortical regions after treatment. The amygdala, a brain structure involved in processing emotions like fear, also showed reduced responses to fearful faces in this group. These findings are consistent with earlier studies linking SSRIs to dampened emotional responses, which may help patients feel less overwhelmed by negative emotions but can also blunt positive feelings.</p>
<p>In contrast, psilocybin did not lead to the same kind of neural dampening. In fact, responses to emotional faces were mostly unchanged or slightly increased, especially in response to neutral expressions. Activity in the amygdala showed minimal change following psilocybin treatment. The differences in brain responses between the two groups were significant and involved areas linked to attention, emotion, and social understanding.</p>
<p>One possible reason for the absence of strong neural changes in the psilocybin group may be the timing of the post-treatment scans. “Some of our previous work actually showed an increase in emotional brain function after psychedelic therapy, which we didn’t see this time; we think this is probably due to the timings of the after-treatment scans in this study which was quite a long time after the psychedelic therapy sessions (three weeks),” Wall told PsyPost. “These effects we saw before may have worn off by then.”</p>
<p>Despite these neural differences, both treatments were effective in reducing depressive symptoms. Participants in the psilocybin group reported larger improvements in well-being, pleasure, and emotional intensity compared to those in the escitalopram group. Notably, those treated with escitalopram showed greater reductions in emotional intensity, consistent with emotional blunting. Sexual function scores also declined more in the escitalopram group.</p>
<p>Additional analyses explored how changes in brain activity and emotional function were related to clinical outcomes. In the escitalopram group, reductions in brain responses to emotional stimuli were associated with improved mood—particularly among participants who also experienced emotional blunting. In contrast, in the psilocybin group, improvements in depression were more closely tied to increases in emotional intensity rather than changes in brain activation. This suggests that while SSRIs may work by dampening emotional reactivity, psilocybin may help people reconnect with their emotions.</p>
<p>These findings indicate “that psychedelic therapy and escitalopram were both effective treatments for depression, but the mechanisms behind how they produce these clinical effects may be quite different,” Wall said. “Escitalopram seems to reduce brain responses to all emotions, and we think this may relate to a common side effect of standard antidepressants called emotional blunting where patients report that they feel emotionally flat or numb. On the other hand, psychedelic therapy was as good as (or perhaps even better) at treating depression symptoms, but without the change in emotional brain function. Psychedelic therapy may be a good antidepressant treatment for patients who find the emotional blunting side-effects of standard antidepressants problematic.”</p>
<p>But there are some caveats to consider. Although the study was randomized and blinded, the strong effects of psilocybin likely made it easy for participants to guess their treatment group.</p>
<p>“It’s still a pretty small study with roughly 20 subjects in each group, so bigger studies of this type would be useful,” Wall noted. “Also, there are the standard issues with psychedelic research where it’s very difficult to keep subjects blinded to which treatment they’re having; it’s usually very clear to the patient if they’re in the psychedelic treatment group!”</p>
<p>Future research with larger samples and different timing may help clarify how psilocybin influences emotional brain function over time. Studies that include more diverse populations, varied imaging tasks, and direct measures of brain plasticity could also offer new clues about how this therapy works.</p>
<p>“Long-term, the aim is really to have psychedelic therapy available as a mainstream treatment in a range of possible psychiatric disorders, but unfortunately we’re still quite a long way from that goal, and there are a huge number of scientific, legal, and political hurdles to get over on the way,” Wall said.</p>
<p>The study, “<a href="https://doi.org/10.1176/appi.ajp.20230751" target="_blank" rel="noopener">Reduced Brain Responsiveness to Emotional Stimuli With Escitalopram But Not Psilocybin Therapy for Depression</a>,” was authored by Matthew B. Wall, Lysia Demetriou, Bruna Giribaldi, Leor Roseman, Natalie Ertl, David Erritzoe, David J. Nutt, and Robin L. Carhart-Harris.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/psychology-study-sheds-light-on-why-some-moments-seem-to-fly-by/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Psychology study sheds light on why some moments seem to fly by</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 24th 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Many people feel that life speeds up as they get older. But what actually causes this sensation? A new study published in <em><a href="https://journals.sagepub.com/doi/10.1177/01461672241285270" target="_blank" rel="noopener">Personality and Social Psychology Bulletin</a></em> sheds light on the reasons people perceive past periods of their lives as having passed quickly. Across four studies, researchers found that the feeling that time has flown by is more strongly associated with personal growth, satisfaction, and nostalgia than with daily routines or a lack of meaningful experiences.</p>
<p>Across cultures and age groups, people consistently report that time seems to pass faster as they grow older. This perception isn’t just a curiosity—it can influence life satisfaction, existential anxiety, and engagement with long-term goals. Despite its impact, the underlying psychological mechanisms remain poorly understood.</p>
<p>Historically, a common explanation is the <em>routine-compression</em> theory: when life is filled with repetitive tasks and few novel events, people form fewer unique memories. In retrospect, the time feels empty or collapsed, and thus shorter. An alternative hypothesis, based on self-determination theory, suggests that when people feel they did not experience personal growth, they judge the time as wasted and brief.</p>
<p>The researchers behind the new study sought to evaluate both theories and explore new possibilities. They wondered whether the perception that time flew by might actually be driven by how satisfied people felt with that period—and whether nostalgia could also play a role.</p>
<p>Across four studies involving nearly 2,500 participants in total, the researchers used surveys to examine how people recalled different time periods—such as the past year, a college semester, or the summer. They assessed participants’ memories of how routine or varied the period was, how much personal growth they felt they experienced, how many events stood out, and how satisfying or nostalgic the period felt.</p>
<p>In the first two studies, participants included university students reflecting on an academic year and a broader adult sample recruited online reflecting on their past year. In the final two studies, the team gathered responses from another online adult sample and from a new group of college students, this time reflecting on the past summer.</p>
<p>Participants responded to questions on a range of topics. These included how repetitive their days felt, how much they grew in terms of autonomy and competence, how many events they could recall, how satisfied they were with the period, and how nostalgic they felt about it. They also rated how fast that period felt in hindsight using slider scales.</p>
<p>The traditional routine-compression account received weak support. In two of the four studies, people who remembered a period as more routine also felt it had passed more quickly. But routine did not consistently predict how many events participants remembered. And even when it did, fewer events were not linked to slower time perception—in fact, sometimes the opposite was true.</p>
<p>The self-determination-based growth-deprivation hypothesis also did not hold up. Rather than feeling that time flew when they lacked growth, participants were more likely to say time sped by during periods when they <em>did</em> feel they were growing. This unexpected pattern led the researchers to reconsider their framework.</p>
<p>They proposed two new explanations. One was the <em>growth-immersion</em> account. According to this view, people feel that time passed quickly when they were deeply immersed in meaningful, challenging activities that supported their personal development. The satisfaction that came from those activities might have made them less aware of the passage of time—similar to how people lose track of time during a “flow” state.</p>
<p>The second was the <em>growth-longing</em> account. Here, the idea is that looking back on a time of growth may trigger a sense of nostalgic longing. The period might feel special and emotionally significant, but also fleeting, because it stands out as a high point in one’s personal development.</p>
<p>To test these new ideas, the researchers looked at whether feelings of satisfaction and nostalgia explained the link between remembered growth and time perception. Across two pre-registered studies, they found that both satisfaction and nostalgia predicted faster perceived time passage. And when these factors were included in statistical models, the direct effect of growth on perceived speed disappeared. This suggests that growth may influence time perception indirectly, by increasing satisfaction and nostalgia.</p>
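<p>For readers unfamiliar with this kind of analysis, the mediation logic can be illustrated with a simple regression comparison. The Python sketch below uses simulated data with invented effect sizes (it is not the study’s analysis) to reproduce the reported pattern: the coefficient for growth shrinks toward zero once satisfaction and nostalgia enter the model:</p>
<pre><code># Regression-based mediation sketch on simulated data (all effect sizes
# are hypothetical; this is not the study's data or exact model).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2500                                           # roughly the pooled sample size
growth = rng.normal(size=n)
satisfaction = 0.6 * growth + rng.normal(size=n)   # growth feeds satisfaction
nostalgia = 0.4 * growth + rng.normal(size=n)      # growth feeds nostalgia
# Perceived speed is driven by the mediators, not by growth directly:
speed = 0.5 * satisfaction + 0.3 * nostalgia + rng.normal(size=n)

total = sm.OLS(speed, sm.add_constant(growth)).fit()
X = sm.add_constant(np.column_stack([growth, satisfaction, nostalgia]))
direct = sm.OLS(speed, X).fit()

print("growth alone:", round(total.params[1], 2))            # sizeable total effect
print("growth with mediators:", round(direct.params[1], 2))  # near zero
</code></pre>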
<p>Between the two new explanations, satisfaction appeared to be a slightly stronger predictor of how fast time seemed to pass. Still, both played significant roles.</p>
<p>The studies relied on self-reports and cross-sectional data, which means they can’t definitively establish causation. It’s also possible that people who generally feel more nostalgic or satisfied with life are more likely to perceive time as passing quickly, regardless of actual events. Future research could use longitudinal or experimental methods to better understand how growth, satisfaction, and nostalgia shape time perception.</p>
<p>Age and lifestyle may also moderate how routine or growth is experienced. The routine-compression effect was more evident in older, more demographically diverse samples than in younger college students. This suggests that the meaning of “routine” may shift with age or life circumstances.</p>
<p>Finally, the researchers noted that people might be remembering different <em>types</em> of events when asked to recall memorable experiences. Their studies did not distinguish between ordinary daily events and major life milestones, which might have different effects on time perception.</p>
<p>The feeling that life flies by may not be a symptom of monotony or wasted time, as some theories suggest. Instead, it may reflect the opposite: periods filled with growth, satisfaction, and rich emotional meaning tend to feel shorter in hindsight. Rather than worrying that time is slipping away too fast, people might reframe that experience as a sign that the period was meaningful and well-lived.</p>
<p>This shift in perspective could help individuals come to terms with the pace of life and may even encourage them to seek out more self-determined, fulfilling experiences. As the authors suggest, perhaps the goal shouldn’t be to slow life down—but to make it rich enough that its swift passage feels worthwhile.</p>
<p>“Time appeared to pass swiftly when it followed a repetitive pattern of routine, but critically, when it also was fulfilling, evoking a sense of satisfaction and immersion, and perhaps to a lesser extent, nostalgia and longing,” the researchers concluded. “The growth-immersion and growth-longing mechanisms converge to suggest that time flies because you got a lot out of it. This conclusion points to a novel intervention to encourage people to reappraise life’s apparent acceleration as a sign of a meaningful life. In other words, perhaps the goal should not be to ‘slow life down’ at all, but rather to encourage people to interpret the feeling that a period zipped by as a sign that that period was well lived. Life goes fast, we say, but you wouldn’t want it any other way.”</p>
<p>The study, “<a href="https://doi.org/10.1177/01461672241285270" target="_blank" rel="noopener">Why Life Moves Fast: Exploring the Mechanisms Behind Autobiographical Time Perception</a>,” was authored by Young-Ju Ryu, Mark J. Landau, Samuel E. Arnold, and Jamie Arndt.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-study-links-depression-to-accelerated-brain-aging/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New study links depression to accelerated brain aging</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 24th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <a href="https://doi.org/10.1017/S0033291725000418"><em>Psychological Medicine</em></a> has found that individuals with major depressive disorder have brains that appear significantly older than their actual age, underscoring the connection between mental health and brain aging.</p>
<p>Recent scientific advances have begun to clarify how depression not only influences mood but also affects the brain’s physical structure. While aging is a natural process, growing evidence suggests that depression may accelerate some aspects of brain aging. However, much of this earlier research focused primarily on Western populations.</p>
<p>To address this gap, the new study analyzed brain scans from a Japanese cohort, aiming to determine whether the brains of individuals with major depressive disorder appear older than those of healthy individuals.</p>
<p>Led by Ruibin Zhang of Southern Medical University in China, the research team sought to investigate the biological factors underlying brain aging. They were particularly interested in how structural brain changes might be linked to alterations in key neurotransmitters and patterns of gene expression.</p>
<p>The study analyzed data from 670 participants, including 239 individuals with major depressive disorder and 431 healthy controls, collected from multiple sites in Japan. Using advanced brain imaging techniques, the researchers measured the thickness of various brain regions. They then applied a machine learning approach to analyze the images and calculate a “brain age” that reflected the extent of structural change.</p>
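<p>The article does not detail the machine learning pipeline, but a common approach to estimating “brain age” is to train a regression model that predicts chronological age from regional cortical thickness in healthy controls, then apply it to patients; the gap between predicted and actual age serves as an index of accelerated aging. The Python sketch below is a generic illustration of that approach, with the data, the 68-region parcellation, and the ridge-regression model all assumed for demonstration:</p>
<pre><code># Generic "brain age" sketch (illustrative assumptions throughout; this is
# not the study's pipeline or data).
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(42)
n_controls, n_patients, n_regions = 431, 239, 68   # group sizes echo the article

# Simulate cortical thickness that thins gradually with age, plus noise:
age_controls = rng.uniform(20, 80, n_controls)
w = rng.uniform(0.005, 0.015, n_regions)           # per-region thinning rates
thick_controls = 3.0 - np.outer(age_controls, w) + rng.normal(0, 0.1, (n_controls, n_regions))

# Train the age predictor on healthy controls only:
model = Ridge(alpha=1.0).fit(thick_controls, age_controls)

# Patients are simulated with slightly thinner cortex at every age, so the
# model reads their brains as "older" than they actually are:
age_patients = rng.uniform(20, 80, n_patients)
thick_patients = 2.9 - np.outer(age_patients, w) + rng.normal(0, 0.1, (n_patients, n_regions))
gap = model.predict(thick_patients) - age_patients  # the "brain-age gap"
print(round(gap.mean(), 1), "years")
</code></pre>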
<p>The findings were striking. People with major depressive disorder had brains that appeared significantly older than those of their healthy peers. Specific areas of the brain—namely parts of the left ventral region and the premotor eye field—showed pronounced cortical thinning.</p>
<p>“These regions are primarily associated with higher-order cognitive functions, including attention, working memory, reasoning, and inhibition,” Zhang and colleagues explained.</p>
<p>The researchers also found that the areas with the greatest thinning were associated with changes in neurotransmitter systems—specifically, those involving dopamine, serotonin, and glutamate. These neurotransmitters play vital roles in mood regulation and cognitive processes, and their altered expression in individuals with depression suggests that biochemical disruptions may contribute to accelerated brain aging.</p>
<p>In addition, the team examined gene expression patterns and found that several genes involved in protein binding and processing were more active in the regions showing cortical thinning. These genes are essential for maintaining healthy cell structure and function. Disruptions in these pathways may lead to tissue degradation and contribute to faster brain aging in individuals with depression.</p>
<p>While the findings are compelling, the authors acknowledged several limitations. Most notably, the study was cross-sectional, meaning it captured data at a single point in time. Because brain aging is a gradual process, longitudinal studies are needed to understand how the frequency and severity of depression influence brain aging over time.</p>
<p>The study, “<a href="https://doi.org/10.1017/S0033291725000418">Accelerated brain aging in patients with major depressive disorder and its neurogenetic basis: evidence from neurotransmitters and gene expression profiles</a>,” was authored by Haowei Dai, Lijing Niu, Lanxin Peng, Qian Li, Jiayuan Zhang, Keyin Chen, Xingqin Wang, Ruiwang Huang, Tatia M.C. Lee, and Ruibin Zhang.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/using-tech-in-later-life-may-protect-against-cognitive-decline-study-suggests/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Using tech in later life may protect against cognitive decline, study suggests</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 24th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>In the 21st century, digital technology has changed many aspects of our lives. Generative artificial intelligence (AI) is the latest newcomer, with chatbots and other AI tools changing <a href="https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2023.1181712/full">how we learn</a> and creating considerable <a href="https://royalsocietypublishing.org/doi/10.1098/rsos.240197">philosophical and legal challenges</a> regarding what it means to “outsource thinking”.</p>
<p>But the emergence of technology that changes the way we live is not a new issue. The change from analogue to digital technology began around the 1960s, and this “<a href="https://www.sciencedirect.com/topics/psychology/digital-revolution#:~:text=Explained%20by%20the%20online%20encyclopedia,and%20proliferation%20of%20digital%20computers">digital revolution</a>” is what brought us the internet. An entire generation of people who lived and worked through this evolution are now entering their early 80s.</p>
<p>So what can we learn from them about the impact of technology on the ageing brain? A comprehensive <a href="https://doi.org/10.1038/s41562-025-02159-9">new study</a> from researchers at the University of Texas and Baylor University in the United States provides important answers.</p>
<p>Published in <em>Nature Human Behaviour</em>, it found no supporting evidence for the “digital dementia” hypothesis. In fact, it found the use of computers, smartphones and the internet among people over 50 might actually be associated with lower rates of cognitive decline.</p>
<h2>What is ‘digital dementia’?</h2>
<p>Much has been written about the potential <a href="https://www.cambridge.org/core/journals/memory-mind-and-media/article/media-technology-and-the-sins-of-memory/4F169E671DFA95639E971B43B5E4D57A">negative impact from technology on the human brain</a>.</p>
<p>According to the <a href="https://www.imrpress.com/journal/jin/21/1/10.31083/j.jin2101028">“digital dementia” hypothesis</a> introduced by German neuroscientist and psychiatrist <a href="https://www.amazon.com.au/dp/3426276038?ref_=mr_referred_us_au_au">Manfred Spitzer</a> in 2012, increased use of digital devices has resulted in an over-reliance on technology. In turn, this has weakened our overall cognitive ability.</p>
<p>Three areas of concern regarding the use of technology have previously been noted:</p>
<ol>
<li>An increase in <a href="https://www.frontiersin.org/journals/education/articles/10.3389/feduc.2021.600687/full">passive screen time</a>. This refers to technology use which does not require significant thought or participation, such as watching TV or scrolling social media.</li>
<li><a href="https://journals.sagepub.com/doi/full/10.1177/17470218211008060">Offloading cognitive abilities</a> to technology, such as no longer memorising phone numbers because they are kept in our contact list.</li>
<li>Increased <a href="https://www.nature.com/articles/s41598-023-36256-4">susceptibility to distraction</a>.</li>
</ol>
<h2>Why is this new study important?</h2>
<p>We know technology can impact how our brain <a href="https://onlinelibrary.wiley.com/doi/full/10.1002/hbm.24286?casa_token=982zQ5d6qNoAAAAA%3ALwtDMOIwyaXWJVj-NuiT9_JVhXbWtytWOu5saKJE9xsbPzlisGxdE7-gLnWcvQthoHQvXZX_NbINyE8">develops</a>. But the effect of technology on how our brain <em>ages</em> is less understood.</p>
<p>This new study by <a href="https://psychology.org.au/psychology/about-psychology/types-of-psychologists/clinical-neuropsychologists">neuropsychologists</a> Jared Benge and Michael Scullin is important because it examines the impact of technology on older people who have experienced significant changes in the way they use technology across their life.</p>
<p>The new study performed what is known as a <a href="https://training.cochrane.org/handbook/current/chapter-10">meta-analysis</a> where the results of many previous studies are combined. The authors searched for studies examining technology use in people aged over 50 and examined the association with cognitive decline or dementia. They found 57 studies which included data from more than 411,000 adults. The included studies measured cognitive decline based on lower performance on cognitive tests or a diagnosis of dementia.</p>
<h2>A reduced risk of cognitive decline</h2>
<p>Overall, the study found greater use of technology was associated with a reduced risk of cognitive decline. <a href="https://www.ncbi.nlm.nih.gov/books/NBK431098/">Statistical tests</a> were used to determine the “odds” of having cognitive decline based on exposure to technology. An odds ratio under 1 indicates a reduced risk from exposure and the combined odds ratio in this study was 0.42. This means higher use of technology was associated with a 58% risk reduction for cognitive decline.</p>
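<p>As a quick arithmetic check (assuming the 0.42 figure is a pooled odds ratio), the quoted percentage is simply one minus the odds ratio. Strictly, this is a reduction in the odds of cognitive decline rather than the risk, though the two are close when the outcome is relatively uncommon:</p>
<pre><code># Converting the pooled odds ratio into the quoted reduction:
odds_ratio = 0.42
reduction = 1 - odds_ratio
print(f"{reduction:.0%} lower odds of cognitive decline")  # prints: 58% lower odds ...
</code></pre>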
<p>This benefit was found even when the effect of <a href="https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(24)01296-0/abstract">other things</a> known to contribute to cognitive decline, such as socioeconomic status and other health factors, were accounted for.</p>
<p>Interestingly, the magnitude of the effect of technology use on brain function found in this study was similar to or stronger than that of other known protective factors, such as physical activity (approximately a 35% risk reduction) or maintaining healthy blood pressure (approximately a 13% risk reduction).</p>
<p>However, it is important to understand that there are <a href="https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(24)01296-0/abstract">far more studies</a>, conducted over many years, examining the benefits of managing <a href="https://www.frontiersin.org/journals/neurology/articles/10.3389/fneur.2022.821135/full">blood pressure</a> and increasing <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC10828294/">physical activity</a>, and the mechanisms through which they help protect our brains are far better understood.</p>
<p>It is also a lot easier to measure blood pressure than it is to measure technology use. A strength of this study is that it considered these difficulties by focusing on certain aspects of technology use while excluding others, such as brain-training games.</p>
<p>These findings are encouraging. But we still can’t say technology use <em>causes</em> better cognitive function. More research is needed to see if these findings are replicated in different groups of people (especially those from <a href="https://www.thelancet.com/journals/langlo/article/PIIS2214-109X(20)30062-0/fulltext">low and middle income countries</a>) who were underrepresented in this study, and to understand why this relationship might occur.</p>
<h2>A question of ‘how’ we use technology</h2>
<p>In reality, it’s simply not feasible to live in the world today without using some form of technology. Everything from paying bills to booking our next holiday is now almost completely done online. Maybe we should instead be thinking about <em>how</em> we use technology.</p>
<p><a href="https://www.thelancet.com/journals/lanpub/article/PIIS2468-2667(20)30284-X/fulltext">Cognitively stimulating activities</a> such as reading, learning a new language and playing music – particularly in early adulthood – can help protect our brains as we age.</p>
<p>Greater engagement with technology across our lifespan may be a form of stimulating our memory and thinking, as we adapt to new software updates or learn how to use a new smartphone. It has been suggested this “<a href="https://www.sciencedirect.com/science/article/pii/S0167494322002643?casa_token=-z-X7mF4Ar0AAAAA:X2UXk92rbfa8uXdJFltbUhBonZqRl4b2dTaJyZdKogQiPXR9b6maghPnZll5VQwoVVL6_3uW#bib0032">technological reserve</a>” may be good for our brains.</p>
<p>Technology may also help us to stay <a href="https://aging.jmir.org/2022/4/e40125/">socially connected</a>, and help us stay <a href="https://link.springer.com/article/10.1186/s40985-020-00143-4">independent for longer</a>.</p>
<h2>A rapidly changing digital world</h2>
<p>While findings from this study show it’s unlikely all digital technology is bad for us, the way we interact with and rely on it is rapidly changing.</p>
<p>The impact of AI on the ageing brain will only become evident in future decades. However, our ability to adapt to historical technological innovations, and the potential for this to support cognitive function, suggests the future may not be all bad.</p>
<p>For example, advances in <a href="https://www.mdpi.com/2076-3425/11/1/43">brain-computer interfaces</a> offer new hope for those experiencing the impact of neurological disease or disability.</p>
<p>However, the potential downsides of technology are real, particularly for younger people, including <a href="https://www.nature.com/articles/s44159-024-00307-y">poor mental health</a>. Future research will help determine how we can capture the benefits of technology while limiting the potential for harm.</p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/new-study-finds-no-evidence-technology-causes-digital-dementia-in-older-people-254392">original article</a>.</em></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/fathers-with-more-dominant-looking-faces-are-more-likely-to-have-sons/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Fathers with more dominant-looking faces are more likely to have sons</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">May 24th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <em><a href="https://link.springer.com/article/10.1007/s40750-024-00254-1" target="_blank" rel="noopener">Adaptive Human Behavior and Physiology</a></em> suggests that men with more dominant-looking faces may be more likely to have sons as their first-born children. The researchers did not find similar effects for women, nor did they find that self-reported dominance or specific facial measurements like facial width-to-height ratio could predict a child’s sex. These findings add to ongoing debates about how traits related to dominance and mate selection might be linked to biological processes underlying sex determination.</p>
<p>The study was designed to explore whether dominance-related traits in parents—especially physical cues of dominance like facial appearance—are associated with the sex of their first child. The research draws on evolutionary theories such as the Trivers-Willard hypothesis, which predicts that parents in better condition might have more sons, and the maternal dominance hypothesis, which proposes that more dominant women may be biologically predisposed to have male offspring. Despite decades of research, evidence for these theories has remained inconsistent. This study sought to add clarity by focusing on one specific aspect of parental condition: dominance, and how it is perceived in the face.</p>
<p>“I suspect that for many people, observations in everyday life suggest that the characteristics of parents influence whether they have a son or daughter,” said study author <a href="https://benjaminjzubaly.github.io/" target="_blank" rel="noopener">Benjamin Zubaly</a>, an incoming PhD student at the University of Michigan. “However, my own intuitions are that the sex of one’s offspring is completely random. I find hypotheses from evolutionary psychology that predict the sex of one’s offspring to be interesting because they allow us to critically test these intuitions.”</p>
<p>The researchers recruited heterosexual couples with at least one child through the platform Prolific, gathering a final sample of 104 parent pairs. To participate, both partners needed to complete surveys measuring their self-perceived dominance and submit facial photographs. These photos were later rated for perceived dominance, attractiveness, and masculinity or femininity by university students. The research team also measured the facial width-to-height ratio, a feature previously proposed as a cue of dominance. For all analyses, the key outcome was whether the couple’s first-born child was male or female.</p>
<p>To assess psychological dominance, the study used three well-established tools. These included a checklist of dominant traits, a dominance subscale from a broader personality inventory, and a scale that measures dominance based on control in social contexts. Participants also submitted facial photos taken under standardized conditions. These images were analyzed both for their physical features and for how dominant they appeared to outside raters. Ratings from students showed high consistency, allowing the researchers to create a reliable index of facial dominance.</p>
<p>The central finding was that fathers whose faces were rated as more dominant were more likely to have a first-born son. This result held even after controlling for facial attractiveness, masculinity, and age. For fathers with neutral facial expressions in their photos, each standard deviation increase in perceived dominance was associated with an 83% greater chance of having a son.</p>
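<p>An “83% greater chance” per standard deviation most plausibly reflects an odds ratio of about 1.83 from a logistic regression (an inference on our part; the article does not spell out the model). The Python sketch below shows, on simulated data, how such a per-standard-deviation odds ratio is typically estimated:</p>
<pre><code># How a per-SD odds ratio like 1.83 is typically estimated (simulated data;
# not the paper's actual data or model).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 104                                   # number of couples in the article
dominance = rng.normal(size=n)            # standardized facial-dominance ratings
beta = np.log(1.83)                       # hypothetical true log-odds per SD
p_son = 1 / (1 + np.exp(-beta * dominance))
son = rng.binomial(1, p_son)              # 1 = first-born son

fit = sm.Logit(son, sm.add_constant(dominance)).fit(disp=0)
print(round(np.exp(fit.params[1]), 2))    # odds ratio; noisy at this sample size
</code></pre>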
<p>The researchers did not observe similar effects for mothers’ facial dominance. Self-reported dominance did not predict child sex, and facial width-to-height ratio—often proposed as a proxy for dominance—also showed no reliable link to whether a first-born child was male or female.</p>
<p>These findings suggest that facial dominance, specifically in fathers, may relate in some way to biological or behavioral processes that influence offspring sex. One interpretation is that when women have higher testosterone levels around the time of conception—a factor linked to having male children—they may prefer more dominant-looking male partners. This preference, in turn, could influence the likelihood of having a son. Although speculative, this idea aligns with previous work showing that women who believe they will have male children also tend to find dominant-looking men more attractive.</p>
<p>“In our sample of romantic couples, we found that fathers with more dominant-looking faces were more likely to have sons for a first-born child. It is possible that this means when women are higher in testosterone and more likely to have a son, they tend to choose more dominant males. However, further research is necessary to understand what processes underlie our findings,” Zubaly said.</p>
<p>The study also builds on earlier research by Palmer-Hague and Watson, who found that parental facial dominance—especially when high in both partners—was associated with having a son. However, the current study did not find significant interactions between mothers’ and fathers’ dominance, suggesting that the father’s appearance alone may be more relevant. Notably, while Palmer-Hague and Watson found similar effects for self-reported dominance, the current study did not. The authors suggest this may be due to differences in how people perceive their own dominance versus how others judge it from physical appearance.</p>
<p>“While we find that an aspect of fathers’ dominance was related to offspring sex, other studies have found that mothers’ dominance is also important. Further research can help to confirm whether it is fathers’ dominance, mothers’ dominance, or both that influence offspring sex.”</p>
<p>As with any study, there are limitations to consider. The researchers relied on retrospective data, meaning that parents were asked about their children several years after birth. While statistical controls suggested that the time since birth did not bias the results, future studies should aim to collect data closer to the time of conception. Additionally, although the sample was cross-cultural, self-reported measures of dominance may not translate consistently across different backgrounds, potentially obscuring meaningful effects.</p>
<p>“This project taught me much about performing research in the field of evolutionary psychology, which will be of tremendous benefit as I switch labs to begin my PhD,” Zubaly added. “I look forward to seeing how other researchers help to clarify the relationship between parental characteristics and offspring sex in the future.”</p>
<p>The study, “<a href="https://doi.org/10.1007/s40750-024-00254-1" target="_blank" rel="noopener">Fathers’ Facial Dominance Predicts First‑Born Sons in Parent Dyads</a>,” was authored by Benjamin J. Zubaly and Jaime L. Palmer‑Hague.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>