<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/psychology-study-sheds-light-on-the-phenomenon-of-waifus-and-husbandos/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Psychology study sheds light on the phenomenon of waifus and husbandos</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 11th 2026, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://psycnet.apa.org/doi/10.1037/ppm0000590" target="_blank" rel="noopener">Psychology of Popular Media</a></em> suggests that human romantic attraction to fictional characters may operate through the same psychological mechanisms that drive relationships between real people. The research offers insight into how individuals form deep attachments to non-existent partners in an increasingly digital world.</p>
<p>The concept of falling in love with an artificial being is not a modern invention, the researchers behind the new study noted. The ancient Greek narrative of Pygmalion describes a sculptor who creates a statue so beautiful that he falls in love with it. This theme of attributing human qualities and agency to inanimate creations has persisted throughout history.</p>
<p>In the contemporary landscape, this phenomenon is often observed within the anime fan community. Fans of Japanese animation sometimes utilize specific terminology to describe characters they hold in special regard. The terms “waifu” and “husbando” are derived from the English words for wife and husband. These labels imply a desire for a significant, often romantic, relationship with the character if they were to exist in reality.</p>
<p>The researchers conducted the new study to better understand the nature of relationships with “virtual agents.” A virtual agent is any character that exists solely on a screen but projects a sense of agency or independence to the audience. As technology advances, these characters are becoming more interactive and realistic. The authors sought to determine if the reasons people connect with these characters align with evolutionary theories regarding human mating strategies.</p>
<p>“Given the popularity of AI agents and chatbots, we were interested in people who have attraction to fictional characters,” said study author <a href="https://www.linkedin.com/in/connorleshner/" target="_blank" rel="noopener">Connor Leshner</a>, a PhD candidate in the Department of Psychology at Trent University.</p>
<p>“Through years of research, we have access to a large and charitable sample of anime fans, and it is a norm within this community to have relationships (sometimes real, sometimes not) with fictional characters. We mainly wanted to understand whether a large group of people have the capacity for relationships with fictional characters, because, if they do, then a logical future study would be studying relationships with something like AI.”</p>
<p>To investigate this, the research team recruited a large sample of self-identified anime fans. Participants were gathered from various online platforms, including specific communities on the website Reddit. The final sample consisted of 977 individuals who indicated that they currently had a waifu or husbando.</p>
<p>The demographic makeup of the sample was predominantly male. Approximately 78 percent of the respondents identified as men, while the remainder identified as women. The average age of the participants was roughly 26 years old, and more than half were from the United States. This provided a snapshot of a specific, highly engaged subculture.</p>
<p>The researchers employed a quantitative survey to assess the participants’ feelings and motivations. They asked participants to rate their agreement with various statements on a seven-point scale. The survey measured four potential reasons for choosing a specific character. These reasons were physical appearance, personality, the character’s role in the story, and the character’s similarity to the participant.</p>
<p>The researchers also sought to categorize the type of connection the fan felt toward the character. The three categories measured were emotional connection, sexual attraction, and feelings of genuine love.</p>
<p>The results provided evidence supporting the idea that fictional attraction mirrors real-world attraction. The data showed a positive association between a character’s physical appearance and the participant’s sexual attraction to them. This suggests that visual appeal is a primary driver for sexual interest in virtual agents, much as it is in human interaction.</p>
<p>However, physical appearance was not the only factor at play. The researchers found that a character’s personality was a strong predictor of emotional connection. Additionally, participants who felt that a character was similar to themselves were more likely to report a deep emotional bond. This indicates that shared traits and relatable behaviors foster feelings of closeness even when the partner is not real.</p>
<p>A central focus of the study was the influence of gender on these connections. The analysis revealed distinct differences between how men and women engaged with their chosen characters. Men were significantly more likely to report feelings of sexual attraction toward their waifus or husbandos. This aligns with prior research on male mating strategies that emphasizes visual and sexual stimuli.</p>
<p>Women, in contrast, reported higher levels of emotional connection with their fictional partners. While they also valued personality, their bonds were characterized more by affection and emotional intimacy than by sexual desire. This finding supports the hypothesis that women apply criteria focused on emotional compatibility even when the relationship is entirely imagined.</p>
<p>The researchers also explored the concept of “genuine love” for these characters. They found that feelings of love were predicted by a combination of factors. Physical appearance, personality, and similarity to the self all contributed to the sensation of being in love. This suggests that for a fan to feel love, the character must appeal to them on multiple levels simultaneously.</p>
<p>“People do have the capacity for these relationships,” Leshner told PsyPost. “Sometimes they are based in physical attraction, especially for men, while others are based on platonic, personality-based attraction, especially for women. Overall, people can feel a deep, intimate connection with people who don’t exist on our plane of reality, and I think that’s neat.”</p>
<p>The findings were not particularly surprising. “Everything matches what you’d expect from related theories, like evolutionary mating strategy where men want physical or sexual relationships, while women find more appeal in the platonic, long-term relationship,” Leshner said. “We have ongoing research that helps contextualize these findings more, but until that’s published, we cannot say much more.”</p>
<p>One potential predictor that did not yield significant results was the character’s role in the media. The “mere exposure effect” suggests that people tend to like things simply because they are familiar with them. The researchers tested if characters with larger roles, such as protagonists who appear on screen frequently, were more likely to be chosen. The data did not support this link.</p>
<p>The specific narrative function of the character did not predict sexual attraction, emotional connection, or love. A supporting character with limited screen time appeared just as capable of inspiring deep affection as a main hero. This implies that the specific attributes of the character matter more than their prominence in the story.</p>
<p>These findings carry implications that extend beyond the anime community. As artificial intelligence and robotics continue to develop, human interactions with non-human entities will likely become more common. The study suggests that people are capable of forming complex, multifaceted relationships with entities that do not physically exist.</p>
<p>“Anime characters don’t have agency, nor do they have consciousness, so the extent to which the average person might have a serious relationship with an anime character is probably limited,” Leshner told PsyPost. “With that said, the same is true of AI, and the <em>New York Times</em> published a <a href="https://www.nytimes.com/interactive/2025/11/05/magazine/ai-chatbot-marriage-love-romance-sex.html" target="_blank" rel="noopener">huge article</a> on human-AI romantic relationships. So maybe these relationships are more appealing than we really capture here.”</p>
<p>There are limitations to the study. The research relied on cross-sectional data, which means it captured a single moment in time. This design prevents researchers from proving that specific character traits caused the attraction. It is possible that attraction causes a participant to perceive traits differently.</p>
<p>Additionally, the sample was heavily skewed toward Western, male participants. Cultural differences in how relationships are viewed could influence these results. The anime fandom in Japan, for instance, might exhibit different patterns of attachment than those observed in the United States. Future research would benefit from a more diverse, global pool of participants.</p>
<p>Despite these limitations, the study provides a foundation for understanding the future of human connection. It challenges the notion that relationships with fictional characters are fundamentally different from real relationships. The psychological needs and drives that lead someone to download a soulmate appear to be remarkably human.</p>
<p>“People might either find these relationships weird, or might say that AI is significantly different from what we show here,” Leshner added. “My first response is that these relationships aren’t weird, and we’ve been discussing similar relationships for centuries. The article opens with a reference to Pygmalion, which is a Greek story about a guy falling in love with a statue. At minimum, it’s a repeated idea in our culture.”</p>
<p>“To my second point about the similarities between AI and anime characters, I think about it like this: AI might seem more human, but it’s just Bayesian statistics with extra steps. If you watch an anime all the way through, you can spend up to hundreds of hours with characters who have their own human struggles, triumphs, loves and losses. To be drawn toward that story and character is, to me, functionally similar to talking to an AI chatbot. The only difference is that an AI chatbot can feel more responsive, and might have more options for customization.”</p>
<p>“I think this research is foundational to the future of relationships, but I don’t think people know enough about anime characters, or really media or parasocial relationships broadly, to see things the same way,” Leshner continued. “I’m going to keep going down this road to understand the parallels with AI and modern technologies, but I fully believe that this is an uphill battle for recognition.”</p>
<p>“I hope this work inspires people to look into why people might be attracted to anime characters more broadly. It feels like the average anime character is made to be conventionally attractive in a way that is not true of most animation. It might still be weird to someone with no knowledge of the field if they engage in this quick exercise, but I have the utmost confidence that the average person might say, ‘Well, although it is not for me, I can understand it better now.'”</p>
<p>The study, “<a href="https://psycnet.apa.org/doi/10.1037/ppm0000590" target="_blank" rel="noopener">You would not download a soulmate: Attributes of fictional characters that inspire intimate connection</a>,” was authored by Connor Leshner, Stephen Reysen, Courtney N. Plante, Sharon E. Roberts, and Kathleen C. Gerbasi.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-a-common-vaccine-appears-to-have-a-surprising-impact-on-brain-health/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists: A common vaccine appears to have a surprising impact on brain health</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 11th 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new scientific commentary suggests that annual influenza vaccination could serve as a practical and accessible strategy to help delay or prevent the onset of dementia in older adults. By mitigating the risk of severe cardiovascular events and reducing systemic inflammation, the seasonal flu shot may offer neurological protection that extends well beyond respiratory health. This perspective article was published in the journal <em>Aging Clinical and Experimental Research</em>.</p>
<p>Dementia poses a significant and growing challenge to aging societies worldwide, creating an urgent need for scalable prevention strategies. While controlling midlife risk factors like high blood pressure remains a primary focus, medical experts are looking for additional tools that can be easily integrated into existing healthcare routines.</p>
<p>Lorenzo Blandi from the Vita-Salute San Raffaele University and Marco Del Riccio from the University of Florence authored this analysis to highlight the potential of influenza vaccination as a cognitive preservation tool. They argue that the current medical understanding of the flu shot is often too limited. The researchers propose that by preventing the cascade of physical damage caused by influenza, vaccination can help maintain the brain’s vascular and cellular health.</p>
<p>The rationale for this perspective stems from the observation that influenza is not merely a respiratory illness. It is a systemic infection that can cause severe complications throughout the body. The authors note that influenza infection is associated with a marked increase in the risk of heart attacks and strokes in the days following illness.</p>
<p>These vascular events are known to contribute to cumulative brain injury. Consequently, Blandi and Del Riccio sought to synthesize existing evidence linking vaccination to improved cognitive outcomes. They posit that preventing these viral insults could modify the trajectory of dementia risk in the elderly population.</p>
<p>To support their argument, the authors detail evidence from four major epidemiological studies that demonstrate a link between receiving the flu shot and a lower incidence of dementia. The first piece of evidence cited is <a href="https://doi.org/10.3233/JAD-221036" target="_blank" rel="noopener">a 2023 meta-analysis</a>. This massive review aggregated data from observational cohort studies involving approximately 2.09 million adults.</p>
<p>The participants in these studies were followed for periods ranging from four to thirteen years. The analysis found that individuals who received influenza vaccinations had a 31 percent lower risk of developing incident dementia compared to those who did not.</p>
<p>The second key study referenced was a <a href="https://doi.org/10.3233/JAD-220361" target="_blank" rel="noopener">claims-based cohort study</a>. This research utilized propensity-score matching, a statistical technique designed to create comparable groups by accounting for various baseline characteristics. The researchers analyzed data from 935,887 matched pairs of older adults who were at least 65 years old.</p>
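<p>Propensity-score matching works by estimating each person's probability of receiving the treatment (here, vaccination) from baseline characteristics, then pairing treated and untreated individuals with similar scores. The sketch below illustrates the general idea only, with simulated data and a simple greedy nearest-neighbor matcher; it is not the cited study's actual pipeline, and all variable names are hypothetical.</p>

```python
# Illustrative propensity-score matching on simulated data (not the
# study's method): estimate P(vaccinated | covariates), then greedily
# pair each vaccinated person with the closest-scoring unvaccinated one.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000
age = rng.normal(72, 5, n)            # hypothetical baseline covariate
comorbidities = rng.poisson(2, n)     # hypothetical baseline covariate
treated = rng.binomial(1, 0.5, n)     # 1 = vaccinated (simulated)

X = np.column_stack([age, comorbidities])
pscore = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbor matching on the propensity score.
controls = {i: pscore[i] for i in np.flatnonzero(treated == 0)}
pairs = []
for i in np.flatnonzero(treated == 1):
    if not controls:
        break
    j = min(controls, key=lambda k: abs(controls[k] - pscore[i]))
    pairs.append((i, j))
    del controls[j]
print(len(pairs))  # number of matched pairs analyzed
```

Real claims-based studies typically use richer covariate sets, caliper constraints, and balance diagnostics, but the pairing logic is the same in spirit.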
<p>The results showed that those who had received an influenza vaccination had a 40 percent lower relative risk of developing Alzheimer’s disease over a follow-up period of roughly four years. The study calculated an absolute risk reduction of 3.4 percent, suggesting that for every 29 people vaccinated, one case of Alzheimer’s might be prevented during that timeframe.</p>
<p>The <a href="https://doi.org/10.1016/j.vaccine.2021.08.046" target="_blank" rel="noopener">third study</a> highlighted in the perspective used data from the Veterans Health Administration. This study was significant because it used time-to-event models to address potential biases related to when vaccinations occurred.</p>
<p>The researchers found that vaccinated older adults had a hazard ratio for dementia of 0.86. This statistic indicates a risk reduction of roughly 14 percent. The data also revealed a dose-response relationship. This means that the protective signal was strongest among participants who received multiple vaccine doses across different years and seasons, rather than just a single shot.</p>
<p>The fourth and final study cited was a <a href="https://doi.org/10.1038/s41541-024-00841-z" target="_blank" rel="noopener">prospective analysis of the UK Biobank</a>. This study modeled vaccination as an exposure that varies over time, allowing for a nuanced view of cumulative effects.</p>
<p>The researchers observed a reduced risk for all-cause dementia, with a hazard ratio of 0.83. The reduction in risk was even more pronounced for vascular dementia, showing a hazard ratio of 0.58. Similar to the veterans’ study, this analysis supported the idea of a dose-response relationship. The accumulation of vaccinations over time appeared to correlate with better cognitive outcomes.</p>
<p>Blandi and Del Riccio explain several biological mechanisms that could account for these protective effects. The primary pathway involves the prevention of vascular damage. Influenza infection is a potent trigger for inflammation and blood clotting.</p>
<p>Research shows that the risk of acute myocardial infarction can be six times greater in the first week after a flu infection. By preventing the flu, the vaccine likely prevents these specific vascular assaults. Since vascular health is closely tied to brain health, avoiding these events helps preserve cognitive reserve. The cumulative burden of small strokes or reduced blood flow to the brain is a major predictor of cognitive decline.</p>
<p>In addition to vascular protection, the authors discuss the role of neuroinflammation. Studies in animal models have shown that influenza viruses can trigger activation of microglia, which are the immune cells of the brain. This activation can lead to the loss of synapses and memory decline, even if the virus itself does not enter the brain.</p>
<p>Systemic inflammation caused by the flu can cross into the nervous system. The authors suggest that vaccination may dampen these inflammatory surges. There is also a hypothesis known as “trained immunity,” where vaccines might program the immune system to respond more efficiently to threats, reducing off-target damage to the brain.</p>
<p>Based on this evidence, the authors propose several policy changes and organizational strategies. They argue that public health messaging needs to be reconceptualized. Instead of framing the flu shot solely as a way to avoid a winter cold, health officials should present it as a measure to reduce heart attacks, strokes, and potential cognitive decline. This approach addresses the priorities of older adults, who often fear dementia and loss of independence more than respiratory illness.</p>
<p>The authors also recommend specific clinical practices. They suggest that health systems should prioritize the use of high-dose or adjuvanted vaccines for adults over the age of 65. These formulations are designed to overcome the weaker immune response often seen in aging bodies.</p>
<p>Additionally, the authors advocate for making vaccination a default part of hospital discharge procedures. When an older adult is leaving the hospital after a cardiac or pulmonary event, vaccination should be a standard component of their care plan. This would help close the gap between the known benefits of the vaccine and the currently low rates of uptake in many regions.</p>
<p>Despite the promising data, Blandi and Del Riccio acknowledge certain limitations in the current body of evidence. The majority of the data comes from observational studies. This type of research can identify associations but cannot definitively prove causality.</p>
<p>There is always a possibility of “healthy user bias,” where people who choose to get vaccinated are already more health-conscious and have better lifestyle habits than those who do not. While the studies cited used advanced statistical methods to control for these factors, residual confounding can still exist.</p>
<p>The authors also note that studies based on medical claims data can suffer from inaccuracies in how dementia is diagnosed and recorded. Furthermore, the precise biological mechanisms remain a hypothesis that requires further validation. The authors call for future research to include pragmatic randomized trials that specifically measure cognitive endpoints. They suggest that future studies should track biological markers of neuroinflammation in vaccinated versus unvaccinated groups to confirm the proposed mechanisms.</p>
<p>The study, “<a href="https://doi.org/10.1007/s40520-026-03323-5" target="_blank" rel="noopener">From breath to brain: influenza vaccination as a pragmatic strategy for dementia prevention</a>,” was authored by Lorenzo Blandi and Marco Del Riccio.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/staying-off-social-media-isnt-always-a-sign-of-a-healthy-social-life/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Staying off social media isn’t always a sign of a healthy social life</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 10th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research suggests that the way adolescents use social media is not a uniform experience but rather splits into distinct personality-driven profiles that yield varying social results. The findings indicate that digital platforms largely reinforce existing friendships rather than helping isolated youth build new connections. These results were published in the journal <em><a href="https://doi.org/10.1016/j.chb.2025.108880" target="_blank">Computers in Human Behavior</a></em>.</p>
<p>For years, psychologists have debated whether apps like Instagram, TikTok, and Snapchat help or harm adolescent development. Some theories propose that these platforms simulate meaningful connection and allow young people to practice social skills. Other perspectives argue that digital interactions replace face-to-face communication with superficial scrolling, leading to isolation.</p>
<p>However, most previous inquiries looked at average behaviors across large groups or focused on simple metrics like screen time. This approach often misses the nuance of individual habits. Real-world usage is rarely just about logging on or logging off. It involves a mix of browsing, posting, liking, and chatting.</p>
<p>Federica Angelini, the lead author from the Department of Developmental and Social Psychology at the University of Padova in Italy, worked with colleagues to move beyond these binary categories. They wanted to understand how specific combinations of online behaviors cluster together. They also sought to determine if a teenager’s underlying social motivations drive these habits.</p>
<p>The research team recognized that early adolescence is a formative period for social and emotional growth. During these years, close relationships with peers become central to a young person’s identity. Because these interactions now occur simultaneously in physical and digital spaces, the authors argued that science needs better models to capture this complexity.</p>
<p>To achieve this, the team tracked 1,211 Dutch students between the ages of 10 and 15 over the course of three years. They used surveys to measure how often students looked at content, posted about themselves, interacted with others, and shared personal feelings. The researchers also assessed the students’ psychological motivations, such as the fear of missing out or a desire for popularity.</p>
<p>Using a statistical technique called latent profile analysis, the investigators identified four distinct types of users. The largest group, comprising about 54 percent of the participants, was labeled “All-round users.” These teens engaged in a moderate amount of all activities, from scrolling to posting.</p>
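<p>Latent profile analysis groups people by their patterns across several continuous indicators. The sketch below illustrates the underlying idea using scikit-learn's Gaussian mixture model as a stand-in: the four indicator names and cluster shapes loosely mirror the study's profiles, but the data are simulated and this is not the authors' actual analysis.</p>

```python
# Illustrative only: latent profile analysis is closely related to
# fitting a Gaussian mixture over continuous survey indicators and
# choosing the number of profiles by an information criterion (BIC).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated scores on four usage indicators (viewing, posting,
# interacting, self-disclosing) for a hypothetical sample.
X = np.vstack([
    rng.normal([4, 4, 4, 3], 1.0, size=(540, 4)),  # "all-round" pattern
    rng.normal([1, 1, 1, 1], 0.5, size=(300, 4)),  # "low use" pattern
    rng.normal([4, 3, 3, 6], 1.0, size=(80, 4)),   # "high self-disclosing"
    rng.normal([3, 6, 2, 2], 1.0, size=(70, 4)),   # "high self-oriented"
])

# Fit mixtures with 1-6 components; lower BIC indicates a better
# trade-off between fit and complexity.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 7)}
best_k = min(bics, key=bics.get)
print(best_k)
```

In practice, latent profile analysis also weighs entropy, profile sizes, and interpretability, not BIC alone, before settling on a solution.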
<p>The study found that All-round users generally maintained moderate-to-high quality friendships throughout the three-year period. Their digital habits appeared to be an extension of a healthy offline social life. They used these platforms to keep in touch and share experiences with friends they already saw in person.</p>
<p>The second largest group, making up roughly 30 percent, was identified as “Low users.” These individuals rarely engaged with social media in any form, whether passive scrolling or active posting. While it might seem beneficial to be less dependent on screens, the data showed a different story for this specific group.</p>
<p>These Low users reported lower quality friendships at the start of the research compared to their peers. Their lack of online engagement appeared to mirror a lack of connection in the real world. Without a strong peer group to interact with, they had little motivation to log on. The data suggests they were not simply opting out of technology but were missing out on the social reinforcement that happens online.</p>
<p>A smaller group, about 8 percent, was termed “High self-disclosing users.” These adolescents frequently used digital platforms to share personal feelings, secrets, and emotional updates. They tended to prefer online communication over face-to-face talk.</p>
<p>This group scored higher on measures of anxiety and depression. The researchers suggest these teens might use the internet to compensate for difficulties in offline social situations. The reduced pressure of online chat, which lacks nonverbal cues like eye contact, may make it easier for them to open up. Despite their emotional struggles, this group maintained high-quality friendships, suggesting their vulnerability online helped sustain their bonds.</p>
<p>The final group, labeled “High self-oriented users,” made up roughly 7 percent of the sample. These teens focused heavily on posting content about themselves but showed less interest in what peers were doing. They were driven by a desire for status and attention.</p>
<p>Unlike the other groups, High self-oriented users were less concerned with the fear of missing out. Their primary goal appeared to be self-promotion rather than connection. Notably, this was the only group that saw a decline in the quality of their close friendships over the three years. Their focus on gaining an audience rather than engaging in reciprocal friendship likely failed to deepen their personal relationships.</p>
<p>The analysis revealed that social media generally acts as an amplifier of offline social dynamics. Teens with strong existing friendships used the platforms to maintain those bonds. Those with weaker connections did not seem to benefit from the technology.</p>
<p>This supports the idea that the benefits of social media rely heavily on pre-existing relationships. Adolescents who struggle socially in person may find it difficult to use these tools to build meaningful relationships from scratch. Instead of bridging the gap, the technology might leave them further behind.</p>
<p>The study also highlighted the role of motivation. Teens who used social media to seek status were more likely to fall into the self-oriented or self-disclosing categories. Those who simply wanted to stay in the loop tended to be All-round users.</p>
<p>There are limitations to consider regarding this research. The data relied on self-reported surveys, which can sometimes be inaccurate as people may not remember their habits perfectly. Additionally, the study was conducted in the Netherlands, so the results might not apply universally to adolescents in other cultural contexts.</p>
<p>The researchers noted that some participants dropped out of the study over the three years, which is common in longitudinal work. The study also did not strictly differentiate between friends met online versus friends met offline, though most participants indicated they communicated with people they knew in real life.</p>
<p>Future research could benefit from using objective measures, such as tracking app usage data directly from smartphones. It would also be beneficial to investigate how these profiles evolve as teens move into young adulthood. Understanding these patterns could help parents and educators tailor their advice, rather than giving generic warnings about screen time.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.chb.2025.108880" target="_blank">Adolescent social media use profiles: A longitudinal study of friendship quality and socio-motivational factors</a>,” was authored by Federica Angelini, Ina M. Koning, Gianluca Gini, Claudia Marino, and Regina J.J.M. van den Eijnden.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/moderate-coffee-and-tea-consumption-linked-to-lower-risk-of-dementia/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Moderate coffee and tea consumption linked to lower risk of dementia</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 10th 2026, 17:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new analysis of long-term dietary habits suggests that your daily cup of coffee or tea might do more than just provide a morning jolt. Researchers have determined that moderate consumption of caffeinated beverages is linked to a lower risk of dementia and better physical brain function over time. These results were published in the journal <em><a href="https://jamanetwork.com/journals/jama/article-abstract/2844764" target="_blank" rel="noopener">JAMA</a></em>.</p>
<p>Dementia and Alzheimer’s disease represent a growing health challenge as the global population ages. Current medical treatments offer limited benefits once symptoms appear, and they cannot reverse the condition. This reality has prompted medical experts to look for lifestyle habits that might delay the onset of cognitive decline. Diet is a primary area of focus because it is a factor that individuals can control in their daily lives.</p>
<p>Coffee and tea are of particular interest to nutritional scientists. These beverages contain chemical compounds that may protect brain cells from damage. These include caffeine and polyphenols, plant compounds with antioxidant properties.</p>
<p>Prior attempts to measure this potential benefit have yielded mixed results. Some earlier inquiries relied on participants remembering their dietary habits from the distant past. Others checked in with participants only once, failing to capture how habits change over a lifetime. To address these limitations, a team led by Yu Zhang and Daniel Wang from the Harvard T.H. Chan School of Public Health and Mass General Brigham undertook a more expansive approach.</p>
<p>The investigators analyzed data from two massive, long-running groups of medical professionals. The study included over 130,000 female nurses and male health professionals who provided updates on their health and diet for up to forty-three years. Unlike smaller snapshots of time, this project tracked dietary habits repeatedly. Participants filled out detailed questionnaires about what they ate and drank every two to four years.</p>
<p>This repeated-measures approach allowed the researchers to reduce errors associated with faulty recall. It also helped them calculate a cumulative average of caffeine intake over decades. The team looked for associations between these drinking habits and three specific outcomes: the clinical diagnosis of dementia, self-reported memory problems, and performance on objective cognitive tests.</p>
<p>The data revealed a distinct pattern regarding the consumption of caffeinated beverages. Individuals who drank caffeinated coffee had a lower chance of developing dementia compared to those who avoided it. The relationship followed a specific curve rather than a straight line.</p>
<p>The greatest reduction in risk appeared among people who drank approximately two to three cups of caffeinated coffee per day. Consuming more than this amount did not result in additional benefits, but it also did not appear to cause harm. This finding contradicts some earlier fears that high caffeine intake might be detrimental to the aging brain.</p>
<p>Tea drinkers saw similar benefits. Consuming one to two cups of tea daily was linked to a lower likelihood of dementia diagnosis. In contrast, the association was not statistically significant among those who drank decaffeinated coffee. This distinction suggests that caffeine itself may play a central role in the observed neuroprotection.</p>
<p>The study also looked at how well participants could think and remember as they aged. In a subset of the participants who underwent telephone-based testing, higher caffeinated coffee intake tracked with better scores on performance tasks. These tests measured verbal memory, attention, and executive function.</p>
<p>The difference in scores was roughly equivalent to being several months younger in terms of brain aging. Even among people who carried genes that usually increase the risk of Alzheimer’s, the link between caffeine and better brain health remained consistent. The researchers also assessed “subjective cognitive decline.” This is a stage where individuals feel they are having memory slips before a doctor can detect them. Higher caffeine intake was associated with fewer reports of these subjective problems.</p>
<p>These results add weight to a growing body of evidence linking caffeine to neurological health, although the findings do not perfectly align with every previous study. Recent analyses of the UK Biobank database likewise found that coffee drinkers had a lower risk of neurodegenerative conditions, but that research highlighted that <a href="https://www.psypost.org/unsweetened-coffee-associated-with-reduced-risk-of-alzheimers-and-parkinsons-diseases-study-finds/" target="_blank" rel="noopener">unsweetened coffee seemed most beneficial</a>.</p>
<p>The UK Biobank findings differed slightly regarding decaffeinated coffee. While the Harvard team found no link between decaf and dementia risk, the UK study suggested decaf might still offer some protection. This discrepancy implies that other compounds in coffee besides caffeine might play a role, or that different populations metabolize these beverages differently.</p>
<p>Other research utilizing brain imaging has offered clues about why this might happen. A study from the Australian Imaging, Biomarkers and Lifestyle study of aging found that higher coffee consumption was associated with a <a href="https://www.psypost.org/study-offers-new-evidence-that-drinking-coffee-can-protect-against-alzheimers-disease/" target="_blank" rel="noopener">slower buildup of amyloid proteins in the brain</a>. These proteins are the sticky clumps associated with Alzheimer’s disease.</p>
<p>The new Harvard study aligns with the theory that caffeine helps maintain neural networks. It supports the idea that moderate stimulation of the brain’s chemical receptors might reduce inflammation. Caffeine blocks specific receptors in the brain known as adenosine receptors. When these receptors are blocked, it affects the release of neurotransmitters and may reduce the stress on brain cells.</p>
<p>Researchers have also observed in animal models that caffeine can suppress the enzymes that create amyloid plaques. It appears to enhance the function of mitochondria, which are the power plants of the cell. By improving how brain cells use energy, caffeine might help them survive longer in the face of aging.</p>
<p>Additional context comes from the National Health and Nutrition Examination Survey in the United States. That separate analysis found that older adults who consumed more caffeine <a href="https://www.psypost.org/higher-caffeine-intake-linked-to-better-cognitive-function-in-older-u-s-adults-study-finds/" target="_blank" rel="noopener">performed better on tests of processing speed and attention</a>. The consistency of these findings across different populations strengthens the argument that caffeine has a measurable effect on cognition.</p>
<p>Despite the large sample size of the new Harvard analysis, the study has limitations inherent to observational research. It demonstrates an association but cannot definitively prove that coffee causes the reduction in dementia cases. It is possible that people who start to experience subtle cognitive decline naturally stop drinking coffee before they are diagnosed. This phenomenon is often called reverse causation.</p>
<p>The researchers attempted to account for this by conducting sensitivity analyses. They looked at the data in ways that excluded the years immediately preceding a diagnosis. The protective link remained, suggesting that reverse causation does not fully explain the results.</p>
<p>The participants in this study were primarily white medical professionals. This fact means the results might not apply perfectly to the general population or to other racial and ethnic groups. Additionally, the questionnaires did not distinguish between different preparation methods. The study could not separate the effects of espresso versus drip coffee, or green tea versus black tea.</p>
<p>Unmeasured factors could also be at play. Coffee drinkers might share other lifestyle habits that protect the brain, such as higher levels of social activity or different dietary patterns. The researchers used statistical models to adjust for smoking, exercise, and overall diet quality. However, observational studies can never fully eliminate the possibility of residual confounding variables.</p>
<p>Future research needs to clarify the biological mechanisms at play. Scientists must determine whether caffeine acts alone or in concert with other antioxidants found in these plants. Clinical trials that assign specific amounts of caffeine to participants could help confirm these observational findings.</p>
<p>The senior author of the study, Daniel Wang, noted the perspective needed when interpreting these results. “While our results are encouraging, it’s important to remember that the effect size is small and there are lots of important ways to protect cognitive function as we age,” Wang said. “Our study suggests that caffeinated coffee or tea consumption can be one piece of that puzzle.”</p>
<p>For now, the data suggests that a moderate coffee or tea habit is a generally healthy choice for the aging brain. Consumption of about two to three cups of coffee or one to two cups of tea appeared to provide the maximum potential benefit. This study provides reassurance that this common daily ritual does not harm cognitive function and may help preserve it.</p>
<p>The study, “<a href="https://jamanetwork.com/journals/jama/article-abstract/2844764" target="_blank" rel="noopener">Coffee and Tea Intake, Dementia Risk, and Cognitive Function</a>,” was authored by Yu Zhang, Yuxi Liu, Yanping Li, Yuhan Li, Xiao Gu, Jae H. Kang, A. Heather Eliassen, Molin Wang, Eric B. Rimm, Walter C. Willett, Frank B. Hu, Meir J. Stampfer, and Dong D. Wang.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/severe-teen-adhd-symptoms-predict-lower-income-and-higher-arrest-rates-by-age-40/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Severe teen ADHD symptoms predict lower income and higher arrest rates by age 40</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 10th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A longitudinal study in Christchurch, New Zealand found that individuals who displayed the most severe ADHD symptoms as adolescents were at an elevated risk of developing substance use disorder, depression, and suicidal ideation in early adulthood. They were also more likely to engage in crime and be unemployed. These individuals tended to have lower income and living standards, and less stable relationships. The paper was published in the <a href="https://doi.org/10.1192/bjp.2025.10475"><em>British Journal of Psychiatry</em></a>.</p>
<p>Attention-deficit/hyperactivity disorder, or ADHD, is a neurodevelopmental condition characterized by persistent patterns of inattention, hyperactivity, and/or impulsivity that interfere with daily functioning or development. It typically begins in childhood, although many individuals continue to experience symptoms into adolescence and adulthood. ADHD is most often diagnosed when a child starts school, as the behaviors it produces come into conflict with classroom rules.</p>
<p>ADHD is more commonly diagnosed in males, although females are often underdiagnosed due to less overt symptoms. Genetic factors play a major role in ADHD, and it often co-occurs with other conditions such as learning disorders, anxiety, depression, or oppositional behavior. Symptoms can significantly impair academic performance, work productivity, and social relationships.</p>
<p>Study author James A. Foulds and his colleagues used data from a 40-year longitudinal study of a birth cohort in Christchurch, New Zealand, to estimate the association between ADHD symptoms in adolescence and a broad range of mental health and psychosocial outcomes in early adulthood, up to 40 years of age.</p>
<p>Data used in this analysis came from the Christchurch Health and Development Study. This study enrolled 1,265 individuals born in Christchurch in 1977 and assessed them annually from birth to 16 years of age. After that, data were collected when participants were 18, 21, 25, 30, 35, and 40 years of age. In the final three data collection waves, 75–80% of surviving study participants provided their data.</p>
<p>Data used in this analysis were assessments of ADHD symptoms, conduct disorder, and oppositional defiant disorder symptoms when participants were 14–16 years of age. From data collected between the ages of 16 and 40, the study authors used information on substance use disorders (alcohol and cannabis), illicit drug use, and internalizing mental health problems.</p>
<p>Internalizing mental health problems are psychological difficulties characterized by inwardly directed distress, such as anxiety, depression, and withdrawal. From data collected between the ages of 25 and 40, this analysis used information on participants’ unemployment (lasting at least 3 months), relationship breakdowns, income, and home ownership.</p>
<p>Results showed that the 25% of participants with the most severe ADHD symptoms in adolescence were more likely to smoke tobacco (34% vs 15%) and to meet criteria for alcohol use disorder (26% vs 14%) and cannabis use disorder (18% vs 7%), compared to participants with less severe or no ADHD symptoms.</p>
<p>These individuals also more often met the criteria for major depression (29% vs 19%), anxiety disorders, and suicidal ideation. They were more likely to have been arrested (9% vs 3%), more likely to have engaged in both violent and property crime, and were more often unemployed. Participants who had the most severe ADHD symptoms as adolescents owned their homes less often and tended to have lower personal income. They more often reported breakdowns of relationships.</p>
<p>“Higher levels of adolescent ADHD symptoms are associated with substance use problems and criminal offending in adulthood. Long-term secondary prevention activities are needed to detect and manage coexisting problems among adults with a history of ADHD,” the study authors concluded.</p>
<p>The study sheds light on the links between ADHD symptoms in adolescence and key life outcomes in adulthood. However, it should be noted that the study was conducted on a group of individuals born in a single city (Christchurch) in the same year (1977), meaning that the observed associations may be affected by cultural and social specificities of Christchurch during the studied period. Results in other cultures and other historical periods may differ.</p>
<p>The study authors also note that only five participants were prescribed stimulant medication for their ADHD. This contrasts with the present day, when people with ADHD receive medication for their condition far more often. Finally, it remains unknown how much of the association with these outcomes is due to ADHD symptoms rather than to co-occurring conditions such as autism spectrum disorder, which was largely undiagnosed in the 1970s.</p>
<p>The paper, “<a href="https://doi.org/10.1192/bjp.2025.10475">Long-term outcomes associated with adolescent ADHD symptomatology: birth cohort study,</a>” was authored by James A. Foulds, Joseph M. Boden, Jessica A. Kerr, Katie M. Douglas, Michaela Pettie, Jesse T. Young, Mairin R. Taylor, Katherine Donovan, and Richard Porter.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/physical-distance-shapes-moral-choices-in-sacrificial-dilemmas/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Physical distance shapes moral choices in sacrificial dilemmas</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 10th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>When people feel physically closer to someone who could be harmed, they are less willing to sacrifice that person for the greater good, according to a new finding reported in <a href="https://doi.org/10.1080/02699931.2025.2484358"><em>Cognition &amp; Emotion</em></a>.</p>
<p>Moral dilemmas, situations where any available option violates an important moral value, have been used to study how people balance rules like “do not harm” against outcomes like saving more lives. Classic examples such as the <a href="https://www.psypost.org/three-problems-with-using-the-trolley-dilemma-in-moral-philosophy/">trolley and footbridge dilemmas</a> show that people often reject utilitarian solutions when harm requires direct physical contact, suggesting that emotional responses play an important role in moral judgment.</p>
<p>The trolley dilemma is a thought experiment that asks whether it is morally permissible to pull a lever to divert a runaway train, sacrificing one person on a side track to save five people on the main line. The footbridge dilemma modifies this scenario by asking if one would physically push a large person off a bridge to stop the train, rather than using a mechanical switch. </p>
<p>Federica Alfeo and colleagues were motivated by an open question in this literature: is it the <em>type of action</em> (e.g., pushing versus pulling a lever), or the <em>physical closeness</em> to the victim, that drives these moral choices? Building on theories of psychological distance and prior work on emotion in decision-making, the authors set out to disentangle how proximity itself shapes moral judgments and emotional reactions.</p>
<p>The researchers conducted two studies using computer-based, interactive moral dilemmas modeled on the footbridge scenario. The scenarios were presented from a first-person perspective, allowing participants to experience the unfolding situation as if they themselves were at the scene.</p>
<p>In Study 1, 261 participants responded to scenarios that required different actions implying different levels of physical proximity to a victim: pushing someone directly, using a gun, or pulling a lever that opened a trapdoor. Participants made a forced choice between a deontological option (letting five people die) and a utilitarian option (sacrificing one person), while their response times were recorded.</p>
<p>After each scenario, participants estimated how physically close they felt to the victim using a visual distance scale. They also rated their emotional responses using standardized ratings, spanning negative emotions (e.g., fear, anger, sadness), moral emotions (e.g., guilt, shame, regret), and positive or neutral emotions. Importantly, emotions were assessed both for the option participants chose (factual emotions) and for the option they rejected (counterfactual emotions).</p>
<p>In Study 2, the researchers tested 46 additional participants to further isolate the effect of proximity. Here, the action remained constant across all scenarios (pulling a lever), while only the visual distance to the victim was manipulated. This design allowed the authors to examine whether perceived proximity alone, without changing the action, was sufficient to alter moral choices and emotional reactions.</p>
<p>Across both studies, participants reliably perceived the intended differences in physical distance, confirming that the proximity manipulations worked as designed. In Study 1, moral choices varied systematically with proximity. Participants were less willing to endorse the utilitarian option when the scenario required closer physical contact with the potential victim.</p>
<p>When harm felt more immediate and personal, participants tended to favor deontological choices, even when those choices resulted in worse overall outcomes. Scenarios implying greater distance, by contrast, were associated with a higher likelihood of sacrificing one person to save five.</p>
<p>Emotional responses mirrored these decision patterns. Negative emotions and moral emotions, including guilt, shame, regret, and disappointment, were strongest in high-proximity scenarios and weakest when the victim was farther away. Importantly, emotions associated with the unchosen alternative were consistently more intense than emotions linked to the chosen action.</p>
<p>This pattern suggests that participants anticipated the emotional consequences of both options and tended to choose the one expected to minimize emotional distress. Response times did not meaningfully differ across proximity levels, indicating that emotional intensity rather than deliberation time distinguished the scenarios.</p>
<p>Study 2 replicated and clarified these effects while holding the action constant. Even when participants always performed the same action, greater perceived distance increased utilitarian responding, whereas closer proximity reduced it. Emotional patterns showed a similar structure, with proximity amplifying negative and moral emotions and counterfactual emotions again exceeding factual ones.</p>
<p>Together, these findings show that physical closeness itself, not just the type of action, plays a central role in moral decision-making.</p>
<p>These findings are based on hypothetical, computer-based dilemmas, which may not fully capture how people behave in real-world moral situations involving genuine stakes and consequences.</p>
<p>The research, “<a href="https://doi.org/10.1080/02699931.2025.2484358">The closer you are, the more it hurts: the impact of proximity on moral decision-making</a>,” was authored by Federica Alfeo, Antonietta Curci, and Tiziana Lanciano.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/does-sexual-activity-before-exercise-harm-athletic-performance/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Does sexual activity before exercise harm athletic performance?</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 10th 2026, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research published in the journal <em><a href="https://doi.org/10.1016/j.physbeh.2025.115203" target="_blank">Physiology &amp; Behavior</a></em> provides evidence that sexual activity shortly before high-intensity exercise does not harm athletic performance. The study suggests that masturbation-induced orgasm 30 minutes prior to exertion may actually enhance exercise duration and reaction time. These findings challenge long-standing beliefs regarding the necessity of sexual abstinence before athletic competition.</p>
<p>The motivation for the new study stems from a persistent debate in the sports world. Coaches and athletes have frequently adhered to the idea that sexual activity drains energy and reduces aggression. This belief has led to common recommendations for abstinence in the days leading up to major events. Diego Fernández-Lázaro from the University of Valladolid led a research team to investigate whether these restrictions are scientifically justified.</p>
<p>Previous scientific literature on this topic has been inconsistent or limited in scope. Many prior studies focused on sexual activity occurring the night before competition, leaving a gap in knowledge regarding immediate effects. Fernández-Lázaro and his colleagues aimed to examine the physiological and performance outcomes of sexual activity that occurs less than an hour before maximal effort.</p>
<p>To conduct the investigation, the researchers recruited 21 healthy, well-trained male athletes. The participants included basketball players, long-distance runners, and boxers. The average age of the volunteers was 22 years. The study utilized a randomized crossover design to ensure robust comparisons. This means that every participant completed both the experimental condition and the control condition.</p>
<p>In the control condition, participants abstained from any sexual activity for at least seven days. On the day of testing, they watched a neutral documentary film for 15 minutes before beginning the exercise assessments. In the experimental condition, the participants engaged in masturbation to orgasm in a private setting 30 minutes before the tests. They viewed a standardized erotic film to facilitate this process. Afterward, they watched the same neutral documentary to standardize the rest period.</p>
<p>The researchers employed two primary physical tests to measure performance. The first was an isometric handgrip strength test using a dynamometer. The second was an incremental cycling test performed on a stationary bike. The cycling test began at a set resistance and increased in difficulty every minute until the participant could no longer continue. This type of test is designed to measure aerobic capacity and time to exhaustion.</p>
<p>In addition to physical performance, the team collected blood samples to analyze various biomarkers. They looked for changes in hormones such as testosterone, cortisol, and luteinizing hormone. They also measured markers of muscle damage, including creatine kinase and lactate dehydrogenase. Inflammatory markers like C-reactive protein were also assessed to see if sexual activity placed additional stress on the body.</p>
<p>The results indicated that sexual activity did not have a negative impact on physical capabilities. The participants demonstrated a small but statistically significant increase in the total duration of the cycling test following sexual activity compared to the abstinence condition. This improvement represented a 3.2 percent increase in performance time.</p>
<p>The researchers also observed changes in handgrip strength. The mean strength values were slightly higher in the group that had engaged in sexual activity. This suggests that the neuromuscular system remained fully functional and perhaps slightly primed for action.</p>
<p>Physiological monitoring revealed that heart rates were higher during the exercise sessions that followed sexual activity. This elevation in heart rate aligns with the activation of the sympathetic nervous system. This system is responsible for the “fight or flight” response that prepares the body for physical exertion.</p>
<p>Hormonal analysis provided further insight into the body’s response. The study found that concentrations of both testosterone and cortisol were higher after sexual activity. Testosterone is an anabolic hormone associated with strength and aggression. Cortisol is a stress hormone that helps mobilize energy stores. The simultaneous rise in both hormones indicates a state of physiological activation rather than a state of fatigue.</p>
<p>The study also examined markers of muscle damage to see if the combination of sex and exercise caused more tissue stress. The findings showed that levels of lactate dehydrogenase were actually lower in the sexual activity condition. This specific enzyme leaks into the blood when muscle cells are damaged or stressed. The reduction suggests that the pre-exercise sexual activity did not exacerbate muscle stress and may have had a protective or neutral effect.</p>
<p>Other markers of muscle damage, such as creatine kinase and myoglobin, showed no significant differences between the two conditions. Similarly, inflammatory markers like interleukin-6 remained stable. This implies that the short-term physiological stress of sexual activity does not compound the stress caused by the exercise itself.</p>
<p>These findings diverge from some historical perspectives and specific past studies. For example, a study by Kirecci and colleagues reported that sexual intercourse within 24 hours of exercise <a href="https://doi.org/10.1136/postgradmedj-2020-139033" target="_blank">reduced lower limb strength</a>. The current study contradicts that conclusion by showing maintained or improved strength. The difference may lie in the specific timing or the nature of the sexual activity, as the current study focused on masturbation rather than partnered intercourse.</p>
<p>The results align more closely with a body of research <a href="https://doi.org/10.1038/s41598-022-19882-2" target="_blank">summarized by Zavorsky</a> and <a href="https://www.psypost.org/physical-exercise-performance-is-not-affected-by-having-sex-the-night-before/" target="_blank">others</a>. Those reviews generally concluded that sexual activity the night before competition has little to no impact on performance. The current study builds on that foundation by narrowing the window to just 30 minutes. It provides evidence that even immediate pre-competition sexual activity is not detrimental.</p>
<p>The researchers propose that the observed effects are likely due to a “priming” mechanism. Sexual arousal activates the sympathetic nervous system and triggers the release of catecholamines. This physiological cascade resembles a warm-up. It increases heart rate and alertness, which may translate into better readiness for immediate physical exertion.</p>
<p>The psychological aspect of the findings is also worth noting. The participants did not report any difference in their perceived rate of exertion between the two conditions. This means the exercise did not feel harder after sexual activity, even though their heart rates were higher. This consistency suggests that motivation and psychological fatigue were not negatively affected.</p>
<p>There are limitations to this study that affect how the results should be interpreted. The sample consisted entirely of young, well-trained men. Consequently, the findings may not apply to female athletes, older adults, or those with lower fitness levels. The physiological responses to sexual activity can vary across these different demographics.</p>
<p>The study restricted sexual activity to masturbation to maintain experimental control. Partnered sexual intercourse involves different physical demands and psychological dynamics. Intercourse often requires more energy expenditure and triggers the release of oxytocin, a bonding hormone that might promote relaxation or sedation differently than masturbation does.</p>
<p>The sample size of 21 participants is relatively small, although adequate for a crossover design of this nature. Larger studies would be needed to confirm these results and explore potential nuances. The study also relied on a one-week washout period between trials. While this is standard, residual psychological effects from the first session cannot be entirely ruled out.</p>
<p>Future research should aim to include female participants to determine if similar hormonal and performance patterns exist. It would also be beneficial to investigate different time intervals between sexual activity and exercise. Understanding the effects of partnered sex versus masturbation remains a key area for further exploration.</p>
<p>The study provides evidence that the “abstinence myth” may be unfounded for many athletes. The data indicates that sexual activity 30 minutes before exercise does not induce fatigue or muscle damage. Instead, it appears to trigger a neuroendocrine response that supports physical performance. Athletes and coaches may need to reconsider strict abstinence policies based on these physiological observations.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.physbeh.2025.115203" target="_blank">Sexual activity before exercise influences physiological response and sports performance in high-level trained men athletes</a>,” was authored by Diego Fernández-Lázaro, Manuel Garrosa, Gema Santamaría, Enrique Roche, José María Izquierdo, Jesús Seco-Calvo, and Juan Mielgo-Ayuso.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/neuroimaging-data-reveals-a-common-currency-for-effective-communication/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Neuroimaging data reveals a “common currency” for effective communication</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 10th 2026, 11:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1093/pnasnexus/pgaf287" target="_blank">PNAS Nexus</a></em> has found that specific patterns of brain activity can predict the success of persuasive messages across a wide variety of contexts. By analyzing neuroimaging data from over 500 individuals, researchers identified that neural responses in regions associated with reward and social processing are consistent indicators of how effective a message will be. These findings suggest that the human brain utilizes a common set of mechanisms to evaluate persuasive content.</p>
<p>Diverse fields such as marketing, political science, and public health rely heavily on the ability to influence attitudes and behaviors through mass media. Practitioners and scientists have long sought to understand exactly what makes a message persuasive enough to change a mind or prompt an action. </p>
<p>Previous research on this topic has typically been isolated within specific disciplines, preventing the development of a unified theory that applies across different topics. This fragmentation makes it difficult to know if the psychological drivers behind a successful anti-smoking ad are the same as those driving a popular movie trailer. The authors of the current study aimed to bridge this gap by applying a standardized analytical framework to a large collection of existing datasets.</p>
<p>“Persuasive messages—like those used in marketing, politics, or public health campaigns—play a key role in shaping attitudes and influencing behavior. But what exactly makes these messages effective, and do the same processes apply across different contexts? We don’t fully know, because research on persuasion tends to stay within individual disciplines, with little cross-talk,” explained the corresponding authors, Christin Scholz, Hang-Yee Chan, and Emily B. Falk.</p>
<p>“If we could identify common processes, different fields could work together more efficiently to understand what really drives persuasion. In this study, we examine neuroimaging data collected in response to a variety of persuasive messages. MRI brain images offer a way to observe and compare patterns of brain activity across different contexts. By conducting a mega-analysis of 16 datasets, we aimed to uncover broader patterns in how the brain responds to persuasive messages—patterns that individual studies might overlook.”</p>
<p>The research team conducted a mega-analysis, which differs from a traditional meta-analysis by aggregating and re-processing the raw data from multiple studies rather than simply summarizing their published results. They pooled functional magnetic resonance imaging (fMRI) data from 16 distinct experiments conducted by the co-authors. This combined dataset included 572 participants who were exposed to a total of 739 different persuasive messages.</p>
<p>The scope of the messages was broad, covering topics such as public health promotion, crowdfunding projects, commercial products, and video, text, or image-based advertisements. The total dataset comprised 21,688 individual experimental trials. In each of the original studies, participants lay inside an MRI scanner while viewing the messages. The scanner recorded changes in blood flow to various parts of the brain, which serves as a proxy for neural activity.</p>
<p>After viewing the content, the participants provided their own evaluations of the messages. They typically answered survey questions about how much they liked the message or whether they intended to change their behavior. The researchers categorized these self-reported measures as “message effectiveness in individuals.”</p>
<p>To assess the real-world impact of the content, the team also gathered data on how independent, larger groups of people responded to the same messages. These measures were termed “message effectiveness at scale.” This category included objective behavioral metrics like click-through rates on web banners, the amount of money donated to a campaign, or total view counts on video platforms.</p>
<p>The researchers then used linear mixed-effects models to test if brain activity in specific regions could predict both the individuals’ ratings and the large-scale behavioral outcomes. They focused their analysis on two primary neural systems: the reward system and the mentalizing system. The reward system is involved in anticipating value and pleasure, while the mentalizing system helps individuals understand the thoughts and feelings of others.</p>
<p>The statistical analysis revealed that activity in brain networks associated with reward processing was positively linked to message effectiveness. When participants showed higher engagement in the ventral tegmental area and nucleus accumbens, they were more likely to rate the messages as effective. These regions are deep structures in the brain that are typically involved in processing personal value and motivation. The study indicates that this neural signal of value is a consistent predictor of how well a message is received by the viewer.</p>
<p>The researchers also identified a strong connection between message success and activity in the brain’s mentalizing system. This network includes the medial prefrontal cortex and the temporal poles. These areas are active when people think about themselves or attempt to interpret the mental states of other people. The analysis showed that messages triggering this social processing network were more likely to be effective both for the person watching and for larger audiences.</p>
<p>A notable finding emerged when the researchers compared brain data to the real-world success of the messages at the population level. They found that neural activity in the mentalizing system predicted population-level outcomes, such as how often a video was shared. This predictive power held true even after accounting for the participants’ stated opinions in surveys. This suggests that the brain registers social relevance in ways that individuals may not consciously articulate.</p>
<p>The study refers to this phenomenon as “neuroforecasting.” This concept posits that neural activity in a small group of people can forecast the behavior of a much larger population. The findings support the idea that specific brain responses are more generalizable to the public than subjective self-reports. While people might say they like a message, their neural activity related to social processing appears to be a better indicator of whether that message will resonate with others.</p>
<p>“On average, the specific brain activity we tracked explained a small but robust portion of why messages were effective, roughly translating to what researchers call a small effect size (Cohen’s d = 0.22) at the population level,” the researchers told PsyPost. “We found this effect when looking at our large set of over 700 diverse messages as a whole. You could understand these neural markers as a ‘common currency’ that helps explain persuasion across many different real-world domains. However, the effect sizes also vary across message domains. Explaining that variance is an important task for the field going forward.”</p>
<p>“In a way, it is surprising that we were able to find any commonality in the neural processes related to message effectiveness across the messages we included. These messages did not only vary in their persuasive goals (from selling products, to recruiting volunteers, to promoting smoking cessation), but also in their format (videos, text, and more), and in the way their effectiveness was evaluated (click-through rates of online campaigns, self-report surveys, etc.).” </p>
<p>“This introduces a lot of noise in the analysis. Yet, we were still able to pick up on some common, underlying processes that support persuasion. This suggests that the ways in which we change our minds and behavior are, at least in part, similar across a variety of domains.”</p>
<p>Beyond the initial hypotheses regarding reward and social processing, an exploratory review of the whole brain uncovered additional patterns. Activity in regions linked to language processing and emotion also correlated with message success at scale. This implies that successful messages tend to engage the brain’s linguistic and emotional centers more deeply than less effective content. These exploratory findings suggest that emotion may play a larger role in mass-market success than previously identified in smaller studies.</p>
<p>“While we hypothesized that reward and social systems would be central, we were surprised to find through exploratory analysis that language processing and emotional brain responses also played significant roles in message success,” Scholz, Chan, and Falk said.</p>
<p>“Interestingly, our results suggested that neural signals related to emotion were particularly strong indicators of message effectiveness at scale—meaning for large groups—rather than just for individuals. We also found it notable that social processing activity in the brain provided ‘hidden’ information about a message’s success that participants didn’t realize they were feeling or mention in their self-reports.”</p>
<p>As with all research, there are some limitations. Most of the data came from participants in Western, educated, industrialized, rich, and democratic (WEIRD) societies. Cultural norms heavily influence communication and social processing, so these neural markers might differ in other populations. The study is also correlational, meaning it observes associations but cannot prove that brain activity directly causes the messages to be effective.</p>
<p>Technical differences between the original studies also presented challenges for the analysis. The sixteen datasets used varied scanning parameters, equipment, and experimental protocols. While the mega-analysis approach helps smooth out some noise, these inconsistencies make it difficult to identify specific factors that might strengthen or weaken the observed effects. </p>
<p>“These neural markers should be seen as a first step toward experimental work,” the researchers noted. “We need more work, for instance, to interpret the exact psychological and thought processes that are responsible for creating the neural patterns we observed. A brain scanner is not a ‘mind-reading’ tool.”</p>
<p>Future work is needed to move from prediction to explanation. The researchers propose designing experiments that specifically manipulate message content to target the identified brain regions. Such studies could verify whether activating the reward or social processing systems intentionally leads to better outcomes. </p>
<p>“A major goal is to move from observing these brain patterns to conducting experiments that specifically design messages to activate these reward and social mechanisms to see if they become more effective,” Scholz, Chan, and Falk explained. “We also need to diversify our samples to include a broader range of global populations to ensure our findings apply to everyone. Finally, we hope to coordinate as a field to standardize how neuroimaging data is collected across different domains to make future large-scale collaborations even more powerful.”</p>
<p>“This project was a massive collaborative effort involving 16 functional MRI datasets, over 500 participants, and more than 700 unique messages. Because we believe in the importance of open science, we have made our data and analysis code publicly available so other researchers can build on these findings. We hope this study serves as a bridge between neuroscience, communication, and public policy to create more effective and beneficial messaging for society.”</p>
<p>The study, “<a href="https://doi.org/10.1093/pnasnexus/pgaf287" target="_blank">Brain activity explains message effectiveness: A mega-analysis of 16 neuroimaging studies</a>,” was authored by Christin Scholz, Hang-Yee Chan, Jeesung Ahn, Maarten A. S. Boksem, Nicole Cooper, Jason C. Coronel, Bruce P. Doré, Alexander Genevsky, Richard Huskey, Yoona Kang, Brian Knutson, Matthew D. Lieberman, Matthew Brook O’Donnell, Anthony Resnick, Ale Smidts, Vinod Venkatraman, Khoi Vo, René Weber, Carolyn Yoon, and Emily B. Falk.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-research-connects-the-size-of-the-beauty-market-to-male-parenting-effort/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New research connects the size of the beauty market to male parenting effort</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Feb 10th 2026, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research suggests that the size of a country’s cosmetics industry may be directly linked to how much fathers contribute to childcare and the level of economic inequality within that society. The findings propose that in cultures where men are active parents or where the gap between the rich and poor is wide, women are more likely to invest in their appearance to compete for partners. These results were published in the journal <em><a href="https://doi.org/10.1016/j.evolhumbehav.2025.106751" target="_blank">Evolution and Human Behavior</a></em>.</p>
<p>Charles Darwin originally proposed the theory of sexual selection to explain why males of many species possess exaggerated physical traits. He observed that peafowl are sexually dimorphic, meaning the males and females look different. The peacock displays a massive, colorful tail to attract a mate, while the peahen remains relatively plain.</p>
<p>This dynamic typically arises from the biological costs of reproduction. In most species, females expend more biological energy through the production of eggs, gestation, and lactation. Because their investment in each offspring is higher, females tend to be the choosier sex. Males must consequently compete with one another to be selected.</p>
<p>Humans, however, do not always fit neatly into this classical model. Human females often utilize conspicuous traits or cultural enhancements, such as makeup, to increase their attractiveness. Jun-Hong Kim, a researcher at the Pohang University of Science and Technology in the Republic of Korea, sought to explain this exception.</p>
<p>Kim aimed to determine if human mating follows a “revised” sexual selection theory. This framework suggests that the direction of mate choice depends on which partner contributes more resources to the relationship. If males provide substantial care and support, they become a limited and sought-after resource.</p>
<p>When men invest heavily in parenting, the cost of reproduction becomes high for them as well. The theory predicts that under these conditions, men will become more discriminating in their choice of partner. Consequently, women may compete for these high-investment males by enhancing their physical appearance.</p>
<p>The researcher also considered the role of economic environment. In societies with high economic inequality, a partner with resources can provide a substantial advantage in survival and reproductive success. This suggests that financial stratification might also intensify female competition for high-status mates.</p>
<p>To test these hypotheses, Kim conducted a cross-cultural analysis involving data from up to 55 countries. The study used the total financial size of the cosmetics industry in each nation as a proxy for female ornamentation and male choice. This data was sourced from Euromonitor, excluding baby products and men’s grooming items.</p>
<p>The researcher needed a way to measure how much fathers contribute to family life across different cultures. Kim utilized data from the OECD regarding the ratio of unpaid domestic work and childcare hours performed by women versus men. A lower ratio indicates that men are doing a larger share of the domestic work.</p>
<p>Economic inequality was measured using income inequality data from the CIA and a social mobility index from the World Economic Forum. These metrics helped determine how difficult it is to move between economic classes. The study also controlled for factors like urbanization and Gross Domestic Product per capita.</p>
<p>Kim’s analysis revealed a strong association between paternal effort and the beauty market. In countries where men performed a higher proportion of childcare and domestic labor, per capita spending on cosmetics was higher. This supports the idea that when men are active caregivers, they become “prizes” that warrant increased mating effort from women.</p>
<p>The study quantified this relationship with specific monetary figures. The data indicated that for each one-hour increase in paternal investment relative to maternal investment, per capita spending on cosmetics rose by roughly $2.17. This trend held true even when accounting for the general wealth of the nation.</p>
<p>Economic disparity also emerged as a strong predictor of beauty spending. The analysis showed that as income inequality and social mobility scores increased, so did the size of the cosmetics industry. This suggests that in stratified societies, women may invest more in their appearance to attract partners who can offer financial security.</p>
<p>The study posits that this behavior is a form of mutual mate choice. Unlike many mammals where one sex is clearly the chooser and the other is the competitor, humans appear to engage in a bidirectional assessment. Men evaluate potential partners based on cues of fitness and fertility, which cosmetics can highlight.</p>
<p>Kim also tested other variables that frequently appear in evolutionary psychology literature. One such variable was the operational sex ratio, which compares the number of marriageable men to women. Previous theories suggested that a surplus of women would lead to higher competition and beauty spending.</p>
<p>However, the results for sex ratio were not statistically significant in this model. The density of the population also failed to predict variations in cosmetics use. The primary drivers remained paternal investment and economic stratification.</p>
<p>The researcher checked for geographic clustering to ensure the results were not simply due to neighboring countries acting similarly. Visualizing the data on maps showed no distinct regional patterns that would skew the statistics. This suggests the link between parenting, economics, and cosmetics is not merely a byproduct of shared regional culture.</p>
<p>There are limitations to this type of cross-cultural research. The study relies on observational data, which can identify associations but cannot definitively prove causation. It is possible that other unmeasured cultural factors influence both how men parent and how women spend money.</p>
<p>The measurement of paternal investment was also restricted by data availability. Because the study relied on OECD time-use surveys, the analysis regarding childcare was limited to developed nations. This reduces the ability to generalize the findings to non-industrialized or developing societies.</p>
<p>Kim also notes that unpaid work hours are an imperfect proxy for total paternal investment. This metric does not capture the quality of care or the emotional support provided by fathers. It focuses strictly on the time spent on domestic tasks.</p>
<p>Future research could address these gaps by using more direct measures of parenting effort. Kim suggests that standardized surveys across a wider range of cultures could provide granular detail on how fathers contribute. This would allow for a more robust test of the revised sexual selection theory.</p>
<p>The study provides a new lens through which to view the multi-billion dollar beauty industry. Rather than seeing cosmetics solely as a product of modern marketing, the research frames them as tools in an ancient biological strategy. It highlights how economic structures and family dynamics shape human behavior.</p>
<p>This perspective challenges the stereotype that sexual selection is always male-driven. It underscores that in humans, the high cost of raising children makes distinct demands on both parents. When men step up as caregivers, the dynamics of attraction and competition appear to shift in measurable ways.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.evolhumbehav.2025.106751" target="_blank">Paternal investment and economic inequality predict cross-cultural variation in male choice</a>,” was authored by Jun-Hong Kim.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>