<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/how-you-bet-after-a-win-may-depend-on-your-personality-and-intelligence/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">How you bet after a win may depend on your personality and intelligence</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 20th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study analyzing real-world betting behavior reveals that individual psychological traits can predict how people handle their money following a win. Researchers found that after a winning day, bettors tend to wager more money and play again sooner, but this tendency is influenced by their intelligence, conscientiousness, and extraversion. The research was published in the <em><a href="https://doi.org/10.1016/j.jrp.2025.104669" target="_blank" rel="noopener">Journal of Research in Personality</a></em>.</p>
<p>Many financial choices are guided by mental shortcuts and emotional impulses rather than by pure logic. One well-documented pattern is the “house money effect,” where a person who has recently won money becomes more willing to take risks. The phenomenon gets its name from the casino environment, where gamblers may feel they are playing with the “house’s” money, not their own, making potential losses feel less significant.</p>
<p>While this effect has been observed in laboratory settings, its operation in the complex environment of real-life financial decisions is less understood. A team of researchers from several Finnish institutions, including the Finnish Institute for Health and Welfare and the University of Turku, sought to investigate this bias using a uniquely comprehensive dataset. They wanted to see if the house money effect appears in actual betting patterns and, more specifically, if stable individual traits like intelligence and personality could moderate a person’s susceptibility to it.</p>
<p>To conduct their analysis, the researchers combined information from three large-scale, independent sources. The first was a year’s worth of anonymized online horse betting data from Finland’s state-owned betting agency, covering every wager made by thousands of individuals. The second was a national administrative registry from Statistics Finland, which provided a range of socioeconomic details for the same people. The third source was data from the Finnish Defence Forces, which administers cognitive ability and personality tests to all male conscripts.</p>
<p>By linking these datasets, the team created a final sample of 11,220 men between the ages of 36 and 54. This approach allowed them to connect objective, real-world betting behavior to psychological traits measured years, or even decades, earlier. The researchers defined the house money effect in two ways: an increase in the amount of money wagered on a given day and a shorter time between betting sessions. They then used statistical models to see if having won on the previous betting day predicted these behaviors.</p>
<p>The analysis first confirmed the presence of the house money effect within this large group of bettors. Individuals who had a net gain on their previous day of betting tended to wager more money on their next session. On average, a win was associated with a 45 percent increase in the amount wagered the next time they played. These bettors also returned to place new bets more quickly, with a prior win shortening the time until the next session by an average of about 16 percent.</p>
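<p>As a rough illustration (not the authors’ actual models, which were regression-based), the basic quantity behind this kind of finding can be sketched from per-day betting logs. The data and field names below are invented:</p>

```python
# Hypothetical sketch of how a house money effect could be read off
# per-day betting logs. Field names and figures are invented for
# illustration; the study itself fit regression models to real data.

def house_money_ratio(days):
    """days: chronological list of dicts with 'wagered' (amount staked that
    day) and 'net' (that day's win or loss). Returns the mean next-day wager
    ratio after winning days divided by the same ratio after losing days."""
    after_win, after_loss = [], []
    for prev, nxt in zip(days, days[1:]):
        ratio = nxt["wagered"] / prev["wagered"]
        (after_win if prev["net"] > 0 else after_loss).append(ratio)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(after_win) / mean(after_loss)

days = [
    {"wagered": 20.0, "net": 15.0},   # a winning day...
    {"wagered": 30.0, "net": -10.0},  # ...followed by a larger wager
    {"wagered": 25.0, "net": 5.0},
    {"wagered": 35.0, "net": -5.0},
]
effect = house_money_ratio(days)
```

<p>A ratio above 1 indicates that wagers tend to be larger after winning days than after losing days, which is the signature of the house money effect.</p>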
<p>The central part of the investigation involved examining how this pattern changed based on individual traits. The models showed that intelligence played a moderating role. For individuals with higher IQ scores, the house money effect was weaker. While they still bet more after a win, the increase was less pronounced compared to those with lower IQ scores. A one standard deviation increase in IQ was associated with a nearly 4 percent reduction in the strength of the effect on bet size.</p>
<p>A similar relationship was found for the personality trait of conscientiousness, which relates to diligence, self-control, and dutifulness. Individuals who scored higher in conscientiousness were less susceptible to the house money effect. Like their high-IQ counterparts, they showed a smaller increase in their betting amounts and a less pronounced hastening of their return to betting after a win.</p>
<p>The personality trait of extraversion showed the opposite relationship. For individuals who scored higher in extraversion, a trait associated with sociability, assertiveness, and reward sensitivity, the house money effect was stronger. These bettors reacted more intensely to a prior day’s win, increasing their wagers by a larger margin than their more introverted peers. A one standard deviation increase in extraversion was linked to a more than 4 percent increase in the strength of the effect on bet size.</p>
<p>The researchers note some limitations to their study. The data on personality and intelligence were collected when the participants were young men serving as conscripts, a significant amount of time before their betting behavior was recorded. The fact that these decades-old measures still predicted behavior speaks to the stability of these traits, but the time gap could have diluted the observed effects. Also, because the psychological data came from the military, the primary sample consisted only of men within a specific age range, so the findings may not generalize to women or to men of other ages.</p>
<p>Future research could explore whether these patterns extend beyond gambling into other domains of decision-making. For example, a successful purchase or investment might create a similar feeling of being “ahead,” potentially influencing subsequent consumer or financial behavior. The findings suggest that psychological traits are not just abstract descriptors but have tangible connections to how people navigate financial risks and rewards in their daily lives. By using large, real-world datasets, this work provides a clearer picture of the interplay between our mind and our money.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.jrp.2025.104669" target="_blank" rel="noopener">Intelligence, conscientiousness and extraversion moderate the house money effect in real-life financial decision-making</a>,” was authored by Jussi Palomäki, Michael Laakasuo, Sari Castrén, Tuomo Kainulainen, Jani Saastamoinen, and Niko Suhonen.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/functional-imbalance-of-two-brain-networks-might-predict-cognitive-decline-in-alzheimers-disease/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Functional imbalance of two brain networks might predict cognitive decline in Alzheimer’s disease</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 22:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A neuroimaging study of individuals spanning the Alzheimer’s disease spectrum found that the attenuated anticorrelation (reduced negative association) between the activity of the default mode network and the dorsal attention network of the brain predicted cognitive decline. The association between this anticorrelation and cognitive decline did not depend on education level, a known protective factor against cognitive decline. The research was published in <a href="https://doi.org/10.1016/j.neuroimage.2025.121509"><em>NeuroImage</em></a>.</p>
<p>Alzheimer’s disease is a progressive neurodegenerative disorder that primarily affects memory, thinking, and behavior. It is the most common cause of dementia, accounting for up to 70% of all dementia cases. The disease is characterized by the accumulation of abnormal proteins in the brain—beta-amyloid plaques and tau tangles—that disrupt communication between neurons.</p>
<p>Over time, these accumulations lead to the death of brain cells, causing shrinkage of key regions involved in learning and memory, such as the hippocampus. One of the earliest symptoms is mild forgetfulness that gradually worsens into severe memory impairment. Cognitive decline in Alzheimer’s affects not only memory but also reasoning, problem-solving, language, and spatial orientation.</p>
<p>As the disease progresses, individuals become disoriented, have difficulty recognizing familiar people or places, and struggle with everyday tasks. Emotional and behavioral changes such as apathy, irritability, or anxiety often accompany cognitive deterioration. While age is the strongest risk factor, genetics, cardiovascular health, and lifestyle also play important roles.</p>
<p>Study author Diego-Martin Lombardo and his colleagues note that the accumulation of abnormal proteins starts many years before the first cognitive symptoms appear, meaning that Alzheimer’s disease has a long symptom-free period of development. This makes it important to develop ways to detect the disease at an early stage, before cognitive symptoms emerge.</p>
<p>The study authors hypothesized that one such early indicator of Alzheimer’s disease might be the anticorrelation between two important brain networks: the default mode network and the dorsal attention network. The default mode network, or DMN, is a brain network active during rest, daydreaming, and self-referential thought or introspection.</p>
<p>The dorsal attention network, or DAN, is a brain network engaged during goal-directed attention, focusing on external stimuli, and controlling voluntary attention. Typically, the activity of these two networks is anticorrelated, meaning that when the activity of one of them increases, the activity of the other decreases, and vice versa.</p>
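<p>In practice, this anticorrelation is simply a negative Pearson correlation between the two networks’ average signal time courses. A minimal sketch with invented toy signals (not the study’s fMRI pipeline):</p>

```python
# Illustrative sketch (not the study's analysis pipeline): DMN-DAN
# "anticorrelation" is a negative Pearson correlation between the two
# networks' average fMRI time courses. The toy signals below are invented.
import math

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy signals: when DMN activity rises, DAN activity tends to fall.
dmn = [0.9, 0.4, -0.2, -0.8, -0.3, 0.5, 0.8, 0.1]
dan = [-0.7, -0.3, 0.3, 0.9, 0.2, -0.4, -0.6, 0.0]

r = pearson(dmn, dan)  # strongly negative for these mirrored signals
```

<p>An r near −1 reflects strong anticorrelation; values drifting toward 0 correspond to the “attenuated” anticorrelation the study links to cognitive decline.</p>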
<p>The researchers analyzed functional magnetic resonance imaging (fMRI) data of 182 individuals from one of the datasets belonging to the Alzheimer’s Disease Neuroimaging Initiative, or ADNI. ADNI is a large, long-term research project that collects and shares clinical, imaging, genetic, and biomarker data to study the progression of Alzheimer’s disease and improve its early diagnosis and treatment. The participants’ average age was approximately 70 years, and 59% of them were men.</p>
<p>The authors classified participants into four groups based on whether they showed cognitive impairment and whether abnormally high levels of beta-amyloid protein deposits were detected in their brains. This differentiation allowed researchers to distinguish between different stages of cognitive decline and Alzheimer’s disease progression.</p>
<p>Results showed that the anticorrelation between DMN and DAN was weaker in the group that had both cognitive impairment and abnormal levels of amyloid plaque in their brains. The combination of high concentrations of these abnormal proteins with cognitive impairment indicates that these individuals are at high risk of Alzheimer’s disease, or that the disease has already been developing for some time.</p>
<p>Further analyses indicated that reduced DMN-DAN anticorrelation predicted cognitive decline even after taking into account sex, age, education, and the concentrations of tau proteins (another type of abnormal protein with high concentrations in the brains of individuals with Alzheimer’s disease).</p>
<p>Most interestingly, this association between DMN-DAN anticorrelation and cognitive decline did not depend on education level. Education level is generally an indicator of cognitive reserve—the brain’s ability to modify its pathways to cope with damage and age-related changes. Cognitive reserve is widely believed to be a protective factor against age-related cognitive decline.</p>
<p>“We demonstrate that the attenuation of the anticorrelation between DMN and DAN is associated with a mechanism of cognitive dysfunction independent of tau pathology and proxies of resilience to cognitive decline or cognitive reserve. Our results also suggest the existence of an alternative mechanism of neurocognitive breakdown independent of advanced medial temporal cortex pathology and protective factors of cognitive decline, such as cognitive reserve,” study authors concluded.</p>
<p>The study contributes to the scientific understanding of the neural bases of Alzheimer’s disease. However, the design of the study does not allow any causal inferences to be derived from the results. Therefore, it remains unknown whether the reduced DMN-DAN anticorrelation is one of the causes of cognitive decline seen in Alzheimer’s disease or is a consequence of this decline.</p>
<p>The paper, “<a href="https://doi.org/10.1016/j.neuroimage.2025.121509">The intrinsic connectivity between the Default Mode and Dorsal Attention networks is an independent fMRI biomarker of Alzheimer’s disease pathology burden,</a>” was authored by Diego-Martin Lombardo, Christian F. Beckmann, and the Alzheimer’s Disease Neuroimaging Initiative.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/beyond-transactions-what-new-psychology-research-reveals-about-true-friendship/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Beyond transactions: What new psychology research reveals about true friendship</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Despite how natural friendship can feel, people rarely stop to analyze it. How do you know when someone will make a good friend? When is it time to move on from a friendship? Oftentimes, people rely on gut intuitions to answer these kinds of questions.</p>
<p>In psychology research, there’s no universally accepted definition of a friend. Traditionally, when psychologists have analyzed friendship, it’s often been through the lens of exchange. How much did that friend do for me? How much did I do for them? The idea is that friendships are transactional, where friends stick around only as long as they are getting at least as much as they are giving in the friendship.</p>
<p>But this focus doesn’t capture what feels like the essence of friendship for many people. <a href="https://scholar.google.com/citations?user=YBPxHqkAAAAJ&hl=en&oi=ao">We</a> <a href="https://scholar.google.com/citations?user=8abR970AAAAJ&hl=en">and</a> our colleagues think another model for relationships – what we call risk-pooling – better matches what many people experience. In this kind of friendship, no one is keeping track of who did what for whom.</p>
<p>Our research over the past decade suggests that this kind of friendship was essential for our ancient ancestors to survive the challenges they encountered. And we feel it’s essential for surviving the challenges of life today, whether navigating personal struggles or dealing with natural disasters.</p>
<h2>A focus on what friends give you</h2>
<p>The traditional <a href="https://doi.org/10.1007/978-94-007-6772-0_3">social exchange theory of friendship</a> views relationships as transactions where people keep a tally of costs and benefits. Building on this framework, researchers have suggested that you approach each friendship with a running list of pluses and minuses to decide whether to maintain the bond. You keep friendships that provide more benefits than costs, and you end those that don’t.</p>
<p>The theory holds that this balancing act comes into play when making decisions about <a href="https://doi.org/10.1111/j.1467-9507.2008.00499.x">what kinds of friendships to pursue</a> and <a href="https://doi.org/10.1287/isre.2017.0737">how to treat your friends</a>. It’s even <a href="https://www.calm.com/blog/social-exchange-theory">made its way into pop psychology</a> self-help spaces.</p>
<p>We contend that the biggest issue with social exchange theory is that it misses the nuances of real-life relationships. Frankly, the theory’s wrong: People often don’t use this cost-to-benefit ratio in their friendships.</p>
<h2>Less accounting, more supporting</h2>
<p>Anybody who has seen a friend through tough times – or been the one who was supported – can tell you that keeping track of what a friend does for you isn’t what friendships are about. Friendships are more about companionship, enjoyment and bonding. Sometimes, friendship is about helping just because your friend is in need and you care about their well-being.</p>
<p>Social exchange theory would suggest that you’d be better off dropping someone who is going through cancer treatment or a death in the family because they’re not providing as many benefits to you as they could. But real-life experiences with these situations suggest the opposite: These are the times when many people are most <a href="https://doi.org/10.1177/0269216308098798">likely to support their friends</a>.</p>
<p>Our research is consistent with this intuition about the shortcomings of social exchange theory. When we surveyed people about <a href="https://doi.org/10.1177/02654075241243341">what they want in a friend</a>, they didn’t place a high value on having a friend who is conscientious about paying back any debts – something highly valued from a social exchange perspective.</p>
<p>People considered other traits – such as loyalty, reliability, respectfulness and being there in times of need – to be much more important. These qualities that relate to emotional commitment were seen as necessities, while paying back was seen as a luxury that mattered only once the emotional commitment was met.</p>
<p>Having friends who will help you when you’re struggling, work with you in the friendship and provide emotional support all <a href="https://doi.org/10.1016/j.paid.2023.112120">ranked higher in importance</a> than having a friend who pays you back. While they might not always be able to provide tangible benefits, friends can show they care in many other ways.</p>
<p>Of course, friendship <a href="https://doi.org/10.1023/A:1015227414929">isn’t always positive</a>. Some friends can take advantage by asking too much or neglecting responsibilities they could handle themselves. In those cases, it can be useful to step back and weigh the costs and benefits.</p>
<h2>Friendship is more than the sum of its parts</h2>
<p>But how do friendships actually help people survive? That is one question that we investigated as part of <a href="https://www.humangenerosity.org/">The Human Generosity Project</a>, a cross-disciplinary research collaboration.</p>
<p>The risk-pooling rather than exchange pattern of friendship is something that we found <a href="https://doi.org/10.1007/978-3-030-15800-2_4">across societies</a>, from “<a href="https://www.jstor.org/stable/646235">kere kere” in Fiji</a> to “<a href="https://doi.org/10.1017/ehs.2020.22">tomor marang” among the Ik in Uganda</a>. People help their friends in times of need without expecting to be paid back.</p>
<p>The Maasai, an Indigenous group in Kenya and Tanzania who rely on cattle herds to make their living, cultivate friends who help them when they are in need, with no expectation about <a href="https://doi.org/10.1007/s10745-010-9364-9">paying each other back</a>. People ask for help from these special friends, called osotua partners, only when they are in genuine need, and they give if they are asked and able.</p>
<p>These partnerships are not about everyday favors – rather, they are about surviving unpredictable, life-altering risks. Osotua relationships are built over a lifetime, passed down across generations and often marked with sacred rituals.</p>
<p>When we modeled how these osotua relationships function over time, we found they help people survive when their <a href="https://doi.org/10.1007/s10745-010-9364-9">environments are volatile</a> and when they ask those most <a href="https://doi.org/10.1016/j.evolhumbehav.2014.12.003">likely to be able to help</a>. These relationships lead to higher rates of survival for both partners compared to those built on <a href="https://doi.org/10.1007/s10745-010-9364-9">keeping track of debts</a>.</p>
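<p>A toy simulation can convey the logic of these models, though it is only a sketch with invented parameters, not the published work: two herders face unpredictable crashes, and a partner gives when asked and able, with no debt tracked.</p>

```python
# Toy risk-pooling ("osotua"-style) simulation, loosely inspired by the
# models described above. All parameters are invented for illustration.
import random

NEED = 30      # herd size below which a household fails that year
GROWTH = 1.1   # average annual herd growth
CRASH_P = 0.2  # chance of a disaster halving the herd

def year(herd, rng):
    herd *= GROWTH
    if rng.random() < CRASH_P:
        herd *= 0.5
    return herd

def run(pooling, years=30, start=60, seed=0):
    """Returns True if BOTH partners stay above NEED for all years."""
    rng = random.Random(seed)
    a, b = float(start), float(start)
    for _ in range(years):
        a, b = year(a, rng), year(b, rng)
        if pooling:
            # ask only in genuine need; give if asked and able; no debt kept
            if a < NEED and b - (NEED - a) >= NEED:
                b -= NEED - a; a = NEED
            elif b < NEED and a - (NEED - b) >= NEED:
                a -= NEED - b; b = NEED
        if a < NEED or b < NEED:
            return False
    return True

pooled = sum(run(True, seed=s) for s in range(1000))
alone  = sum(run(False, seed=s) for s in range(1000))
```

<p>Because transfers happen only when a partner is genuinely in need, pooling can rescue a pair that would otherwise fail, mirroring the survival advantage the published models report.</p>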
<p>These friends act as social insurance systems for each other, helping each other when needs arise because of unpredictable and uncontrollable events.</p>
<p>And we see this in the United States, just as we do in smaller-scale, more remote societies. In one study, we focused on ranchers in southern Arizona and New Mexico embedded in a network of what they call “neighboring.” They don’t expect to be paid back when they <a href="https://doi.org/10.1007/s12110-021-09406-8">help their neighbors with unpredictable challenges</a> such as an accident, injury or illness. We also found <a href="https://doi.org/10.1016/j.cresp.2023.100095">this same pattern</a> in an online study of U.S.-based participants.</p>
<p>In contrast, people such as the ranchers we studied are more likely to expect to be paid back for help when needs arise because of more predictable challenges such as <a href="https://doi.org/10.1007/s12110-021-09406-8">branding cattle</a> or <a href="https://doi.org/10.1016/j.cresp.2023.100095">paying bills</a>.</p>
<h2>Catastrophic insurance, not tit for tat</h2>
<p>What all this research suggests is that friendship is less about the exchange of favors and more about being there for each other when unforeseeable disaster strikes. Friendship seems more like an insurance plan designed to kick in when you need it most rather than a system of balanced exchange.</p>
<p>What lets these partnerships endure is not only generosity, but also restraint and responsibility: Maasai expect their osotua partners to take care of themselves when they can and to ask only when help is truly needed. That balance of care, respect and self-management offers a useful model.</p>
<p>In a world of growing uncertainty, cultivating risk-pooling friendships and striving to be a good partner yourself may help you build resilience. Our ancestors survived with the help of these kinds of relationships; our future may depend on them too.<img decoding="async" src="https://counter.theconversation.com/content/258549/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1"></p>
<p> </p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/friendships-arent-just-about-keeping-score-new-psychology-research-looks-at-why-we-help-our-friends-when-they-need-it-258549">original article</a>.</em></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/researchers-uncover-complex-genetic-ties-between-adhd-and-morning-cortisol/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Researchers uncover complex genetic ties between ADHD and morning cortisol</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study suggests the connection between Attention-Deficit/Hyperactivity Disorder (ADHD) and the body’s primary stress hormone, cortisol, is not one of direct cause and effect. Instead, the two appear to share a complex genetic foundation, pointing toward common biological pathways that regulate arousal and behavior. The research, published in the journal <em><a href="https://doi.org/10.1016/j.psyneuen.2025.107587" target="_blank">Psychoneuroendocrinology</a></em>, helps refine scientific understanding of the physiological underpinnings of ADHD.</p>
<p>Attention-Deficit/Hyperactivity Disorder is a common neurodevelopmental condition characterized by persistent patterns of inattention, impulsivity, and hyperactivity. For some time, researchers have investigated its relationship with the body’s stress response system, formally known as the hypothalamic-pituitary-adrenal axis. This system’s main product is cortisol, a hormone that follows a natural daily rhythm, typically peaking in the morning to promote wakefulness and declining throughout the day.</p>
<p>Because this system also governs general arousal levels, which are thought to be regulated differently in individuals with ADHD, scientists have explored whether cortisol levels might be linked to the condition. Some previous analyses of multiple studies indicated that people with ADHD, particularly youths, tend to have lower baseline levels of morning cortisol. </p>
<p>However, results across individual studies have been inconsistent, leaving the nature of this relationship ambiguous. It was not clear if cortisol differences contributed to ADHD, if ADHD itself altered cortisol regulation, or if another factor linked them both.</p>
<p>To investigate this question, a team of researchers from institutions in Brazil and Denmark turned to genetics. Led by scientists at the University of São Paulo and the Federal University of Rio Grande do Sul, the group used large-scale genetic data to dissect the biological connection between ADHD and morning cortisol levels. Their approach was designed to move beyond simple correlation and test for causal links and shared genetic architecture.</p>
<p>The researchers first employed a powerful statistical method known as Mendelian Randomization. This technique uses genetic variants as a proxy for an exposure, in this case, morning cortisol levels. Because a person’s genes are randomly assigned at conception and generally remain fixed, this method can help determine if an exposure causes an outcome without the confounding influence of environmental or lifestyle factors. The analysis drew on genetic data from two massive datasets: one for ADHD that included over 225,000 people and one for morning cortisol levels from over 25,000 people.</p>
<p>After applying seven different Mendelian Randomization models, the team found no evidence of a causal relationship in either direction. Genetically predicted morning cortisol levels did not appear to increase the likelihood of having ADHD. Similarly, a genetic predisposition for ADHD did not seem to cause alterations in morning cortisol levels. This result suggested that a simple, direct causal link was unlikely and that a more complex connection might exist.</p>
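<p>One widely used MR model of the kind the team applied is the inverse-variance-weighted (IVW) estimator, which combines per-variant ratio estimates. A hedged sketch, with all numbers invented for illustration:</p>

```python
# Hedged sketch of the inverse-variance-weighted (IVW) Mendelian
# randomization estimator, one common model of the kind applied in the
# study. Effect sizes below are made up; real analyses use per-variant
# estimates from genome-wide association studies.

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Each variant gives a ratio estimate beta_outcome / beta_exposure;
    IVW averages them, weighting by the precision of the outcome effect."""
    weights = [be ** 2 / se ** 2 for be, se in zip(beta_exposure, se_outcome)]
    ratios = [bo / be for be, bo in zip(beta_exposure, beta_outcome)]
    return sum(w * r for w, r in zip(weights, ratios)) / sum(weights)

# Toy variants: effects on cortisol (exposure) and on ADHD (outcome)
beta_cortisol = [0.10, 0.15, 0.08, 0.12]
beta_adhd     = [0.001, -0.002, 0.0005, 0.0015]
se_adhd       = [0.01, 0.012, 0.009, 0.011]

estimate = ivw_estimate(beta_cortisol, beta_adhd, se_adhd)
```

<p>An estimate near zero relative to its uncertainty, as in this toy example, is the pattern consistent with the study’s finding of no causal effect of cortisol on ADHD.</p>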
<p>Next, the team searched for evidence of pleiotropy, a phenomenon where the same genes influence multiple, seemingly unrelated traits. They started by conducting a global genetic correlation analysis, which scans the entire genome to see if the genetic factors for ADHD and cortisol overlap on a broad scale. This analysis did not find a significant overall correlation, indicating that on a genome-wide level, the two traits do not share substantial genetic influences.</p>
<p>The investigation then shifted to a more fine-grained approach. Using a technique called Local Analysis of Variant Association, the researchers zoomed in on 2,495 specific segments of the genome to search for localized regions of genetic overlap. Here, they made a key discovery. They identified two distinct genomic regions, one on chromosome 5 and another on chromosome 22, where the genetics of ADHD and morning cortisol were significantly linked.</p>
<p>The nature of these links was bidirectional. The region on chromosome 5 showed a negative correlation, meaning that genetic variants associated with a higher likelihood of ADHD were also associated with lower morning cortisol levels. </p>
<p>In contrast, the region on chromosome 22 showed a positive correlation, where genetic variants linked to a higher likelihood of ADHD were tied to higher morning cortisol levels. This mixture of positive and negative associations in different parts of the genome helps explain why the global analysis found no overall effect.</p>
<p>The genes located within these two regions, such as RASGRF2 and TRIOBP, have previously been implicated in a range of psychiatric conditions, cognitive functions, and risk-taking behaviors. An additional analysis designed to pinpoint single genetic markers associated with both traits identified one variant located within a gene called ZNF652-AS1. This marker has also been connected in other research to behaviors related to impulsivity.</p>
<p>To see how these genetic predispositions play out in people, the researchers conducted a final analysis in an independent group of 1,660 Brazilian adults, some with ADHD and some without. For each person, they calculated a polygenic score, which summarizes an individual’s genetic tendency toward higher or lower morning cortisol. The team then tested if this score was associated with having an ADHD diagnosis or other co-occurring psychiatric conditions.</p>
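<p>A polygenic score itself is conceptually simple: a weighted sum of a person’s risk-allele counts, with weights taken from a discovery genome-wide association study. A minimal sketch with invented variants and weights:</p>

```python
# Illustrative polygenic score: a weighted sum of allele counts across
# variants, with weights from a discovery GWAS. The variant IDs and
# weights below are invented for this example.

def polygenic_score(genotypes, weights):
    """genotypes: allele counts (0, 1, or 2) per variant for one person;
    weights: per-variant effect sizes on the trait (here, morning cortisol)."""
    return sum(weights[snp] * count for snp, count in genotypes.items())

weights = {"rs0001": 0.08, "rs0002": -0.03, "rs0003": 0.05}
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

score = polygenic_score(person, weights)
```

<p>In practice, scores are computed over many thousands of variants and standardized within the sample before being tested against diagnoses, as in the study’s Brazilian cohort.</p>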
<p>The cortisol polygenic score was not directly associated with having an ADHD diagnosis on its own. It was, however, associated with the presence of other conditions known as externalizing disorders. A higher genetic predisposition for morning cortisol was linked to a greater likelihood of having a substance use disorder, oppositional defiant disorder, or antisocial personality disorder. </p>
<p>When the researchers statistically accounted for the presence of these co-occurring conditions, a new pattern emerged: a lower cortisol polygenic score was then associated with ADHD. This finding suggests the link between cortisol genetics and ADHD may be partly masked or mediated by related behavioral conditions.</p>
<p>Taken together, the study’s results argue against a direct causal pathway between cortisol and ADHD. Instead, they support a model of localized pleiotropy, where specific sets of genes influence both the body’s arousal systems and the behaviors characteristic of ADHD. The researchers propose that these findings align with an “inverted U-shaped” model of physiological regulation. </p>
<p>In this view, optimal functioning occurs at a moderate level of arousal, and deviations in either direction, resulting in either too low or too high cortisol, could be associated with ADHD-related traits. This perspective reframes ADHD not just as a disorder of attention, but as a condition involving broader systemic dysregulation.</p>
<p>The study has some limitations. The large genetic datasets used for the initial analyses were drawn predominantly from individuals of European ancestry, which may limit the generalizability of the findings to other populations. The polygenic score analysis in the Brazilian sample highlighted this, as the associations were primarily detected in the subgroup of participants with greater European ancestry. The clinical sample for the polygenic score analysis was also modest in size.</p>
<p>Future research could build on these findings by examining these genetic links in larger and more diverse populations. Integrating genetic data with direct, repeated measurements of cortisol levels in individuals over time would also help to clarify how these shared genetic factors influence the body’s hormonal dynamics from day to day. Such work would continue to illuminate the complex biological interplay that contributes to ADHD.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.psyneuen.2025.107587" target="_blank">Shared biological pathways linking ADHD and cortisol variability are related to externalizing behaviors</a>,” was authored by João K.N. Ramos, Eugenio H. Grevet, Iago Junger-Santos, Nicolas P. Ciochetti, Cibele E. Bandeira, Maria E. de Araujo Tavares, Victor F. de Oliveira, Eduardo S. Vitola, Luis A. Rohde, Rodrigo Grassi-Oliveira, Bruna S. da Silva, Claiton H. Dotto Bau, and Diego L. Rovaris.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/toxic-masculinity-indirectly-lowers-help-seeking-behavior-by-encouraging-men-to-bottle-up-emotions/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Toxic masculinity indirectly lowers help-seeking behavior by encouraging men to bottle up emotions</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in the journal <em><a href="https://doi.org/10.1016/j.paid.2025.113459" target="_blank" rel="noopener">Personality and Individual Differences</a></em> provides evidence that toxic masculinity may contribute to men’s reluctance to express emotions, which in turn reduces their likelihood of seeking mental health support. The research suggests that while toxic masculinity does not directly predict whether men seek help for emotional or suicidal problems, it is linked to a pattern of emotional restriction that does. The findings may help inform efforts to improve mental health service access for men, who remain underrepresented in treatment settings despite elevated suicide risks.</p>
<p>Toxic masculinity is a term used to describe a set of cultural beliefs that exaggerate traditional male gender norms, particularly those that emphasize toughness, emotional suppression, dominance over others, and a rejection of anything perceived as feminine. Unlike general masculinity, which can include neutral or even positive traits such as independence or responsibility, toxic masculinity refers to traits that can perpetuate harm.</p>
<p>The research was conducted by Eva Horton, Nathaniel Schermerhorn, and Paul Hanel at the University of Essex and Essex Partnership University Trust. Their study aimed to clarify how specific attitudes associated with toxic masculinity—such as aggression, dominance, and avoidance of emotional vulnerability—contribute to barriers in men’s mental health help-seeking behavior.</p>
<p>The researchers set out to explore whether toxic masculinity fosters emotional restriction among men, and whether this restriction is what prevents many from seeking professional help for emotional distress. Their motivation stemmed from ongoing public health concerns in the United Kingdom, where suicide rates among men are significantly higher than among women and mental health services remain underutilized by male populations.</p>
<p>“This study was motivated by an underrepresentation of ‘toxic masculinity’ in academia, which did not reflect the significant use of the term in the media. The research additionally addresses an ongoing real-world problem, of higher death by suicides in men and lower access to mental health services in male populations,” the researchers told PsyPost.</p>
<p>The research included two studies. The first involved 220 participants—66 men, 152 women, and 2 individuals identifying as other genders—ranging in age from 18 to 83. Participants completed a battery of psychological scales measuring their endorsement of toxic masculinity and related traits such as aggression, dominance, and avoidance of femininity. The researchers also measured “restrictive emotionality,” or the belief that men should not express emotional vulnerability.</p>
<p>Toxic masculinity was assessed using specific subscales from established questionnaires that focused on beliefs about dominance over women, winning at all costs, and the rejection of non-heterosexual identities. Restrictive emotionality was measured with items like “A man should never admit when others hurt his feelings.” The survey also included measures of hostile sexism, traditional values, self-reliance, and emotional control.</p>
<p>Study 2 focused specifically on men, recruiting 264 adult male participants from across the United Kingdom. These men completed similar measures and were also asked about their likelihood of seeking help for personal emotional problems or suicidal thoughts. This allowed the researchers to examine both direct and indirect associations between toxic masculinity, emotional restriction, and help-seeking behavior.</p>
<p>In both studies, toxic masculinity was strongly associated with restrictive emotionality. Men who endorsed traditional masculine norms centered on dominance and aggression were more likely to believe that showing emotion is a sign of weakness. These individuals also tended to report high levels of self-reliance and emotional control, traits that are often socially encouraged in men but can be detrimental to psychological wellbeing when taken to extremes.</p>
<p>However, when it came to predicting whether someone would seek help for emotional difficulties or suicidal ideation, toxic masculinity was not directly related to help-seeking intentions. Instead, the researchers found that restrictive emotionality was the stronger predictor. In Study 2, men who believed they should conceal their emotions were significantly less likely to say they would seek help from mental health professionals, friends, or intimate partners.</p>
<p>Mediation analyses suggested that toxic masculinity contributes to reduced help-seeking by increasing emotional restriction. In other words, the cultural scripts tied to toxic masculinity may not directly discourage seeking help but foster beliefs about emotional toughness that do.</p>
<p>“High toxic masculinity, a term which highlights traits of aggression and dominance, revealed higher likelihood to restrict external emotional displays (restricted emotionality),” the researchers explained. “In turn this restricted emotionality showed decreased likelihood of accessing mental health support. Whilst toxic masculinity and mental health seeking support was not directly related, one can still hold these findings as important considerations when discussing mental health in male populations.”</p>
<p>“The associations between toxic masculinity and restricted emotionality were substantial in both of our studies, suggesting that many – but not all – men who score high in toxic masculinity are also more likely to mask any signs of feeling anxious or rejected, for example.”</p>
<p>The researchers also examined whether participants would turn to specific sources of support. Interestingly, men high in toxic masculinity were less likely to seek help from friends or romantic partners but were more willing to turn to religious leaders. This could reflect a belief that certain contexts or authority figures offer socially acceptable outlets for emotional vulnerability, particularly within traditional frameworks.</p>
<p>As with all research, there are limitations to consider. The research was cross-sectional, meaning that it captured correlations between variables at one point in time and cannot determine cause-and-effect relationships. Longitudinal studies will be necessary to establish whether toxic masculinity and emotional restriction actively lead to decreased help-seeking behavior over time.</p>
<p>Another limitation is cultural. Because the research was conducted in the United Kingdom, its findings may not fully generalize to other cultural contexts. Social norms around masculinity vary widely across countries and communities, and what constitutes “toxic” traits in one society may be interpreted differently elsewhere.</p>
<p>The findings have practical relevance for those designing mental health outreach and services for men. Rather than framing emotional vulnerability as a departure from masculinity, programs might aim to reframe emotional openness as a strength. Interventions could also be designed to specifically address restrictive emotionality, which appears to be a more immediate barrier to help-seeking than toxic masculinity itself.</p>
<p>The study’s authors caution that not all men who endorse masculine norms are at risk. Their data show substantial variability in the extent to which men adhere to these beliefs. Still, the consistent association between toxic masculinity and emotional restriction suggests that changing public discourse around masculinity could have long-term benefits for men’s mental health.</p>
<p>“Our goals for the research is to aid in the study of men’s mental health seeking support to highlight potential needs for adaptations in mental health education and treatment in men,” the researchers said.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.paid.2025.113459" target="_blank" rel="noopener">The impact of toxic masculinity on restrictive emotionality and mental health seeking support</a>,” was authored by Eva Horton, Nathaniel Schermerhorn, and Paul Hanel.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/feeling-grateful-fosters-cooperation-by-synchronizing-brain-activity-between-partners/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Feeling grateful fosters cooperation by synchronizing brain activity between partners</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study reports that the emotion of gratitude can foster widespread cooperation between individuals, even when it involves a personal cost. The research also reveals that this enhanced cooperation is linked to increased synchronization of brain activity between partners, an effect that appears to grow over time. These results, published in <em><a href="https://doi.org/10.1093/scan/nsaf023" target="_blank">Social Cognitive and Affective Neuroscience</a></em>, help explain the powerful social function of gratitude in human relationships.</p>
<p>Gratitude is often understood as a positive feeling that arises when one benefits from the kindness of another person. Beyond this personal experience, many researchers consider it an emotion that evolved to help people manage social relationships and work together effectively. It is thought to motivate individuals to reciprocate kind actions and navigate the challenges of cooperation. </p>
<p>A team of researchers from several institutions in Shanghai, China, led by Yangzhuo Li, Junlong Luo, and Xianchun Li, sought to expand on this idea. They noted that past research often examined gratitude in the context of simple, one-time exchanges. The team wanted to explore whether gratitude’s influence extends to different kinds of cooperation and how it might shape interactions as they unfold over time.</p>
<p>To do this, they designed experiments to test cooperation in two distinct scenarios. One scenario involved a “costly” decision, where cooperating with a partner meant giving up a chance for a larger personal gain. The other was a “costless” scenario that depended on precise behavioral coordination, where success was a shared outcome with no temptation to betray the other person. </p>
<p>To observe the neural underpinnings of these interactions, the researchers used a technology called functional near-infrared spectroscopy. This non-invasive method involves wearing a cap with sensors to measure blood flow changes in the brain, allowing the team to track brain activity in both participants simultaneously and look for patterns of inter-brain synchronization, a state where two individuals’ brain activity becomes coupled during an interaction.</p>
<p>The study recruited 93 pairs of female university students who were strangers to one another. Using only female participants helped control for potential gender-based differences in cooperation and emotional expression that have been documented in other studies. The pairs were randomly assigned to one of three groups: gratitude, joy, or neutral. Each pair first participated in an activity designed to induce the target emotion. </p>
<p>In this task, participants allocated a small amount of money over two rounds. In the gratitude group, a participant was led to believe their partner had generously given them a larger share in the second round, accompanied by a kind message. In the joy group, a computer program provided the same monetary benefit, creating a positive feeling without attributing it to the partner. In the neutral group, the computer allocated the money evenly.</p>
<p>After the emotion induction, the pairs played two different cooperation games while their brain activity was recorded. The first game was a version of the Prisoner’s Dilemma, which models costly cooperation. In each of 30 trials, split into two blocks, participants had to privately choose to either “cooperate” with or “defect” against their partner. Cooperating was best for the pair jointly, but defecting offered a larger individual payoff if the partner cooperated. </p>
<p>After the first block of trials, participants received feedback that was manipulated to show that their partner had defected slightly more often than they had. This created a test of whether grateful participants would be more forgiving of a partner’s apparent selfishness.</p>
<p>The behavioral results from this game showed a clear effect of gratitude. Overall, pairs in the gratitude group achieved mutual cooperation more often and had fewer instances of mutual defection compared to the joy and neutral groups. The dynamic nature of this effect became apparent after the manipulated feedback was given. In the second block of trials, participants in the joy and neutral groups became less cooperative. </p>
<p>In contrast, the cooperation rate in the gratitude group remained stable. This suggests that the feeling of gratitude made participants more resilient to their partner’s slight defection and more willing to continue a cooperative relationship. Psychological questionnaires confirmed that participants in the gratitude group reported higher levels of gratitude and trust toward their partner, feelings that were correlated with higher rates of mutual cooperation.</p>
<p>The second game was a button-press task designed to measure costless cooperation through action coordination. In this game, the two participants had to press a button at the exact same time to score points together. Success depended on their ability to synchronize their actions based on feedback from previous trials. </p>
<p>Here again, gratitude appeared to promote better performance. The gratitude group showed a higher rate of effective adjustment, meaning they were more successful at using feedback to coordinate their timing with their partner. Their overall success rate at winning points also improved significantly from the first block of trials to the second. While the joy group also showed improvement over time, the gratitude group’s performance was generally higher, especially in the second block.</p>
<p>The brain imaging data offered a potential explanation for these behavioral patterns. During the Prisoner’s Dilemma game, the gratitude group showed higher inter-brain synchronization than the joy group in a brain region known as the right middle frontal gyrus, particularly during the second block of trials after the negative feedback. This region is associated with understanding others’ perspectives and adapting one’s behavior in social contexts. The effect was even more pronounced during the button-press coordination game. </p>
<p>The gratitude group exhibited significantly higher brain synchronization in several areas, including the left and right middle frontal gyrus and the right sensorimotor cortex, a region involved in planning and executing physical actions.</p>
<p>A dynamic pattern also emerged in the brain data. For the gratitude group playing the coordination game, inter-brain synchronization in several key areas, including the frontal and temporal regions of the brain, increased from the first block to the second. This progressive neural coupling mirrored their improving behavioral performance. </p>
<p>This connection between brain and behavior was also evident in the finding that higher synchronization in the right middle frontal gyrus was associated with a greater rate of effective adjustment in the button-press game. The brain synchronization was confirmed to be a result of the live interaction, as it was significantly higher in real pairs compared to randomly generated “pseudo-pairs” of participants who did not interact.</p>
<p>The researchers note some limitations of their work. The study involved only young female adults, so the findings may not apply to men, mixed-gender pairs, or people of different ages. The experiments were also conducted using computer-based tasks in a controlled laboratory setting, which may not fully capture the complexity of cooperation in real-world scenarios. </p>
<p>Future research could explore these dynamics in different demographic groups and more naturalistic settings. It could also examine the opposite of gratitude to understand how ingratitude might disrupt cooperative relationships.</p>
<p>The study, “<a href="https://doi.org/10.1093/scan/nsaf023" target="_blank">Gratitude enhances widespread dynamic cooperation and inter-brain synchronization in females</a>,” was authored by Yangzhuo Li, Xinyu Cheng, Wanqiu Na, Junlong Luo, and Xianchun Li.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/fascinating-new-research-turns-the-trophy-wife-trope-on-its-head/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Fascinating new research turns the “trophy wife” trope on its head</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research published in <em><a href="https://doi.org/10.1016/j.ehb.2025.101543" target="_blank">Economics &amp; Human Biology</a></em> provides evidence that the relationship between spousal income and physical fitness changes after a couple marries. The findings suggest that while men prioritize physical attractiveness in a partner more than women do when initially selecting a spouse, this dynamic shifts to a symmetrical exchange during the marriage itself. As one partner contributes more to the household income, the other partner tends to decrease their Body Mass Index and increase physical activity, regardless of gender.</p>
<p>Social scientists have studied the concept of “beauty-status exchange” for decades. This theoretical model proposes that in the dating and marriage market, individuals often trade one desirable trait for another. A classic example involves a partner with high socioeconomic status, often defined by income or wealth, pairing with a partner who exhibits high physical attractiveness.</p>
<p>Historical data and evolutionary psychology perspectives often framed this exchange as gendered. These theories posited that men place a higher premium on the physical appearance of potential mates, while women place a higher value on the ability of a mate to provide resources. This dynamic is frequently observed in the initial formation of a marriage or partnership.</p>
<p>Most prior research has focused on this static moment of match formation. Relatively little attention has been paid to how these exchanges evolve as a marriage progresses over time. Marriages are not static agreements but dynamic partnerships that function over many years.</p>
<p>Economic theories of marriage suggest that relationships require a continuous equilibrium to remain stable. If one partner’s contribution to the marriage changes, such as receiving a significant promotion or inheritance, the perceived value of the relationship shifts. This change can alter the balance of power or the perceived equity within the union.</p>
<p>To maintain stability, the other partner might subconsciously or consciously attempt to rebalance the relationship. If one spouse brings more financial status to the table, the other might compensate by enhancing other desirable traits, such as physical fitness. The author of the current study aimed to test whether this dynamic rebalancing actually occurs.</p>
<p>“In many countries, women now earn as much as – or more than – their husbands, and research shows that these shifts can reshape expectations and relationship dynamics,” explained <a href="https://researchportal.bath.ac.uk/en/persons/joanna-syrda/" target="_blank">Joanna Syrda</a>, a lecturer at the University of Bath and author of the study. “At the same time, men are investing in their appearance more than ever, from fitness influencers to skincare routines and open conversations about body image. Bringing these two trends together, I began to wonder how the old idea of a ‘beauty-status exchange’ might be evolving, and that question ultimately inspired this project.”</p>
<p>For her study, Syrda utilized data from the Panel Study of Income Dynamics (PSID), a long-running longitudinal survey of families in the United States. The analysis focused on a sample of 3,744 heterosexual dual-earner couples. The data spanned the years 1999 through 2019.</p>
<p>The study relied on Body Mass Index (BMI) as a proxy for physical attractiveness and fitness. The PSID collects self-reported height and weight data, which the researcher used to calculate BMI for both husbands and wives. While BMI is an imperfect measure of attractiveness, previous studies indicate it correlates strongly with societal standards of physical appeal and health.</p>
<p>The primary independent variable was relative income. This measure captures the proportion of the total household labor income contributed by the wife. By focusing on relative rather than absolute income, the study could isolate the exchange dynamic between spouses.</p>
<p>The researcher employed two distinct analytical approaches. The first examined couples at the point of marriage selection to test for the traditional, static beauty-status exchange. The second approach used fixed-effects models to track the same couples over time, observing how changes in income related to changes in BMI.</p>
<p>The results regarding marriage selection aligned with traditional gender roles. At the time of marriage, a woman’s BMI was negatively associated with her husband’s relative income. This means that men who earned a larger share of the income tended to marry women with lower BMIs.</p>
<p>However, this pattern did not hold in reverse. A woman’s relative income appeared to have no significant statistical association with her husband’s BMI at the start of the marriage. This finding supports the existence of an asymmetrical, gendered exchange when couples are first matched.</p>
<p>The results differed substantially when the researcher analyzed the ongoing dynamics within the marriage. The asymmetry disappeared. During the marriage, the exchange of status for beauty became symmetrical.</p>
<p>The data indicated that an increase in one spouse’s relative income was associated with a decrease in the other spouse’s BMI. If a husband started earning a larger share of the household income, the wife tended to lower her BMI. Conversely, if a wife’s relative income increased, the husband tended to lower his BMI.</p>
<p>This effect extended to the risk of obesity. As a wife’s contribution to the household income rose, the husband’s probability of being overweight or obese declined. The study suggests that the partner with lower relative earnings compensates by maintaining or improving their physical appearance.</p>
<p>To understand how these physical changes occurred, the study also analyzed data on physical activity. The analysis revealed that shifts in relative income were linked to behavioral adjustments. When one spouse’s relative income increased, the other spouse reported higher frequencies of physical activity.</p>
<p>This finding implies that the changes in BMI are likely the result of purposeful effort rather than stress or incidental factors. The partner earning relatively less appears to invest more time and energy into fitness. This behavior aligns with the theory that spouses engage in compensatory actions to maintain their value in the relationship.</p>
<p>Syrda also explored whether education levels influenced these patterns, which proved most pronounced among highly educated women. For college-educated wives, the link between earning a higher relative income and having a higher BMI was steeper than for those with less education.</p>
<p>This may reflect the high time costs associated with high-paying careers. For women in demanding professions, earning a higher income may leave less time for the maintenance of physical appearance. The opportunity cost of exercise and diet management becomes higher as their wages rise.</p>
<p>Conversely, the pressure for husbands to lose weight when their wives earn more was weaker among college-educated men. The negative association between a wife’s rising income and a husband’s obesity risk was attenuated or reversed for men with college degrees.</p>
<p>This suggests that high-skill men might respond to a loss of relative financial status differently. Instead of focusing on their appearance, they might prioritize their own careers to regain financial standing. Alternatively, they may derive status from other sources that insulate them from the pressure to improve their physical attractiveness.</p>
<p>“The beauty-status exchange has long been described as a gendered bargain – men offering status, women offering attractiveness, the classic ‘trophy wife’ idea,” Syrda told PsyPost. “But my research shows that for heterosexual couples this bargain doesn’t end at the wedding. It continues throughout the marriage, and both partners take part. When a wife’s share of income rises, her husband slims down; when a husband earns more, she does. The exchange lives on – but in a more equal, modern form that reflects women’s rising economic power.”</p>
<p>“The study also shows how this happens: an increase in one partner’s relative income predicts more physical activity in the other partner, pointing to intentional behaviour changes rather than random weight fluctuations.”</p>
<p>As with all research, the study has some limitations. The use of BMI as a primary measure of attractiveness is a simplification. Physical beauty is a multifaceted construct that includes many traits beyond weight-to-height ratios.</p>
<p>Additionally, the survey data relies on self-reported height and weight. While generally accurate, self-reports can introduce bias. The survey is collected every two years, which means immediate reactions to income changes might be missed.</p>
<p>Finally, the study focuses on heterosexual couples in the United States. The dynamics of beauty and status exchange may operate differently in same-sex couples or in cultures with different gender norms.</p>
<p>Despite these limitations, the research provides evidence that the economics of marriage are not set in stone at the wedding. The exchange of assets, whether financial or physical, appears to be a continuous negotiation. Both men and women appear to respond to shifts in their partner’s economic power by adjusting their own physical investments.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.ehb.2025.101543" target="_blank">(A)symmetries in beauty-status exchange: Spousal relative income and partners’ BMI (at) during marriage</a>,” was authored by Joanna Syrda.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/creatine-supplement-may-enhance-brain-function-during-menopause-new-research-suggests/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Creatine supplement may enhance brain function during menopause, new research suggests</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study suggests that specific low-dose formulations of creatine, a popular dietary supplement, may improve some aspects of cognitive function and elevate the compound’s levels within the brain in women navigating the menopausal transition. The research, published in the <em><a href="https://doi.org/10.1080/27697061.2025.2551184" target="_blank">Journal of the American Nutrition Association</a></em>, found that an eight-week supplementation regimen was associated with faster reaction times, improved lipid profiles, and a potential reduction in mood swings for perimenopausal and menopausal women.</p>
<p>The transition to menopause is characterized by significant hormonal shifts, most notably a decline in estrogen. These changes can affect multiple systems in the body, including the brain. Because estrogen helps regulate brain blood flow, energy production within cells, and the health of neurons, its decline can contribute to cognitive symptoms often described as “brain fog,” which includes memory lapses and reduced attention. This period can also coincide with structural changes in the brain and a decrease in its ability to efficiently use glucose, its primary fuel.</p>
<p>Creatine is a naturally occurring compound that plays a fundamental part in cellular energy management. It helps recycle the body’s main energy molecule, especially in tissues with high energy demands like muscle and brain tissue. Researchers have theorized that supplementing with creatine could help counteract some of the brain-related energy deficits associated with menopause. </p>
<p>The new study was conducted by a team of researchers from the University of Novi Sad in Serbia, led by Sergej M. Ostojic, to investigate whether low-dose versions of creatine, specifically creatine hydrochloride and creatine ethyl ester, could offer benefits without the need for high-dose regimens. These forms are believed to have enhanced solubility, which might improve their absorption.</p>
<p>The investigation was designed as a randomized, double-blind, placebo-controlled trial, a structure considered a high standard in clinical research. This design means that neither the participants nor the researchers knew who was receiving the active supplement or an inactive placebo, reducing the potential for bias. A total of 36 healthy perimenopausal and menopausal women between the ages of 40 and 60 participated in the study for eight weeks.</p>
<p>Participants were randomly assigned to one of four groups. One group received a low dose of creatine hydrochloride (750 milligrams per day), a second received a medium dose (1,500 milligrams per day), and a third group took a combination of creatine hydrochloride and creatine ethyl ester (800 milligrams per day). The fourth group received a placebo made of non-starch polysaccharides. </p>
<p>Before and after the eight-week period, the researchers assessed participants using a variety of measures, including computerized tests of cognitive function, questionnaires about menopause symptoms and fatigue, and blood tests. A subset of 16 participants also underwent a specialized brain imaging technique to measure creatine concentrations in different brain regions.</p>
<p>After eight weeks, the groups taking creatine supplements showed improvements in several cognitive areas compared to their baseline measurements. Participants in the medium-dose creatine hydrochloride group exhibited a significant improvement in reaction time. The low-dose group showed improvements in alertness, executive control, and performance on a test measuring information processing speed. The combination supplement was also associated with better alertness and faster reaction times.</p>
<p>One of the study’s key observations came from the brain imaging analysis. All three groups that received a creatine supplement showed a significant rise in total creatine levels within the brain. These increases were particularly evident in the frontal regions, areas associated with mood, cognition, and memory. In contrast, the placebo group experienced a slight decrease in brain creatine levels in some of these same regions over the eight-week period.</p>
<p>The study also evaluated clinical symptoms related to menopause. Women taking the medium-dose creatine hydrochloride supplement reported a significant reduction in general fatigue and concentration difficulties. This group also showed a trend toward a reduction in the severity of mood swings. The group taking the combination of creatine hydrochloride and creatine ethyl ester reported a significant decrease in anxiety.</p>
<p>Analysis of blood samples revealed no significant changes in hormone levels or markers of inflammation across any of the groups. However, the researchers did observe favorable changes in blood lipids. The group receiving the combination supplement showed a significant reduction in both low-density lipoprotein cholesterol, often called “bad” cholesterol, and triglyceride levels. </p>
<p>The interventions were reported to be safe and well tolerated, with only minor side effects like transient heartburn noted in a few participants. Importantly, no group experienced weight gain, a common concern associated with creatine supplementation.</p>
<p>The researchers acknowledge some limitations to their study. The sample size was relatively small, which means the findings should be considered preliminary and require confirmation in larger populations. The eight-week duration may also be too short to detect longer-term effects on brain health or bone density. The study did not separate the results for perimenopausal and postmenopausal women, which could obscure differences between these distinct phases.</p>
<p>Additionally, the study relied on participants’ self-reports for symptoms, which can be subjective, and did not monitor diet or physical activity levels, which could have influenced the outcomes. The lead author, Sergej M. Ostojic, has existing connections to the dietary supplement industry, including serving on an advisory board and co-owning patents related to creatine. The supplements for the trial were supplied at no cost by a commercial entity. Future research should involve larger and more diverse groups of women over longer periods, with objective monitoring of lifestyle factors, to build upon these initial findings.</p>
<p>The study, “<a href="https://doi.org/10.1080/27697061.2025.2551184" target="_blank">The Effects of 8-Week Creatine Hydrochloride and Creatine Ethyl Ester Supplementation on Cognition, Clinical Outcomes, and Brain Creatine Levels in Perimenopausal and Menopausal Women (CONCRET-MENOPA): A Randomized Controlled Trial</a>,” was authored by Darinka Korovljeva, Jelena Ostojic, Jovana Panic, Marijana Ranisavljev, Nikola Todorovic, David Nedeljkovic, Jovan Kuzmanovic, Milan Vranes, Valdemar Stajer, and Sergej M. Ostojic.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-pinpoint-cellular-mechanism-behind-psilocins-effects-on-brain-activity/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists pinpoint cellular mechanism behind psilocin’s effects on brain activity</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 19th 2025, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study provides a detailed look at how the psychedelic compound psilocin acts on specific neurons within the brain. The research suggests that psilocin directly excites a particular population of brain cells in the medial prefrontal cortex, a region associated with cognition and mood, by activating a specific receptor and its internal signaling pathway. The findings were published in <em><a href="https://doi.org/10.1038/s41398-025-03611-0" target="_blank">Molecular Psychiatry</a></em>.</p>
<p>Scientists are exploring psychedelic compounds for their potential to treat neuropsychiatric conditions like depression and anxiety. Psilocybin, which the body converts into psilocin, has shown promise in clinical settings. It is understood that these compounds primarily interact with a type of serotonin receptor known as the 5-HT2A receptor. </p>
<p>While brain imaging in humans has shown that psychedelics increase activity in the prefrontal cortex, the precise cellular mechanisms behind this effect have remained largely unclear. The researchers of this study sought to identify which specific neurons are affected by psilocin in this brain region and to map the exact molecular chain of events that leads to changes in their activity.</p>
<p>“Psychedelic compounds are currently being investigated for therapeutic application in a number of psychiatric diseases. Despite promising clinical results, the underlying mechanisms for these drugs are not yet completely understood and preclinical research (like our study) can shed light on the underlying neurobiological effects of these compounds, including psilocybin,” said study author <a href="https://www.thehermanlab.com/" target="_blank">Melissa Herman</a>, an associate professor at the University of North Carolina at Chapel Hill.</p>
<p>“This research was conducted with an MD-PhD student in the lab, Dr. Gavin Schmitz, and in collaboration with my colleague Dr. Bryan Roth who generated the transgenic mice used in the study.”</p>
<p>To investigate brain-wide effects, the research team first used functional magnetic resonance imaging, or fMRI, on mice. This technique measures changes in blood flow to infer brain activity. Anesthetized mice were administered either a neutral vehicle solution or psilocin at a dose of 2 mg/kg. The fMRI scans revealed that psilocin led to a significant increase in activity in the medial prefrontal cortex, particularly within two subregions known as the prelimbic and anterior cingulate cortices.</p>
<p>Following the brain-wide imaging, the scientists performed experiments on brain slices from mice to examine the activity of individual neurons. Using a technique called electrophysiology, which measures the electrical signals of cells, they first recorded from a general population of neurons called layer V pyramidal cells in the prefrontal cortex. </p>
<p>When psilocin was applied, the response was varied. About half of the neurons increased their firing rate, while around 30% decreased their firing, and the rest showed no change. This mixed result suggested that psilocin has different effects on different types of neurons in this area.</p>
<p>“I was initially surprised that psilocin produced variable effects in non-specified prefrontal cortex neurons (increases, decreases, or no change in activity) but consistently activated 5-HT2A neurons, but then we realized this was likely a central key to how these drugs engage the prefrontal cortex,” Herman told PsyPost.</p>
<p>To narrow their focus, the researchers used a genetically engineered mouse model in which neurons expressing the 5-HT2A receptor were labeled with a fluorescent marker. This allowed them to specifically identify and record from the cells that are the primary target of psychedelics. </p>
<p>When psilocin (10 µM) was applied directly to these identified 5-HT2A neurons, the effect was consistent and clear. The firing rate of these neurons reliably increased, approximately doubling to 200% of their baseline activity. Psilocin also made these neurons intrinsically more excitable, meaning it took less of an electrical stimulus to cause them to fire an action potential.</p>
<p>The researchers then explored whether this increased activity was a direct effect on the 5-HT2A neurons or an indirect effect caused by changes in the surrounding neural network. They measured the small, spontaneous electrical currents that neurons receive from their neighbors, which represent incoming excitatory or inhibitory signals. Psilocin did not alter the frequency or amplitude of these incoming signals, providing evidence that the drug acts directly on the 5-HT2A neurons to increase their excitability, rather than by altering the signals they receive from other cells.</p>
<p>To confirm that the 5-HT2A receptor was responsible for these effects, the team conducted a series of pharmacological experiments. They applied a compound called NBOH-2C-CN (200 nM), which selectively activates only the 5-HT2A receptor. This compound replicated the effects of psilocin, increasing the firing rate of the 5-HT2A neurons. </p>
<p>Next, they used a drug called M100907 (200 nM), which blocks the 5-HT2A receptor. When this blocker was applied before psilocin, the excitatory effect was completely prevented. These experiments together point to the 5-HT2A receptor as the key mediator of psilocin’s effects on these specific neurons.</p>
<p>The team also tested the involvement of a related receptor, the 5-HT2C receptor, which psilocin can also affect. Using a blocker for the 5-HT2C receptor did not prevent psilocin from increasing neuron firing, suggesting this receptor is not involved in the direct excitatory action. </p>
<p>The study also tested a novel, non-hallucinogenic compound that activates the 5-HT2A receptor. This compound also increased the firing of the neurons in a similar manner, suggesting that this mechanism of exciting neurons might be related to the therapeutic potential of these drugs, possibly separate from their hallucinogenic effects.</p>
<p>Finally, the scientists investigated the internal signaling pathway that the 5-HT2A receptor uses to change cell activity. When a receptor on a cell’s surface is activated, it triggers a cascade of events inside the cell. The 5-HT2A receptor is known to signal through a pathway involving a protein called Gαq. The researchers used an inhibitor called FR900359 (1 µM) that specifically blocks Gαq signaling. </p>
<p>When this inhibitor was present, both psilocin and the selective 5-HT2A activator failed to increase the firing of the neurons. This result indicates that the Gαq pathway is a necessary step in the chain of events linking 5-HT2A receptor activation to increased neuronal excitability.</p>
<p>“Psilocin, the active metabolite found in psilocybin, increases activity in the prefrontal cortex by acting at a specific population of neurons that contain the 5-HT2A receptor,” Herman said. “The prefrontal cortex and the 5-HT2A receptor are important for cognitive function and both are implicated in psychiatric disorders so the effects we see could be related to how these drugs may improve symptoms in human patients.”</p>
<p>The study has some limitations. The experiments were conducted in mice, and while animal models provide powerful insights into biological mechanisms, the results do not always translate directly to humans. The sample size for the fMRI portion of the study was also relatively small. Future research could aim to confirm these findings in larger animal cohorts and investigate how these cellular changes in the prefrontal cortex relate to the behavioral and therapeutic effects of psychedelics.</p>
<p>“The long-term goals of this research are to understand how psychedelic compounds change activity and processing across the brain, how these changes are the same (or different) in males and females, and how those changes are impacted by pre-existing conditions like stress or exposure to conditions that produce symptoms related to human psychiatric disease,” Herman explained.</p>
<p>These findings contribute a more detailed picture of how a psychedelic compound like psilocin works at the cellular and molecular level. By showing that psilocin directly excites a specific group of neurons in the prefrontal cortex through the 5-HT2A receptor and the Gαq pathway, this work helps to uncover the neurobiological mechanisms that may be involved in the compound’s potential therapeutic actions. </p>
<p>“Although psychedelics show significant promise for potential therapeutic use, some of the human data may be complicated by placebo effects or expectancy bias and these drugs are exceedingly unlikely to ‘cure everything’ or be effective in all individuals and therapeutic use must be supported by rigorous research,” Herman cautioned.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41398-025-03611-0" target="_blank">Psychedelic compounds directly excite 5-HT2A layer V medial prefrontal cortex neurons through 5-HT2A Gq activation</a>,” was authored by Gavin P. Schmitz, Yi-Ting Chiu, Mia L. Foglesong, Sarah N. Magee, Martin MacKinnon, Gabriele M. König, Evi Kostenis, Li-Ming Hsu, Yen-Yu I. Shih, Bryan L. Roth, and Melissa A. Herman.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><s><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></s></p>