<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/women-display-more-fluidity-in-sexual-attractions-and-fantasies-than-men/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Women display more fluidity in sexual attractions and fantasies than men</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 13th 2026, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new analysis of data from over 50,000 individuals indicates that men exhibit a more exclusive pattern of sexual attraction than women do. The research shows that while men strongly prefer one gender over the other, women tend to display a wider range of potential attractions. These results appear in <em><a href="https://doi.org/10.1080/00224499.2025.2545965" target="_blank" rel="noopener">The Journal of Sex Research</a></em>.</p>
<p>For decades, researchers have attempted to map the differences in how men and women experience sexual desire. Older investigations often relied on measuring physical signs of arousal in a laboratory setting. Those experiments frequently suggested that men are “gender-specific.” This means men typically show physical arousal only when viewing the gender they prefer.</p>
<p>In contrast, those same historical studies often found that straight women displayed physical arousal when viewing images of both men and women. This led to a prevailing theory that female sexuality is inherently less specific than male sexuality. However, it remained unclear if this pattern applied to psychological feelings of attraction or fantasies.</p>
<p>Sapir Keinan-Bar, Yoav Bar-Anan, and Daphna Joel, researchers affiliated with the School of Psychological Sciences and the Sagol School of Neuroscience at Tel Aviv University, conducted the current investigation to answer this question. They sought to determine whether the gender gap in specificity exists when measuring self-reported feelings and subconscious associations. They also aimed to see how these patterns manifest across different sexual orientations.</p>
<p>The team aggregated data from three separate large-scale online datasets. The total pool of participants included 56,892 individuals. The datasets contained information from volunteers who had visited research websites or utilized paid survey platforms.</p>
<p>The researchers analyzed responses to direct questions regarding sexual identity. Participants rated their level of attraction to men and women on numerical scales. They also reported the frequency of their erotic fantasies involving men or women. This allowed the authors to compare conscious reports of desire.</p>
<p>In addition to direct questions, the study utilized indirect measures of attraction. One primary tool used was the Implicit Association Test (IAT). This computerized task measures the strength of mental links between concepts.</p>
<p>During an IAT, a participant might sort words or images into categories like “Men” or “Women” and “I am sexually attracted” or “I am not sexually attracted.” The speed at which a participant sorts these items reveals their automatic associations. A faster response time suggests a stronger underlying mental connection.</p>
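<p>To make the scoring concrete, here is a minimal sketch of how such response-time data can be turned into a single association score. It assumes the conventional Greenwald-style D score (mean difference divided by pooled standard deviation); the reaction times and function name below are illustrative and not taken from the study:</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow:auto;"><code>import statistics

def iat_d_score(congruent_rts, incongruent_rts):
    """Greenwald-style D score: the gap in mean response time between
    the two sorting conditions, scaled by the pooled standard deviation."""
    diff = statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)
    pooled_sd = statistics.stdev(congruent_rts + incongruent_rts)
    return diff / pooled_sd

# Hypothetical reaction times in milliseconds. Faster sorting when the
# preferred gender shares a key with "I am sexually attracted" implies
# a stronger automatic association with that gender.
congruent = [612, 587, 640, 598, 605]      # attracted paired with preferred gender
incongruent = [743, 790, 722, 768, 751]    # attracted paired with non-preferred gender

print(round(iat_d_score(congruent, incongruent), 2))  # larger D = more gender-specific
</code></pre>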
<p>The researchers also used a variation called the Questionnaire-Based Implicit Association Test (qIAT). This version uses statements rather than single words or images. It assesses attraction to men and women separately rather than comparing them directly.</p>
<p>The analysis of this massive dataset revealed a consistent pattern. Men generally exhibited greater gender-specificity than women. This trend appeared across self-reported attraction, fantasy frequency, and the indirect association measures.</p>
<p>The data provided a detailed look at why this gap exists. Men reported very high levels of attraction toward their preferred gender. At the same time, they reported very low levels of attraction toward their non-preferred gender. This created a large statistical gap between their likes and dislikes.</p>
<p>Women showed a different profile. They reported slightly lower levels of attraction to their preferred gender compared to men. More importantly, they reported higher levels of attraction to their non-preferred gender than men did. This finding suggests that women are psychologically more open to their non-preferred gender.</p>
<p>The study clarified the nature of attraction among heterosexual women. Contrary to some interpretations of older physiological studies, straight women were not completely non-specific. They clearly preferred men over women in both self-reports and indirect measures.</p>
<p>However, the intensity of this preference was not as exclusive as the preference straight men held for women. Straight women demonstrated a distinct preference, but the separation was less extreme. The researchers noted that this pattern was robust across the different samples.</p>
<p>The study also examined individuals who identified as gay or lesbian. The researchers found that the gender gap in specificity was different in these groups. The large difference seen between straight men and women was often smaller, absent, or reversed among gay and lesbian participants.</p>
<p>For example, lesbian women showed levels of specificity that were sometimes similar to, or even higher than, those of gay men. This suggests that the high degree of exclusivity observed in straight men might be a unique characteristic of that specific group. It may not be a universal trait of male sexuality.</p>
<p>The analysis of sexual fantasies reinforced the findings regarding attraction. Men reported fantasies almost exclusively about their preferred gender. Women reported fantasies primarily about their preferred gender, but with more frequent exceptions than men.</p>
<p>The authors evaluated several theoretical explanations for these results. One common theory posits that men simply have a higher sex drive than women. The data presented a challenge to this idea.</p>
<p>If men simply had a higher sex drive, they should report higher attraction to everyone. Instead, women reported higher attraction to their non-preferred gender than men did. This indicates that the difference is not just about the total amount of sexual desire.</p>
<p>Another theory considers the impact of social norms. Society often imposes strict expectations on heterosexual masculinity. Men face social penalties for showing interest in other men.</p>
<p>This social pressure might encourage men to report extreme attraction to women and deny any attraction to men. This would create the highly specific pattern observed in the data. Women generally face less social stigma for expressing flexibility in their attractions.</p>
<p>The authors also discussed the theory of sexual objectification. Western culture frequently portrays women as sexual objects. This cultural conditioning might cause individuals of all genders to develop some degree of attraction toward women.</p>
<p>The results offered some support for this objectification hypothesis. Across the board, attraction to the non-preferred gender was higher when that gender was female. For instance, straight women reported more attraction to women than straight men reported to men.</p>
<p>The researchers pointed out the benefits of using detailed categories for sexual orientation. The study allowed participants to identify as “mostly straight” or “mostly gay” rather than just using three rigid categories. This nuance revealed that people in the “mostly” categories drove much of the flexibility seen in the data.</p>
<p>Women were more likely than men to identify with these “mostly” categories. Men were more likely to identify as “exclusively” straight or gay. This difference in self-identification aligns with the finding that men are more gender-specific in their attractions.</p>
<p>There are limitations to this research. The data came from online samples, which may not perfectly represent the general population. The participants were primarily English speakers and likely skewed younger and more liberal.</p>
<p>The measures relied on honesty in self-reporting and the assumption that reaction times reflect attraction. These are proxies for real-world experience. The study did not measure physiological arousal, so it cannot be directly compared to the older laboratory studies on that metric.</p>
<p>Future research could explore these patterns in different cultures. Examining societies with different gender norms could help separate biological tendencies from social conditioning. It would be useful to see if the high specificity of straight men persists in cultures with different concepts of masculinity.</p>
<p>The study, “<a href="https://doi.org/10.1080/00224499.2025.2545965" target="_blank" rel="noopener">Gender-Specificity in Sexual Attraction and Fantasies: Evidence from Self-Report and Indirect Measures</a>,” was authored by Sapir Keinan-Bar, Yoav Bar-Anan, and Daphna Joel.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/scientists-identify-five-distinct-phases-of-brain-structure-across-the-human-lifespan/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Scientists identify five distinct phases of brain structure across the human lifespan</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 13th 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research indicates that the structural organization of the human brain does not develop in a continuous, linear fashion but rather progresses through five distinct phases separated by specific turning points. By analyzing brain scans from thousands of individuals ranging from infants to ninety-year-olds, scientists identified significant shifts in neural architecture occurring around ages nine, 32, 66, and 83.</p>
<p>These findings, published in <em><a href="https://doi.org/10.1038/s41467-025-65974-8" target="_blank" rel="noopener">Nature Communications</a></em>, provide a new framework for understanding how the brain reorganizes itself throughout the human lifespan and suggest that structural adolescence may extend well into the third decade of life.</p>
<p>Previous research has established that brain structure and function evolve as people age. However, many of these studies focused on specific developmental windows, such as early childhood or old age, rather than the entire life course. When studies did examine broader age ranges, they often relied on models that assumed smooth, gradual trajectories, such as a simple peak in adulthood followed by a steady decline.</p>
<p>The authors of the new study argued that these approaches might miss complex, non-linear shifts in how the brain is organized. By mapping these structural changes across the full spectrum of life, the research team aimed to create a baseline for typical development.</p>
<p>“We know that the brain changes its connections across the lifespan, however, we’ve lacked clarity on the general patterns of these changes. This is important information because the way the brain is wired is related to neurodevelopment, mental health disorders, and neurological conditions,” said study author <a href="https://www.alexamousley.com/" target="_blank" rel="noopener">Alexa Mousley</a>, a postdoctoral research associate at the University of Cambridge.</p>
<p>“Therefore, knowing what the brain is expected to be doing at specific points in time, we might better understand what the brain is best at or most vulnerable to at specific ages. For example, two-thirds of people who will develop a mental health disorder do so before the age of 25.”</p>
<p>To construct this lifespan trajectory, the researchers aggregated data from nine distinct neuroimaging datasets. This compilation resulted in a total sample of 4,216 individuals, aged zero to 90 years. From this larger group, the team analyzed a subset of 3,802 scans from neurotypical individuals to establish a standard model of development.</p>
<p>The researchers utilized diffusion-weighted imaging, a type of MRI that tracks the movement of water molecules in brain tissue. This method allows scientists to map white matter tracts, which serve as the cabling system connecting different regions of the brain.</p>
<p>The research team employed graph theory to analyze the organization of these brain networks. They calculated twelve specific metrics to describe the topology of the brain. Topology refers to the way parts of a network are arranged and connected.</p>
<p>Key metrics included global efficiency, which measures how easily information can travel across the whole network, and modularity, which assesses how well the network is divided into specialized, self-contained communities.</p>
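<p>Both metrics are standard in network science and can be computed with off-the-shelf graph libraries. As a rough illustration only (a toy random network stands in for the weighted white-matter connectomes analyzed in the study), here is how they are obtained with Python's networkx:</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow:auto;"><code>import networkx as nx
from networkx.algorithms import community

# Toy small-world network standing in for a brain connectome
G = nx.watts_strogatz_graph(n=30, k=4, p=0.1, seed=42)

# Global efficiency: average inverse shortest-path length, i.e. how
# easily information can travel across the whole network.
efficiency = nx.global_efficiency(G)

# Modularity: how well the network divides into specialized,
# self-contained communities.
communities = community.greedy_modularity_communities(G)
mod = community.modularity(G, communities)

print(f"global efficiency: {efficiency:.3f}, modularity: {mod:.3f}")
</code></pre>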
<p>To make sense of this high-dimensional data, the researchers utilized a machine learning technique known as Uniform Manifold Approximation and Projection (UMAP). This method projects complex, multi-variable data into a lower-dimensional space, allowing researchers to visualize distinct patterns and trajectories.</p>
<p>By analyzing these projections, the research team developed an algorithm to identify “turning points.” These points represent ages where the trajectory of brain development shifts significantly, indicating a transition from one phase of organizational change to another.</p>
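<p>The article does not spell out the algorithm's internals, but the general recipe can be sketched: reduce the twelve metrics to a low-dimensional trajectory with UMAP, then scan candidate ages for the split that most improves a piecewise fit. The sketch below uses random stand-in data and a deliberately naive single-breakpoint search; it illustrates the idea, not the authors' implementation:</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow:auto;"><code>import numpy as np
import umap  # pip install umap-learn

rng = np.random.default_rng(0)
ages = np.sort(rng.uniform(0, 90, 500))
metrics = rng.normal(size=(500, 12))       # stand-in for 12 topology metrics per scan

# Project the 12-metric space down to one dimension so the lifespan
# trajectory can be scanned for shifts.
traj = umap.UMAP(n_components=1, random_state=0).fit_transform(metrics).ravel()

def best_breakpoint(x, y):
    """Naive turning-point search: keep the split age at which two
    separate linear fits explain the trajectory best."""
    best_age, best_sse = None, np.inf
    for i in range(20, len(x) - 20):       # keep both segments non-trivial
        sse = 0.0
        for seg in (slice(None, i), slice(i, None)):
            coef = np.polyfit(x[seg], y[seg], 1)
            resid = y[seg] - np.polyval(coef, x[seg])
            sse += float(resid @ resid)
        if best_sse > sse:
            best_age, best_sse = x[i], sse
    return best_age

print(f"strongest shift near age {best_breakpoint(ages, traj):.1f}")
</code></pre>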
<p>The analysis revealed four major turning points across the lifespan, occurring approximately at ages nine, 32, 66, and 83. These boundaries define five distinct epochs of structural development.</p>
<p>The first epoch spans from birth to age nine. This period corresponds with rapid changes in brain volume and the consolidation of neural networks. In early childhood, the brain overproduces synapses, which are subsequently pruned back to remove weak or unnecessary connections.</p>
<p>The study data indicates that topological efficiency decreases during this phase, while local segregation increases. The turning point at age nine aligns with the typical onset of puberty and significant increases in cognitive capabilities.</p>
<p>The second epoch, lasting from age nine to 32, represents a prolonged period of “adolescence” in terms of brain structure. During this phase, the brain focuses on increasing network integration.</p>
<p>The researchers observed that connections became more efficient, allowing for rapid communication across the entire brain. This period is characterized by a rise in “small-worldness,” a property where a network is both highly clustered locally and efficiently connected globally.</p>
<p>The turning point at age 32 was identified as the strongest shift in the entire lifespan. It marks the end of the period where efficiency increases and signals a transition into a new trajectory.</p>
<p>The third epoch extends from age 32 to 66, covering the majority of adulthood. In contrast to the rapid changes of the previous epoch, this phase is characterized by relative stability and a plateau in network integration.</p>
<p>The dominant trend during these years is an increase in segregation, meaning brain regions become more compartmentalized. This period aligns with observations from psychological research suggesting that personality and fluid intelligence tend to stabilize during middle adulthood.</p>
<p>The fourth epoch runs from age 66 to 83, labeled as early aging. The turning point at age 66 coincides with the typical onset of age-related health conditions, such as hypertension, which can impact brain health.</p>
<p>During this phase, the researchers observed a decline in the network integrity that was built up earlier in life. The trajectory shifts toward a simplification of the network structure, often associated with the degeneration of white matter connections.</p>
<p>The final epoch identified in the study begins at age 83 and extends to age 90, the upper limit of the sample. This late aging phase is marked by further reductions in global connectivity.</p>
<p>The topology of the brain shifts such that the importance of individual nodes in the network becomes more critical than the global connections. The researchers noted that the relationship between age and topological organization appears to weaken in this final stage, although the sample size for this oldest group was smaller than for younger groups.</p>
<p><img fetchpriority="high" decoding="async" class="aligncenter size-large wp-image-230440" src="https://www.psypost.org/wp-content/uploads/2026/01/AllEras-1024x217.png" alt="" width="1024" height="217" srcset="https://www.psypost.org/wp-content/uploads/2026/01/AllEras-1024x217.png 1024w, https://www.psypost.org/wp-content/uploads/2026/01/AllEras-300x64.png 300w, https://www.psypost.org/wp-content/uploads/2026/01/AllEras-768x163.png 768w, https://www.psypost.org/wp-content/uploads/2026/01/AllEras-1536x326.png 1536w, https://www.psypost.org/wp-content/uploads/2026/01/AllEras-2048x434.png 2048w, https://www.psypost.org/wp-content/uploads/2026/01/AllEras-750x159.png 750w, https://www.psypost.org/wp-content/uploads/2026/01/AllEras-1140x242.png 1140w" sizes="(max-width: 1024px) 100vw, 1024px"></p>
<p>One of the most notable outcomes of this analysis is the alignment of these structural turning points with major biological and social milestones. The shift at age nine corresponds with the beginning of hormonal changes associated with puberty and increased vulnerability to mental health issues.</p>
<p>The turning point at 32 aligns with the cessation of white matter growth and the peak of certain physical capabilities. The shift at 66 matches the retirement age in many societies and the increased prevalence of cognitive decline.</p>
<p>The findings indicate that “the brain develops non-linearly – it’s not one steady increase or decline,” Mousley told PsyPost. “The brain goes through phases where it changes differently than at other points in the lifespan.”</p>
<p>While the study provides a comprehensive overview of population-level trends, it contains limitations inherent to its design. The research utilized cross-sectional data, meaning it compared different individuals at different ages rather than tracking the same individuals over time. Consequently, it remains unclear if every individual follows these exact trajectories or if the turning points vary significantly from person to person.</p>
<p>The researchers also noted that the study focused exclusively on brain structure. The findings describe how neural “hardware” changes but do not directly measure behavior, maturity, or cognitive ability.</p>
<p>The extension of the “adolescent” structural phase to age 32 does not imply that individuals in their thirties exhibit adolescent behavior, but rather suggests that their brain networks are still optimizing for efficiency in a manner similar to younger adults.</p>
<p>“Many people are interested in the finding that adolescence goes until around 32 years old,” Mousley said. “It is important to note that we ONLY studied changes in brain architecture. Therefore, our work does not support anything about behavior or cognition.”</p>
<p>The researchers suggest that future research should apply these methods to longitudinal datasets to validate the trajectories at the individual level. Additionally, investigating how these topological turning points differ in individuals with neurodevelopmental disorders or mental health conditions could provide new insights into the biological underpinnings of these challenges.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41467-025-65974-8" target="_blank" rel="noopener">Topological turning points across the human lifespan</a>,” was authored by Alexa Mousley, Richard A. I. Bethlehem, Fang-Cheng Yeh, and Duncan E. Astle.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/dead-at-24-from-dementia-how-a-young-mans-final-gift-could-change-brain-research-forever/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Dead at 24 from dementia – how a young man’s final gift could change brain research forever</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 12th 2026, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A UK man who is thought to be Britain’s youngest dementia sufferer recently passed away from the disease at only 24 years old. Andre Yarham, from Norfolk in England, was just 22 when he was first diagnosed with dementia.</p>
<p>At the age of 24, most brains are still settling into adulthood. But Yarham’s brain looked decades older — resembling the <a href="https://www.thesun.co.uk/health/37828227/uks-youngest-dementia-patient-dies-brain-70-year-old/">brain of a 70-year-old</a>, according to the MRI scan that helped diagnose him with the disease.</p>
<p>Yarham initially <a href="https://www.independent.co.uk/bulletin/news/youngest-dementia-andre-yarham-death-b2896596.html">began exhibiting symptoms</a> of dementia in 2022, with family saying he had become increasingly forgetful and would sometimes have a blank expression on his face.</p>
<p>In the final stages of his life, he lost his speech, could no longer care for himself, behaved “inappropriately” and was confined to a wheelchair.</p>
<p>Dementia is usually associated with old age. However, some forms of dementia can strike astonishingly early and move frighteningly fast. Take <a href="https://doi.org/10.3390/ijms241411732">frontotemporal dementia</a>, for instance. This was the <a href="https://www.independent.co.uk/news/uk/home-news/youngest-dementia-patient-andre-yarham-norfolk-b2896579.html">form of dementia</a> that Yarham was diagnosed with.</p>
<p>Unlike Alzheimer’s disease, which tends to affect memory first, frontotemporal dementia attacks the parts of the brain involved in personality, behaviour and language. These regions sit behind the forehead and above the ears in the <a href="https://www.sciencedirect.com/science/article/abs/pii/095943889390204C?via%3Dihub">frontal</a> and <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC3420617/">temporal</a> lobes.</p>
<p>These areas help us plan, control impulses, understand speech and express ourselves. When they’re damaged, people may change in ways that are deeply distressing for families – becoming withdrawn, impulsive or unable to communicate.</p>
<p>Frontotemporal dementia is a less common form of dementia, thought to account for around <a href="https://www.dementiauk.org/wp-content/uploads/dementia-uk-understanding-frontemporal-dementia.pdf">one in 20</a> cases. What makes it especially cruel is that it can appear in young adulthood.</p>
<p>In many cases, frontotemporal dementia has a <a href="https://doi.org/10.1177/0891988710383574">strong genetic component</a>. Changes in specific genes can disrupt how brain cells handle proteins. Instead of these proteins being broken down and recycled, they clump together inside the neurons (brain cells) – interfering with their ability to function and survive. Over time, affected brain cells stop working and die. As more cells are lost, the brain tissue itself shrinks.</p>
<p>Why this process can sometimes begin so early in life is still not fully understood. However, when a person has a <a href="https://doi.org/10.1016/j.brainres.2016.04.004">powerful genetic mutation</a>, the disease does not need decades to unfold. Instead, the mutation allows the damage to accelerate and the brain’s usual resilience fails.</p>
<p>Brain scans carried out while Yarham was alive showed striking shrinkage for someone so young. But to compare Yarham’s brain with that of someone in their 70s would be misleading. His brain had not “aged faster” in the usual sense. Instead, large numbers of neurons had been lost in a short period of time because of the disease.</p>
<p>In <a href="https://doi.org/10.3389/fnagi.2017.00412">healthy ageing</a>, the brain changes slowly. Certain regions become a little thinner, but the overall structure remains intact for decades. But in aggressive forms of dementia, whole brain networks collapse at once.</p>
<p>In frontotemporal dementia, the frontal and temporal lobes can shrink dramatically. As these regions deteriorate, people lose the abilities that those areas support – including speech, emotional control and decision-making abilities. This would explain why Yarham lost language so late but so suddenly – and why his need for full-time care escalated so quickly.</p>
<h2>Brain donation</h2>
<p>Yarham’s family decided to <a href="https://www.bbc.co.uk/news/articles/c2k9xp5je1go.amp">donate his brain to research</a>. This is an extraordinary gift – one that transforms tragedy into hope for others.</p>
<p>Dementia currently has no cure. Once symptoms begin, there’s no way to stop them and treatments which slow symptoms have limited effects. Part of the reason for this is because the brain is vastly complex and still not entirely understood. Every donated brain helps close that gap.</p>
<p>Brains affected by very early dementia are exceptionally rare. Each donated brain allows scientists to study, in fine detail, what went wrong at the level of cells and proteins. Although brain scans can tell us what brain parts have been lost, only donated tissue can reveal why.</p>
<p>Researchers can examine which proteins accumulated, which cell types were most vulnerable and how inflammation and immune responses may have contributed to the damage. That knowledge feeds directly into efforts to develop treatments that slow, stop or even prevent dementia.</p>
<p>The family’s decision to allow scientists to study tissue from such a rare, early-onset case of frontotemporal dementia could help unlock secrets that may guide treatments for generations to come.</p>
<p>As a neuroscientist, I have been asked how something like this can happen to someone so young. The honest answer is that we are only beginning to understand the biology that makes some brains vulnerable from the very start.</p>
<p>Cases like this underline why sustained investment in brain research, and the generosity of people willing to donate tissue, matter so deeply. The 24-year-old’s story is a reminder that dementia is not a single disease, and not a problem confined to old age. Understanding why it happened will be one small step toward making sure it does not happen again.</p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/dementia-at-just-24-years-old-how-britains-youngest-sufferer-may-help-researchers-understand-the-disease-272972">original article</a>.</em></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/social-media-not-gaming-tied-to-rising-attention-problems-in-teens-new-study-finds/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Social media, not gaming, tied to rising attention problems in teens, new study finds</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 12th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>The digital revolution has become a vast, unplanned experiment – and children are its most exposed participants. As ADHD diagnoses rise around the world, a key question has emerged: could the growing use of digital devices be playing a role?</p>
<p>To explore this, <a href="https://doi.org/10.1542/pedsos.2025-000922">we studied</a> more than 8,000 children, from when they were around ten until they were 14 years of age. We asked them about their digital habits and grouped them into three categories: gaming, TV/video (YouTube, say) and social media.</p>
<p>The latter included apps such as TikTok, Instagram, Snapchat, X, Messenger and Facebook. We then analysed whether usage was associated with long-term change in the two core <a href="https://www.nhs.uk/conditions/adhd-children-teenagers/">symptoms</a> of ADHD: inattentiveness and hyperactivity.</p>
<p>Our main finding was that social media use was associated with a gradual increase in inattentiveness. Gaming or watching videos was not. These patterns remained the same even after accounting for children’s genetic risk for ADHD and their families’ income.</p>
<p>We also tested whether inattentiveness might cause children to use more social media instead. It didn’t. The direction ran one way: social media use predicted later inattentiveness.</p>
<p>The mechanisms by which digital media affect attention are unknown. But the absence of a negative effect from other screen activities means we can rule out any general, negative effect of screens, as well as the popular notion that all digital media produce <a href="https://carecounseling.healthcare/blog/dopamine-hits-what-they-are-and-why-everyones-talking-about-them-by-care-counseling-inc">“dopamine hits”</a> that then disrupt children’s attention.</p>
<p>As cognitive neuroscientists, we could make an educated guess about the mechanisms. Social media introduces constant distractions, preventing sustained attention to any task.</p>
<p>If it is not the messages themselves that distract, the mere thought of whether a message has arrived can act as a mental distraction. These distractions impair focus in the moment, and when they persist for months or years, they may also have long-term effects.</p>
<p>Gaming, on the other hand, takes place during limited sessions, not throughout the day, and involves a constant focus on one task at a time.</p>
<p>Measured statistically, the effect of social media was not large. It was not enough to push a person with normal attention into ADHD territory. But if the entire population becomes more inattentive, many will cross the diagnostic threshold.</p>
<p>Theoretically, an increase of one hour of social media use in the entire population would increase the diagnoses <a href="https://publications.aap.org/pediatricsopenscience/article/doi/10.1542/pedsos.2025-000922/205729/Digital-Media-Genetics-and-Risk-for-ADHD-Symptoms?autologincheck=redirected">by about 30%</a>. This is admittedly a simplification, since diagnoses depend on many factors, but it illustrates how even an effect that is small at the individual level can have a significant effect when it affects an entire population.</p>
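<p>The logic behind that extrapolation is worth making explicit. If inattentiveness is roughly normally distributed and a diagnosis corresponds to crossing a fixed cutoff, even a small population-wide shift pushes a disproportionate number of people over the line. A back-of-the-envelope sketch with invented numbers (the 0.13 standard-deviation shift below is hypothetical, chosen only to show the mechanism):</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow:auto;"><code>from scipy.stats import norm

# Assume standardized inattentiveness scores (mean 0, SD 1) and a
# diagnostic cutoff at the 95th percentile.
cutoff = norm.ppf(0.95)

before = norm.sf(cutoff)                # fraction diagnosed before: 5%
shift = 0.13                            # hypothetical small population-wide shift, in SDs
after = norm.sf(cutoff, loc=shift)      # same cutoff, shifted distribution

print(f"before: {before:.1%}  after: {after:.1%}  "
      f"relative increase: {after / before - 1:.0%}")
</code></pre>
<p>Under these invented assumptions, the diagnosed fraction rises from 5% to roughly 6.5% – an increase of about 30% – which is the flavor of the population-level arithmetic described above.</p>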
<p>A lot of data suggests that daily social media use has grown by at least one hour over the last decade or two. Twenty years ago, social media barely existed. Now, teenagers are online for about <a href="https://news.gallup.com/poll/512576/teens-spend-average-hours-social-media-per-day.aspx">five hours per day</a>, mostly on social media.</p>
<p>The percentage of teenagers who <a href="https://actforyouth.org/adolescence/demographics/internet.cfm">claim to be “constantly online”</a> has increased from 24% in 2015 to 46% in 2023. Given that social media use has risen from essentially zero to around five hours per day, it may explain a substantial part of the increase in ADHD diagnoses during the past 15 years.</p>
<h2>The attention gap</h2>
<p>Some argue that the rise in the number of ADHD diagnoses reflects greater awareness and reduced stigma. That may be part of the story, but it doesn’t rule out a genuine increase in inattention.</p>
<p>Also, studies claiming that symptoms of inattention have not increased often examined children who were probably <a href="https://acamh.onlinelibrary.wiley.com/doi/abs/10.1111/jcpp.13700">too young to own a smartphone</a>, or covered a period of years that <a href="https://acamh.onlinelibrary.wiley.com/doi/abs/10.1111/jcpp.12882?casa_token=iFCzyGKPAg4AAAAA:ZO3AhZYxMjtpc4q-45aHhmg3hFAWFKgIYPDHB1mZGevAR8_KDcbBzxE_3cWbv6pyYC0Y_0YrxRhsCRFJ">mostly predates</a> the avalanche of scrolling.</p>
<p>Social media probably increases inattention, and social media use has rocketed. What now? The US requires children to be <a href="https://www.ftc.gov/business-guidance/resources/complying-coppa-frequently-asked-questions">at least 13</a> to create an account on most social platforms, but these restrictions are easy to outsmart.</p>
<p>Australia is currently going the furthest. From <a href="https://www.esafety.gov.au/about-us/industry-regulation/social-media-age-restrictions">December 10 2025</a>, media companies will be required to ensure that users are 16 years or above, with heavy penalties for companies that do not comply. Let’s see what effect that legislation will have. Perhaps the rest of the world should follow the Australians.</p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/social-media-not-gaming-tied-to-rising-attention-problems-in-teens-new-study-finds-271144">original article</a>.</em></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/insecure-attachment-is-linked-to-machiavellian-personality-traits/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Insecure attachment is linked to Machiavellian personality traits</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 12th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new analysis of psychological data suggests that manipulative personality traits may stem from deep-seated insecurities regarding social bonding. Researchers found that individuals who struggle to form secure emotional attachments are more likely to exhibit characteristics associated with Machiavellianism.</p>
<p>These findings indicate that dark personality traits may function as defensive mechanisms developed in response to unstable relationships. The study was published in the <em><a href="https://doi.org/10.1177/02654075251331679" target="_blank" rel="noopener">Journal of Social and Personal Relationships</a></em>.</p>
<p>To understand these findings, it is necessary to look at two distinct areas of psychological research. The first area is the concept of Machiavellianism. This construct draws its name from Niccolò Machiavelli and his political philosophy. It represents a personality trait defined by a willingness to manipulate others, a cynical view of human nature, and a belief that the ends justify the means.</p>
<p>Psychologists often group Machiavellianism with narcissism and psychopathy under the umbrella of the “Dark Triad.” People with high levels of this trait are often described as having a “cool syndrome.” They tend to detach emotionally from others to maintain control. They view other people as tools to be used rather than individuals to be connected with.</p>
<p>The second concept is attachment theory, which was originally developed by John Bowlby and Mary Ainsworth. This theory posits that the bonds formed between children and their caregivers create a blueprint for all future relationships. This blueprint is known as an internal working model.</p>
<p>When caregivers are responsive and supportive, children typically develop a secure attachment style. They grow up viewing themselves as worthy of love and others as trustworthy. However, when care is inconsistent or negligent, children may develop insecure attachment styles.</p>
<p>There are several forms of insecure attachment. Anxious attachment involves a fear of rejection and a constant need for validation. Avoidant attachment involves a discomfort with intimacy and a preference for excessive independence.</p>
<p>Previous research has attempted to link these two psychological constructs. The logic is that people who do not trust others due to early attachment failures might resort to manipulation to get their needs met. However, prior studies produced inconsistent results. Some papers found strong links, while others found little to no connection.</p>
<p>To clarify this relationship, a team of researchers led by Yihan Zhang from the University of Macau decided to aggregate the existing data. They performed a meta-analysis. This is a statistical method that combines the results of multiple independent studies to identify broader trends.</p>
<p>The researchers searched through six major academic databases for relevant literature. They looked for studies that measured both insecure attachment and Machiavellianism using validated psychological scales. They applied strict criteria for inclusion.</p>
<p>The team excluded studies involving clinical patients to focus on the general population. They also ensured that the statistical data in the papers could be converted into a common format for comparison. After a comprehensive screening process, they selected 27 articles.</p>
<p>These articles provided 86 different effect sizes. The total sample size across all studies included 13,791 participants. The participants ranged from teenagers to middle-aged adults and came from various regions, including North America, Europe, and Asia.</p>
<p>The researchers utilized a three-level random-effects model to analyze the data. This statistical approach allowed them to account for variations within individual studies as well as differences between the studies.</p>
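<p>The three-level structure is a refinement; the core of any random-effects meta-analysis is an inverse-variance weighted average that adds a between-study variance term. As a simplified illustration only (a standard two-level DerSimonian-Laird estimator on invented correlations, not the authors' model), correlations are typically Fisher z-transformed before pooling:</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow:auto;"><code>import numpy as np

# Hypothetical study-level correlations and sample sizes
r = np.array([0.20, 0.35, 0.12, 0.28, 0.22])
n = np.array([300, 150, 800, 220, 500])

z = np.arctanh(r)                  # Fisher z-transform
v = 1.0 / (n - 3)                  # sampling variance of each z

# DerSimonian-Laird estimate of between-study variance tau^2
w = 1.0 / v
z_fixed = np.sum(w * z) / np.sum(w)
q = np.sum(w * (z - z_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(z) - 1)) / c)

# Random-effects pooling: weights shrink toward equality as tau^2 grows
w_re = 1.0 / (v + tau2)
z_pooled = np.sum(w_re * z) / np.sum(w_re)
print(f"pooled correlation: {np.tanh(z_pooled):.3f}")
</code></pre>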
<p>The primary finding was a positive correlation between insecure attachment and Machiavellianism. This correlation was statistically significant. It suggests that as an individual’s level of attachment insecurity rises, their tendency toward Machiavellian behavior also increases.</p>
<p>The researchers propose that this link exists because insecurely attached people hold biased mental representations. They often filter out positive social cues and amplify negative ones. This reinforces a skeptical view of human nature.</p>
<p>If a person expects others to be hostile or unreliable, they may adopt manipulative strategies as a form of self-protection. This aligns with the Machiavellian worldview that it is better to manipulate than to be manipulated.</p>
<p>Beyond the general link, the researchers conducted a moderator analysis. They wanted to see if specific types of insecure attachment were more strongly connected to Machiavellianism than others. This deep dive revealed nuanced results.</p>
<p>The analysis showed that “disorganized” and “fearful-avoidant” attachment patterns had the strongest associations with Machiavellian traits. Disorganized attachment is often the result of childhood experiences where the caregiver was a source of fear.</p>
<p>People with disorganized attachment possess chaotic internal working models. They simultaneously desire intimacy and fear it. The researchers explain that this internal conflict can lead to suspicion and hostility. These individuals may prioritize self-protection above all else.</p>
<p>Fearful-avoidant individuals feel unworthy of love and believe others are incapable of loving them. They often experience a low sense of social belonging. The study suggests these individuals may use manipulation to survive socially because they lack faith in genuine emotional bonds.</p>
<p>The study also examined anxious attachment. Anxiously attached people tend to be over-reliant on others. They may use manipulative tactics to induce guilt or solicit attention. The analysis confirmed a link here as well, though it operates differently than the avoidant types.</p>
<p>The researchers found that the method used to measure these traits influenced the results. Different psychological scales capture slightly different aspects of personality. However, the connection remained robust across various measurement tools.</p>
<p>These findings imply that Machiavellianism is not simply a malicious trait chosen voluntarily. It may be rooted in hostile family environments or negative early experiences. Inappropriate parenting practices can impede the development of amicable social strategies.</p>
<p>When children are exposed to environments where their needs are ignored, they develop coping mechanisms. Machiavellianism may serve as a protective shell. It allows insecure individuals to navigate a social world they perceive as dangerous.</p>
<p>There are limitations to this study that should be noted. The analysis was restricted to articles written in English. This may have excluded relevant data from other linguistic regions.</p>
<p>The number of included studies was relatively small at 27. While the sample size of participants was large, a higher number of studies would allow for more detailed moderator analyses.</p>
<p>Some potential moderators could not be fully explored due to insufficient data. For example, the study could not fully assess how gender might influence the strength of the relationship between attachment and manipulation.</p>
<p>The researchers also noted that the study relies on correlational data. This means they cannot definitively say that insecure attachment causes Machiavellianism. It only shows that the two are related.</p>
<p>Future research is needed to establish causality. Longitudinal studies that follow individuals from childhood into adulthood would be particularly useful. Such studies could track how early attachment styles evolve into adult personality traits.</p>
<p>The researchers suggest that future studies should also explore other variables. Factors such as gender and specific attachment figures (parents versus romantic partners) might play a role.</p>
<p>Despite these caveats, the study offers practical implications for mental health treatment. It highlights the importance of fostering attachment security.</p>
<p>Therapists treating individuals with high levels of manipulativeness might benefit from focusing on underlying insecurities. Helping a patient develop a more secure attachment style could theoretically reduce their reliance on Machiavellian tactics.</p>
<p>If clinicians can address the root cause—the fear of rejection or betrayal—the need for defensive manipulation may decrease. This suggests a potential pathway for intervention.</p>
<p>The study, “<a href="https://doi.org/10.1177/02654075251331679" target="_blank" rel="noopener">The relationship between insecure attachment and Machiavellianism: A meta-analysis</a>,” was authored by Yihan Zhang, Yihui Wang, Xiyu Jiang, Xinyun Li, and Juan Zhang.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/the-unexpected-interaction-between-cbd-and-thc-in-caffeinated-beverages/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">The unexpected interaction between CBD and THC in caffeinated beverages</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 12th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study suggests that adding cannabidiol, commonly known as CBD, to products containing caffeine and delta-9-tetrahydrocannabinol (THC) can intensify the drug’s psychoactive effects and increase impairment. The research indicates that while caffeine alone does not substantially alter the body’s processing of THC, the addition of CBD changes how the body metabolizes the intoxicating compound. These findings were published in the journal <em><a href="https://doi.org/10.1038/s41386-025-02232-x" target="_blank">Neuropsychopharmacology</a></em>.</p>
<p>The legal landscape for cannabis is shifting rapidly across the United States. State-level legalization and federal provisions such as the 2018 Farm Bill have led to a proliferation of cannabis-derived products. One emerging trend in the commercial marketplace is the sale of beverages that mix cannabinoids with caffeine. These products are often marketed as energy boosters or alternatives to alcoholic drinks.</p>
<p>Despite the growing availability of these mixtures, very little scientific data exists regarding how these chemical constituents interact within the human body. Public health officials and regulators lack precise information on whether combining stimulants like caffeine with depressants or psychoactive compounds like THC creates unique safety risks. Most existing knowledge comes from animal studies, which have suggested that caffeine might worsen memory deficits caused by THC.</p>
<p>To address this gap in knowledge, a team of researchers led by Justin C. Strickland at the Johns Hopkins University School of Medicine conducted a controlled investigation. They sought to evaluate the isolated and combined effects of these substances. The researchers aimed to simulate real-world usage patterns to understand the potential risks facing consumers who purchase these pre-mixed cocktails.</p>
<p>The study recruited twenty healthy adults to participate in the experiment. The group was evenly divided between ten men and ten women. All participants had a history of using cannabis and caffeine, ensuring they were familiar with the substances, but they were not heavy or daily users. These selection criteria helped ensure the results would apply to the general population of casual consumers rather than just chronic users with high tolerance.</p>
<p>The investigation utilized a double-blind, randomized, placebo-controlled crossover design. This means that neither the participants nor the research staff knew which drug combination was being administered during a given session. Additionally, every participant underwent all four experimental conditions at different times. This rigorous structure allowed the researchers to compare an individual’s reaction to the drug mixtures against their own baseline, reducing the influence of personal biological differences.</p>
<p>The four conditions included a placebo session, a session with only THC, a session with THC and caffeine, and a final session combining THC, caffeine, and CBD. The researchers chose to use a cumulative dosing schedule rather than a single large dose. Participants received three separate doses administered over a two-hour period. This method was designed to model how a consumer might slowly drink a cannabis-infused beverage at a social event.</p>
<p>The total dosages administered were 7.5 milligrams of THC, 180 milligrams of caffeine, and 105 milligrams of CBD. These amounts were selected to reflect the potency of products currently available on retail shelves. Throughout the sessions, the researchers collected a wide array of data. They measured subjective feelings, such as how “high” or anxious a participant felt. They also tracked physiological signs like heart rate and blood pressure.</p>
<p>Performance impairment was another key metric. The researchers tested reaction time, memory, and balance. They also utilized a sophisticated driving simulator to assess whether the participants could safely operate a vehicle. Finally, the team collected blood samples to analyze the pharmacokinetics of the drugs. Pharmacokinetics refers to how a drug is absorbed, distributed, metabolized, and excreted by the body.</p>
<p>The results regarding the combination of just THC and caffeine were somewhat unexpected given previous animal models. The researchers found that adding caffeine to THC did not substantially change the subjective experience of being high. Participants did not report feeling more intoxicated, nor did they feel significantly more alert compared to when they took THC alone. The metabolic data confirmed this observation. Caffeine did not alter the concentration of THC in the blood.</p>
<p>However, the team did observe a subtle but potentially risky signal regarding decision-making. When participants consumed the THC and caffeine mixture, they expressed a slightly higher willingness to drive compared to when they consumed THC alone. This increased confidence occurred despite the fact that their actual driving performance on the simulator remained impaired. This disconnect suggests that caffeine might mask feelings of sedation without restoring the physical and mental skills necessary for safety.</p>
<p>The findings shifted dramatically when CBD was introduced to the combination. When participants consumed the mixture of THC, caffeine, and CBD, they reported stronger subjective effects. They felt “higher” and experienced more anxiety than they did with THC alone. Objective tests mirrored these reports. Performance on tasks measuring cognitive and motor skills degraded further in the triple-combination condition.</p>
<p>The blood analysis provided a biological mechanism for these intensified effects. The researchers discovered that the presence of CBD led to higher concentrations of THC in the blood plasma. Additionally, levels of 11-OH-THC were elevated. 11-OH-THC is a potent metabolite produced when the liver breaks down THC. It is known to be more psychoactive than THC itself and readily crosses the blood-brain barrier.</p>
<p>This pharmacokinetic interaction suggests that CBD inhibits the enzymes in the gut or liver that usually break down THC. By slowing this breakdown process, CBD essentially increases the bioavailability of the psychoactive compound. Consequently, a consumer drinking a beverage with both CBD and THC might absorb a higher effective dose of THC than if they had consumed the THC product in isolation.</p>
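<p>A toy pharmacokinetic model shows why slower breakdown translates into higher exposure. The sketch below uses the standard one-compartment Bateman equation with made-up rate constants and arbitrary units; the "with CBD" condition simply assumes a lower elimination rate and a higher bioavailable fraction, an assumption for illustration rather than the study's fitted parameters:</p>
<pre style="font-family:Menlo, Consolas, monospace; font-size:12px; background:#f6f6f6; border:1px solid #ddd; border-radius:4px; padding:8px; overflow:auto;"><code>import numpy as np

def plasma_conc(t, dose, ka, ke, f):
    """Bateman equation for an oral dose: absorption at rate ka,
    elimination at rate ke, bioavailable fraction f (volume of
    distribution folded into the arbitrary concentration units)."""
    return f * dose * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0, 8, 200)                                    # hours
thc_alone = plasma_conc(t, dose=7.5, ka=1.0, ke=0.6, f=0.10)
with_cbd = plasma_conc(t, dose=7.5, ka=1.0, ke=0.4, f=0.15)   # hypothetical inhibition

print(f"peak exposure alone: {thc_alone.max():.3f}, with CBD: {with_cbd.max():.3f}")
</code></pre>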
<p>The study has several implications for consumers and regulators. It challenges the common marketing narrative that CBD is a purely non-intoxicating substance that can “mellow out” the effects of THC. In the context of oral consumption, CBD appears to act as a potentiator. It makes the psychoactive experience stronger and potentially more disorienting.</p>
<p>There are important caveats to consider when interpreting these results. The study focused on a specific ratio of cannabinoids and caffeine. It is possible that different formulations could produce different outcomes. The sample size of twenty people is relatively small, although the within-subject design strengthens the statistical validity. Furthermore, the study only examined oral administration. The metabolic interactions observed here might not occur if the products were inhaled, as smoking or vaping bypasses the digestive system’s first-pass metabolism.</p>
<p>Future research is needed to explore a wider range of doses. Understanding whether lower amounts of CBD still trigger this metabolic interaction is essential. The researchers also highlight the need to investigate other common additives found in these beverages, such as vitamins or taurine. As the market for cannabis beverages expands, more granular data will be necessary to inform safety guidelines.</p>
<p>The authors note that these interactions should be a factor in regulatory decision-making. Current labels often list the milligrams of THC and CBD separately. However, this study indicates that the numbers do not tell the whole story. The biological interaction between the ingredients alters the final effect on the user.</p>
<p>In conclusion, the researchers state regarding the CBD interaction: “The robust alteration of Δ9-THC-induced effects and Δ9-THC pharmacokinetics by CBD further emphasizes the importance of considering full cannabinoid profiles. Broadly, these data highlight the importance of considering drug combinations and interactions in future cannabis regulatory decision-making.”</p>
<p>The study, “<a href="https://doi.org/10.1038/s41386-025-02232-x" target="_blank">Effect of caffeine and cannabidiol (CBD) co-administration on Δ9-tetrahydrocannabinol (Δ9-THC) subjective effects, performance impairment, and pharmacokinetics</a>,” was authored by Justin C. Strickland, Hayleigh E. Tilton, Noah M. Patton, Ryan Vandrey, C. Austin Zamarripa, Tory R. Spindle, Dustin C. Lee, Cecilia L. Bergeria, David Wolinsky, Jost Klawitter, Cristina Sempio, Jorge Campos-Palomino, Uwe Christians, Matthew T. Feldner, Jessica G. Irons and Marcel O. Bonn-Miller.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/mortality-rates-increase-in-u-s-counties-that-vote-for-losing-presidential-candidates/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Mortality rates increase in U.S. counties that vote for losing presidential candidates</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 12th 2026, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>An analysis of U.S. county-level voting and mortality data found that, after elections in which political leadership changed, mortality rates increased in counties that supported the losing presidential candidate compared to counties that supported the winning candidate. The increase averaged 7 deaths per 100,000 people. The research was published in <a href="https://doi.org/10.1371/journal.pone.0334507"><em>PLoS One</em></a>.</p>
<p>Recent years have seen an increase in political polarization in the United States, and hostility toward political opponents has grown substantially. A 2016 study reported that both Democrats and Republicans say the opposing party evokes “very unfavorable” feelings, such as frustration, fear, and anger.</p>
<p>One consequence of this polarization is that communities are becoming increasingly homogeneous politically. Political minorities within such communities tend to be apprehensive about disclosing their political preferences and are more likely to face discrimination and social isolation.</p>
<p>Political views also affect expectations about future economic outcomes. Individuals tend to be more optimistic about the economy when they are affiliated with the party that controls the government. Studies indicate that this tendency has increased during the last decade.</p>
<p>Study author Sris Chatterjee and his colleagues wanted to investigate whether a shift in political leadership and a lack of political representation can affect health outcomes. They note that holding political opinions that differ from those of the ruling party can increase stress, anxiety, and feelings of social isolation, which might in turn have adverse effects on physical and mental health.</p>
<p>The study authors used county-level mortality rates as an indicator of health. They combined these mortality rates with data on voting in Presidential elections.</p>
<p>For each county, the researchers calculated the share of voters who voted for the Democratic candidate and the share who voted for the Republican candidate. If a county primarily voted for a candidate who did not win the election, it was considered to have experienced an electoral loss.</p>
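<p>The classification rule is straightforward; a minimal sketch, with hypothetical field names rather than the study’s actual dataset, might look like this:</p>
<pre><code># Minimal sketch of the county classification rule described above.
# Field names and values are hypothetical, not the study's dataset.

def classify_county(dem_share, rep_share, national_winner):
    """Label a county 'losing' if its majority backed the candidate
    who lost the national election, else 'winning'."""
    county_pick = "DEM" if dem_share > rep_share else "REP"
    return "winning" if county_pick == national_winner else "losing"

# A county that went 58% Republican in an election won by the
# Democratic candidate counts as a losing county:
print(classify_county(dem_share=0.42, rep_share=0.58,
                      national_winner="DEM"))  # prints "losing"
</code></pre>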
<p>The authors focused on the elections of President Obama and the first election of President Trump. These were elections in which political leadership changed, meaning that the incoming president was not from the same party as his predecessor. President Obama was elected after a Republican president (Bush), while President Trump was elected after a Democratic president (Obama).</p>
<p>The researchers considered the total number of membership associations per 10,000 people in a county as a measure of individuals’ social involvement in the community. They also used data from two surveys that asked about health to produce additional measures of county-level health.</p>
<p>Results showed that mortality rate trends in counties supporting the winning candidate and counties supporting the losing candidate diverged after the elections that resulted in political change. More specifically, losing counties experienced, on average, an increase in the age-adjusted mortality rate of 7 deaths per 100,000 population.</p>
<p>Previous studies cited by the authors found that a one percentage point increase in unemployment is associated with a decrease in the mortality rate of 4.6 deaths per 100,000, and that the mortality rate fell by 18 deaths per 100,000 during the Great Recession (2007-2009). Counterintuitive as it may seem, mortality has repeatedly been found to decline during economic downturns.</p>
<p>Thus, in terms of magnitude, counties supporting the losing candidate experienced an increase in mortality roughly equivalent to the absolute change found when unemployment increases by 1.5 percentage points. The increase was a bit less than half the magnitude of the change that occurred during the Great Recession.</p>
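<p>As a quick back-of-the-envelope check of those comparisons (a calculation from the reported figures, not one taken from the paper), dividing the observed increase by each benchmark reproduces both ratios:</p>
<pre><code># Back-of-the-envelope check of the magnitude comparisons above.
observed_increase = 7.0        # deaths per 100,000 in losing counties
per_point_unemployment = 4.6   # |change| per 1-point unemployment rise
great_recession_change = 18.0  # |change| during 2007-2009

print(observed_increase / per_point_unemployment)   # ~1.52 points
print(observed_increase / great_recession_change)   # ~0.39, under half
</code></pre>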
<p>“Evidence suggests that political sentiments and social isolation may be important factors underlying our findings. Indeed, we document after turnover Presidential elections a decrease in the degree of social interactions in the losing counties. Also, we identify a further increase in sudden causes of death around crucial political events and a worsening of the mental health of the individuals,” the study authors concluded.</p>
<p>The study contributes to the scientific understanding of the links between political processes and health. However, the authors note that they could not estimate how much the increase in sudden deaths contributed to the observed rise in mortality rates, and they could not rule out mechanisms other than social isolation and political sentiment.</p>
<p>The paper, “<a href="https://doi.org/10.1371/journal.pone.0334507">The health costs of losing political representation: Evidence from U.S. Presidential Elections,</a>” was authored by Sris Chatterjee, Iftekhar Hasan, and Stefano Manfredonia.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/genetic-testing-might-help-doctors-avoid-antidepressants-with-negative-interactions/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Genetic testing might help doctors avoid antidepressants with negative interactions</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 12th 2026, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A large-scale clinical trial conducted within the Department of Veterans Affairs indicates that analyzing a patient’s genetic makeup can assist medical providers in avoiding antidepressants that may be difficult for the body to process. Patients who underwent this pharmacogenomic testing were more likely to be prescribed medications with fewer predicted negative interactions. </p>
<p>Additionally, these patients experienced a modest but statistically significant improvement in the remission of their depression symptoms compared to those receiving standard care. The findings of this research were published in the <em><a href="https://jamanetwork.com/journals/jama/fullarticle/2794053" target="_blank">Journal of the American Medical Association</a></em>.</p>
<p>Major depressive disorder is a pervasive and debilitating health condition that affects millions of adults. Symptoms often include persistent sadness, loss of interest in activities, insomnia, changes in appetite, and in severe cases, thoughts of suicide. Finding the right medication to manage these symptoms is frequently a challenge. </p>
<p>Current clinical practice largely relies on a trial-and-error approach, where a patient tries a medication for several weeks to see if it works. If the first drug fails or causes intolerable side effects, the patient and doctor must start over with a new prescription.</p>
<p>This iterative process can be discouraging and prolongs the period of suffering for the patient. Consequently, medical science has sought ways to personalize this process. Pharmacogenomics is the study of how a person’s genes affect their body’s response to drugs. </p>
<p>While this type of testing is increasingly common in treating cancer and heart disease, its application in psychiatry has been a subject of ongoing debate. The goal is to use a patient’s genetic profile to predict how they will metabolize specific drugs, thereby reducing the guesswork involved in prescribing.</p>
<p>The researchers behind this study sought to determine if providing clinicians with immediate access to pharmacogenomic data would lead to better medication choices and improved patient outcomes in a real-world setting. Previous research on this topic has been limited or produced mixed results, leaving providers unsure about the clinical utility of these tests. </p>
<p>The study team aimed to move beyond theoretical benefits and assess whether this technology actually helps patients recover from depression more effectively than usual care.</p>
<p>“Treating Veterans and other patients with major depressive disorder can prove challenging. There are many medications a physician can choose from, but patients respond differently to these medicines,” said David Oslin, director of VA’s VISN 4 Mental Illness Research, Education, and Clinical Center (MIRECC), who led the study.</p>
<p>“Achieving a remission can take months as clinicians use a trial-and-error process to identify an effective medication. We need a better way of targeting treatments. There is a lot of promise in how genetics might help in selecting medications. Genetic tests are commercially available, but there was only limited evidence on how they would work in clinical practice. Our research aimed to change this.”</p>
<p>To investigate this, the research team recruited nearly 2,000 veterans diagnosed with major depressive disorder. The study took place across 22 Veterans Affairs medical centers, ensuring a diverse range of clinical settings. </p>
<p>The participants were patients who were either initiating a new treatment for depression or switching medications due to lack of success with a previous drug. The researchers employed a randomized method to divide the participants into two groups.</p>
<p>The first group, referred to as the pharmacogenomic-guided group, received genetic testing immediately. Their doctors were given the results to help inform their prescribing decisions. The second group served as the control and received usual care. </p>
<p>These patients also underwent genetic testing, but the results were not shared with their providers for 24 weeks. This design allowed the researchers to compare the outcomes of genetically informed prescribing against standard clinical judgment.</p>
<p>The testing process itself was non-invasive. Patients provided a DNA sample using a simple cheek swab. The researchers utilized a commercial genetic test panel that analyzes variants in genes encoding cytochrome P450 enzymes. These liver enzymes are responsible for metabolizing many common medications. </p>
<p>The test results categorized antidepressants based on how the patient’s specific genetic profile would likely interact with them. Medications were labeled as having no predicted interaction, moderate interaction, or substantial interaction.</p>
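<p>To make that three-category output concrete, here is a purely hypothetical sketch of what such a report might look like for a single patient; the phenotype, drug names, and pairings below are illustrative assumptions, not the commercial panel’s actual rules:</p>
<pre><code># Purely hypothetical illustration of the three-category output.
# Real panels use proprietary rules; these pairings are assumptions.

patient_phenotype = "CYP2D6 poor metabolizer"  # hypothetical result

# Hypothetical mapping from drug to predicted interaction category
# for this one patient:
report = {
    "drug_A": "substantial interaction",   # heavily CYP2D6-dependent
    "drug_B": "moderate interaction",      # partially CYP2D6-dependent
    "drug_C": "no predicted interaction",  # cleared by other enzymes
}

print(f"Patient phenotype: {patient_phenotype}")
for drug, category in report.items():
    print(f"  {drug}: {category}")
</code></pre>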
<p>The results of the study provided evidence that access to genetic information altered how doctors prescribed medications. In the group with access to test results, there was a marked shift away from drugs that had predicted negative interactions. </p>
<p>Specifically, 59 percent of patients in the guided group received a medication with no predicted drug-gene interaction. In contrast, only 26 percent of patients in the usual care group received a medication with no predicted interaction.</p>
<p>The researchers also observed that patients in the guided group were far less likely to be prescribed a drug with a “substantial” interaction risk. This suggests that without the genetic data, clinicians frequently prescribe medications that a patient’s body may struggle to process efficiently. The study highlights that having this biological information empowers providers to make more precise decisions regarding dosage and drug selection.</p>
<p>Regarding clinical improvement, the study measured outcomes using standard depression severity scales over a period of 24 weeks. The researchers found that the group receiving genetically guided care showed better rates of symptom remission and response. </p>
<p>The benefit was most notable during the earlier phases of treatment, specifically at the 8-week and 12-week check-ins. This indicates that using genetic insights may speed up the process of finding an effective treatment.</p>
<p>“Pharmacogenomics, understanding how a person’s genetics affect the response to medications, can assist clinicians in getting Veterans the care they need sooner than through trial and error,” Oslin told PsyPost. “Genetic testing can identify a small number of people for whom selecting an alternative antidepressant will lead to faster treatment outcomes.”</p>
<p>However, the difference between the two groups narrowed by the end of the six-month study period. At the 24-week mark, the statistical difference in remission rates was no longer significant.</p>
<p>This convergence suggests that the usual care group eventually began to catch up, likely because their doctors adjusted medications based on the patient’s clinical response over time. The genetic testing appeared to act as a shortcut, helping patients reach a better therapeutic state faster than they would have through standard trial and error.</p>
<p>Oslin noted that the results were not a “slam dunk” for every single patient but offered a clear benefit for some. He pointed out that only about 15 to 20 percent of patients possess the specific genetic variants that would significantly interfere with standard medications. </p>
<p>For the remaining majority, the test might not prompt a change in prescription. Nevertheless, for the minority of patients with these variants, avoiding a mismatched drug can be quite meaningful.</p>
<p>“A relatively small number of patients with major depressive disorder will benefit from genetic testing,” Oslin explained. “But when results indicate an alternative medicine for a patient is better suited, the effect on that patient can be substantial. Because the cost of testing is low and tests have to be done just once in a patient’s lifetime, we believe the benefits outweigh the costs.”</p>
<p>The study also revealed that patients suffering from post-traumatic stress disorder (PTSD) alongside their depression had a harder time achieving remission. This was true regardless of whether they received genetic testing or not. </p>
<p>The presence of PTSD appeared to be a strong factor in treatment resistance, suggesting that comorbidities play a significant role in how well a patient responds to antidepressant therapy.</p>
<p>There are some caveats to how these results should be interpreted. A common misunderstanding is that pharmacogenomic testing tells a doctor which drug will cure the patient’s depression. “The tests don’t do that,” Oslin noted. “Instead, they tell providers about the metabolism of the medication itself, not about the patient’s depression or anxiety.”</p>
<p>In other words, pharmacogenomic testing can indicate if a patient will break down a medication too quickly, rendering it ineffective, or too slowly, causing it to build up to toxic levels.</p>
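<p>A toy pharmacokinetic model can illustrate why metabolizer status matters. The one-compartment model and half-life values below are hypothetical assumptions for illustration only; the trial’s test panel reports interaction categories rather than simulating drug levels:</p>
<pre><code>import math

# Toy one-compartment model with first-order elimination. All values
# are hypothetical; the actual test reports interaction categories
# rather than simulating drug concentrations.

def remaining(dose_mg, half_life_h, hours):
    """Drug amount remaining after `hours`, given an elimination
    half-life of `half_life_h` hours."""
    k = math.log(2) / half_life_h  # first-order elimination constant
    return dose_mg * math.exp(-k * hours)

dose = 20.0  # mg, hypothetical single dose
for label, t_half in [("rapid metabolizer", 6),
                      ("normal metabolizer", 12),
                      ("poor metabolizer", 36)]:
    print(f"{label}: {remaining(dose, t_half, 24):.1f} mg left after 24 h")

# Rapid clearance can leave too little drug to be effective, while
# slow clearance lets repeated doses accumulate toward toxic levels.
</code></pre>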
<p>The authors also acknowledged limitations in the study design. The trial was not blinded, meaning both the doctors and the patients knew who had access to the test results. This awareness introduces the possibility of a placebo effect, where patients feel better simply because they know they are receiving a technologically advanced, personalized treatment.</p>
<p>Despite these limitations, the study suggests that pharmacogenomic testing carries a low risk and offers potential benefits. The burden on the patient is minimal, involving only a cheek swab, and the cost is relatively low considering the results remain valid for the patient’s lifetime. </p>
<p>Future research will likely focus on identifying which specific subgroups of patients stand to gain the most from this testing, allowing for even more targeted application of the technology.</p>
<p>“Our work reinforced once again the critical role of continuing professional education and ongoing training particularly as new techniques emerge,” Oslin said. “Bringing the insights of research into clinical practice entails working closely with providers to ensure they understand the results and implications of pharmacogenomic testing as they seek to care for Veterans struggling with major depressive disorder.”</p>
<p>The study, “<a href="https://jamanetwork.com/journals/jama/fullarticle/2794053" target="_blank">Effect of Pharmacogenomic Testing for Drug-Gene Interactions on Medication Selection and Remission of Symptoms in Major Depressive Disorder: The PRIME Care Randomized Clinical Trial</a>,” was authored by David W. Oslin, Kevin G. Lynch, Mei-Chiung Shih, Erin P. Ingram, Laura O. Wray, Sara R. Chapman, Henry R. Kranzler, Joel Gelernter, Jeffrey M. Pyne, Annjanette Stone, Scott L. DuVall, Lisa Soleymani Lehmann, and Michael E. Thase.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>