<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/psychotic-delusions-are-evolving-to-incorporate-smartphones-and-social-media-algorithms/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Psychotic delusions are evolving to incorporate smartphones and social media algorithms</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 30th 2025, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research analyzing medical records from a psychiatric treatment program suggests that the content of psychotic delusions is increasingly incorporating themes related to the internet and modern technology. The study provides evidence that the prevalence of these technology-focused false beliefs rose significantly over an eight-year period. The findings were published in <em><a href="https://doi.org/10.1192/bjp.2025.10452" target="_blank" rel="noopener">The British Journal of Psychiatry</a>.</em></p>
<p>Psychiatry has long distinguished between the form of a delusion and its specific content. The form refers to the structural category of the belief, such as paranoia, grandiosity, or the sense that one is being controlled by external forces.</p>
<p>Research indicates that these forms tend to remain relatively consistent across different cultures and historical eras. However, the specific narratives that fill these forms are often shaped by the surrounding environment and the sociopolitical climate of the time.</p>
<p>Historical data illustrates this adaptability of the human mind during psychosis. In the early 20th century, patients frequently harbored delusions regarding syphilis. During the Second World War, fears often centered on enemy soldiers. By the Cold War, the thematic content shifted toward spies, communists, and nuclear threats.</p>
<p>As technology evolved, so did the explanations for strange experiences. The widespread adoption of radio and television saw a rise in patients believing these devices were transmitting thoughts or controlling their actions.</p>
<p>The rapid advancement of digital technology in the last three decades has created a new landscape for human interaction. The internet, smartphones, and social media have fundamentally altered how individuals perceive space, privacy, and communication.</p>
<p>Given this shift, the authors of the current study sought to examine the extent to which these modern tools have infiltrated the delusional frameworks of patients today. They aimed to quantify the prevalence of such beliefs and determine if they are becoming more frequent as technology becomes more ubiquitous.</p>
<p>“For many years I have worked closely with patients with psychotic disorders, and over time I came to appreciate more and more the extent to which technology was incorporated into delusional frameworks,” said study author Alaina Vandervoort Burns, an assistant clinical professor at the UCLA-Semel Institute for Neuroscience and Human Behavior.</p>
<p>“In order to comprehensively evaluate my patients’ delusions I had to ask specific questions about technology. I realized that education around the evaluation of delusional thought content was not up to date, and I hope to increase psychiatrists’ awareness of how common technology delusions are so we can properly evaluate and treat our patients.”</p>
<p>“Additionally, given how rapidly technology has advanced, things that seemed impossible just a few decades ago, or even a few years ago, are now possible. This has made it harder to determine what is ‘delusional,’ as my patients often describe things that are very much on the blurred line of what is considered reality-based and what is considered psychotic. To me it’s just so interesting to think about.”</p>
<p>To investigate this, the researchers focused on a specific cohort of adults. They utilized data from the Thought Disorders Intensive Outpatient Program at the University of California, Los Angeles.</p>
<p>This program serves adults with psychotic disorders, primarily schizophrenia and schizoaffective disorder. The participants in this sample were generally stable enough to attend group therapy and were not actively using drugs or alcohol at the time of treatment.</p>
<p>The team conducted a retrospective review of medical records for 228 patients who were enrolled in the program between December 2016 and May 2024. They manually examined initial psychiatric assessments and weekly progress notes to identify descriptions of delusional thought content. Using qualitative analysis software, the researchers categorized these delusions into standard subtypes. They also specifically coded for any mention of new technologies.</p>
<p>The definition of technology delusions in this study was broad. It included references to the internet, Wi-Fi networks, and mobile devices. It also encompassed beliefs about hacking, surveillance through hidden electronics, and social media interactions. Additionally, the researchers looked for instances of “The Truman Show” delusion, where a person believes their life is being filmed and broadcast for entertainment.</p>
<p>The analysis revealed that delusional thinking was a prominent feature in this group. Approximately 88 percent of the subjects reported experiencing delusions during their treatment. Among those who experienced delusions, slightly more than half incorporated technology into their beliefs. This suggests that digital themes have become a major component of modern psychosis.</p>
<p>The most frequent technological theme involved the compromise of personal devices. Forty subjects expressed the belief that their computers, phones, or internet connections had been hacked.</p>
<p>The specific manifestations varied. One patient believed spyware had been installed on his phone. Another felt that static on a phone line was evidence of someone listening to her conversations. Paranoia regarding Wi-Fi routers was also observed, with some patients believing neighbors were tampering with their internet connections.</p>
<p>Social media platforms featured in the delusions of about one-quarter of the group. Instagram was the most commonly cited platform, followed by YouTube, Facebook, and X (formerly Twitter).</p>
<p>Patients described a variety of referential beliefs. Some felt that posts on these platforms contained encoded messages meant specifically for them. Others believed they were communicating directly with celebrities through these apps. One subject reported that YouTube videos would appear with titles matching their exact thoughts.</p>
<p>Surveillance through hidden equipment was another common source of distress. Twenty-one subjects believed that cameras or microphones were concealed in their environment. These fears often extended to the structure of their homes, with patients suspecting devices were behind walls or in ceilings. Some participants believed that microchips or tracking devices had been implanted in their bodies.</p>
<p>The researchers also identified eleven subjects who experienced “The Truman Show” delusion. These individuals believed they were the central characters in a staged reality. One man believed his parents had replaced the lamps in his home with cameras. Another described feeling like a virtual pet in an aquarium, constantly observed by an outside audience.</p>
<p>To measure trends over time, the researchers utilized a binary logistic regression analysis. This statistical method allowed them to determine if the year of admission predicted the presence of technology delusions.</p>
<p>The results showed a significant positive association: for every one-year increase in admission date, the odds of a patient presenting with technology-related delusions increased by approximately 15 percent.</p>
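<p>As a rough sketch of how such an analysis works (not the authors' actual code, and using simulated rather than patient data), a binary logistic regression of this kind can be fit in Python with the statsmodels package; exponentiating the year coefficient gives the odds ratio, which the study reports as roughly 1.15 per admission year.</p>
<pre>
# Hypothetical sketch of the reported analysis: predicting whether a patient
# presents with a technology delusion (0/1) from year of admission.
# The data below are simulated, not the study's medical records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
years_since_2016 = rng.integers(0, 9, size=228)      # admission year, coded 0-8
true_logit = -0.6 + 0.14 * years_since_2016          # assumed underlying trend
tech_delusion = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

X = sm.add_constant(years_since_2016)
fit = sm.Logit(tech_delusion, X).fit(disp=False)

odds_ratio_per_year = np.exp(fit.params[1])          # ~1.15 would mean +15% odds per year
print(f"Estimated odds ratio per admission year: {odds_ratio_per_year:.2f}")
</pre>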
<p>“The content of delusional thoughts changes with the times, and technology delusions have become more frequent,” Burns told PsyPost. “Social media in particular can be tricky to navigate for someone who is struggling with psychosis.”</p>
<p>The study also examined whether demographic factors influenced these symptoms. The researchers looked at gender, age, education level, and history of substance use.</p>
<p>The analysis found no significant association between these variables and the presence of technology delusions. This finding contrasts with the initial hypothesis that younger people, who are often considered digital natives, would be more likely to experience these themes. The data suggests that technology delusions are pervasive across different age groups within this population.</p>
<p>“I was surprised that younger people did not have a higher likelihood of experiencing technology delusions,” Burns said. “In our study, age was not significantly associated with the presence of a technology delusion.”</p>
<p>The authors noted that the distinction between reality and delusion can be complicated by the actual capabilities of modern technology. Real-world algorithms do track user behavior to serve targeted content. This can sometimes mimic the experience of having one’s mind read or being watched.</p>
<p>For individuals with psychosis, these legitimate privacy concerns can spiral into fixed, false beliefs. The study highlights that beliefs once considered bizarre, such as being monitored through a phone, are now technically feasible.</p>
<p>As with all research, there are limitations. The research relied on retrospective data extracted from medical notes rather than standardized interviews. This means that the recorded incidence of these symptoms depends on what clinicians chose to document. It is possible that treatment teams simply became more likely to ask about technology in later years.</p>
<p>Additionally, the sample consisted of individuals who were housed and insured, which may not represent the full spectrum of people living with psychotic disorders.</p>
<p>Future research in this area is expected to address the emergence of artificial intelligence. As the boundary between the digital and physical worlds continues to blur, the content of delusions will likely continue to evolve.</p>
<p>“The data for this study was collected before artificial intelligence was widely available, and I think it’s going to be fascinating to see how AI interfaces with psychosis moving forward,” Burns said. “I imagine we will be seeing a lot of patients seeking treatment for AI-associated psychosis.”</p>
<p>The study, “<a href="https://doi.org/10.1192/bjp.2025.10452" target="_blank" rel="noopener">‘The algorithm is hacked’: analysis of technology delusions in a modern-day cohort</a>,” was authored by Alaina V. Burns, Kyle Nelson, Haley Wang, Erin M. Hegarty and Alexander B. Cohn.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-high-fat-diet-severs-the-chemical-link-between-gut-and-brain/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A high-fat diet severs the chemical link between gut and brain</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 29th 2025, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new comprehensive analysis reveals that chronic consumption of fat-rich foods triggers a specific chemical imbalance that disrupts communication between the digestive system and the brain. Published in <em><a href="https://doi.org/10.1080/1028415x.2025.2539320" target="_blank">Nutritional Neuroscience</a></em>, the study details how these dietary habits elevate serotonin levels in the gut while paradoxically depleting this vital chemical in brain regions responsible for mood and memory. This biological disconnection provides a potential explanation for the link between obesity, depression, and cognitive decline.</p>
<p>Serotonin functions as a chemical messenger with distinct roles depending on its location in the body. Roughly ninety-five percent of this molecule resides in the gastrointestinal tract, where it manages digestion and blood flow. The remaining small fraction operates within the central nervous system to regulate appetite, emotions, and learning. These two systems maintain a constant dialogue through a network known as the gut-brain axis.</p>
<p>The researchers sought to understand the specific biological pathways that degrade this communication during periods of poor nutrition. While previous observations linked greasy foods to health issues, the molecular steps connecting what we eat to how we feel remained unclear. </p>
<p>To clarify these mechanisms, a team led by Taylor Gray and Jian Han at North Carolina Agricultural and Technical State University collaborated with researchers from Brown University and Cornell University. They examined a wide breadth of existing literature to map the chemical trajectory of serotonin under dietary stress.</p>
<p>The investigation begins in the digestive tract, where specialized cells called enterochromaffin cells manufacture the vast majority of the body’s serotonin. The researchers report that a high-fat diet forces these cells to overproduce the chemical. This occurs because the diet stimulates the enzymes responsible for initiating synthesis.</p>
<p>Simultaneously, the cellular machinery designed to recycle serotonin malfunctions. Under normal circumstances, a transporter protein acts like a vacuum to clear used serotonin from the system. The study indicates that fatty foods suppress the production of this transporter. This double blow of increased production and decreased cleanup causes serotonin to accumulate rapidly in the gut.</p>
<p>This local surplus creates a toxic environment in the digestive system. The review explains that excess serotonin stimulates immune cells to release inflammatory signals. It also compromises the lining of the intestines. This loss of integrity leads to permeability issues often described as a “leaky gut,” allowing harmful substances to enter the bloodstream.</p>
<p>While the gut is flooded with serotonin, the situation in the brain presents a stark and problematic contrast. The authors detail how fatty foods deprive the hippocampus of necessary serotonin. This brain region governs memory formation and emotional stability.</p>
<p>The depletion in the hippocampus occurs through a different mechanism than in the gut. A high-fat diet appears to accelerate the activity of an enzyme called monoamine oxidase A. This enzyme acts as a waste disposal unit that breaks down neurotransmitters. When it becomes overactive, it destroys serotonin before the brain can utilize it for stabilizing mood or encoding memories.</p>
<p>Similar shortages appear in the hypothalamus, the brain’s control center for hunger and metabolism. Under normal conditions, serotonin helps signal when the body is full. The review explains that a high-fat diet disrupts the receptors that receive these satiety signals.</p>
<p>The study highlights that specific receptors in the hypothalamus, particularly the 5-HT1A subtype, become more abundant but less effective in their signaling roles. This alteration dampens the cellular pathways usually activated by serotonin. The result is a weakened ability to regulate energy balance and interpret fullness. This chemical blockage creates a cycle of overeating and metabolic dysfunction.</p>
<p>One of the most complex findings involves the raphe nuclei, a cluster of neurons that acts as the brain’s primary serotonin factory. The researchers found that a high-fat diet actually increases the capacity for serotonin synthesis in this specific area. This finding would seem to contradict the low levels found elsewhere in the brain.</p>
<p>However, the authors describe a “bottleneck” effect that negates this increased production. The diet triggers autoreceptors on the surface of these neurons that act like a shut-off valve. When these sensors detect the rising serotonin production locally, they inhibit the neurons from firing.</p>
<p>This inhibition prevents the release of serotonin to downstream targets. Consequently, even though the raphe nuclei are producing plenty of serotonin, the delivery trucks are effectively blocked from leaving the warehouse. This results in the observed deficits in the hippocampus and hypothalamus.</p>
<p>The authors identify the gut microbiome as the likely mediator of this widespread dysfunction. Healthy bacteria ferment dietary fiber to produce short-chain fatty acids. These fatty acids usually protect the brain and help regulate the enzymes involved in serotonin production.</p>
<p>A diet rich in fat typically lacks fiber, which starves these beneficial bacteria. The subsequent drop in short-chain fatty acids removes a critical layer of neuroprotection. The study notes that acetate and butyrate, two specific fatty acids, are essential for maintaining the proper sensitivity of serotonin receptors.</p>
<p>The loss of beneficial bacteria also contributes to systemic inflammation. The bacterial imbalance triggers an immune response that releases molecules called cytokines. These inflammatory messengers travel through the blood and can penetrate the protective barrier of the brain.</p>
<p>Once inside the central nervous system, cytokines hijack the chemical assembly line that typically produces serotonin. They activate an enzyme that diverts tryptophan, the raw material for serotonin, down a different metabolic path. Instead of creating the mood-regulating chemical, the brain is forced to produce compounds that can damage neurons.</p>
<p>This process, known as the kynurenine pathway, further depletes the available resources for serotonin synthesis. The combination of diverted raw materials and blocked release pathways creates a profound deficit in central serotonin. This deficit manifests as the behavioral and cognitive issues often associated with poor diet and obesity.</p>
<p>The study also points to hormonal disruptions that exacerbate this cycle. Hormones such as leptin and ghrelin normally work in concert with serotonin to manage appetite. A high-fat diet alters the levels of these hormones, creating a feedback loop that further suppresses serotonin signaling.</p>
<p>Cortisol, the body’s primary stress hormone, also plays a significant role in this cascade. The researchers note that high-fat diets elevate circulating cortisol levels. This hormone can cross into the brain and directly increase the activity of the enzymes that break down serotonin.</p>
<p>The cumulative effect of these changes is a system where the gut is inflamed and overactive, while the brain is starved of chemical regulation. The authors suggest that this imbalance is not merely a symptom of obesity but a driving factor in its persistence. The loss of serotonin-mediated satiety control makes it increasingly difficult to stop overeating.</p>
<p>Simultaneously, the reduction in hippocampal serotonin compromises mental resilience. This leaves the individual more vulnerable to stress and depression. These emotional states often drive further comfort eating, reinforcing the dietary habits that cause the damage.</p>
<p>The authors note that much of the current understanding relies on data from rodent models. While these animal studies provide essential insights into molecular pathways, human biology may respond with variations. Future clinical research must verify if these specific receptor changes occur identically in people.</p>
<p>The review suggests that restoring balance to the gut microbiome offers a promising avenue for treatment. Replenishing short-chain fatty acids could potentially bypass the damage caused by dietary fat. Strategies to reduce inflammation might also help unlock the “bottleneck” in the raphe nuclei.</p>
<p>The researchers emphasize that understanding these specific pathways is the first step toward new therapies. By mapping the precise receptors and enzymes involved, scientists can develop targeted interventions. These treatments could eventually help manage the mood disorders and cognitive impairments that frequently accompany metabolic disease.</p>
<p>The study, “<a href="https://doi.org/10.1080/1028415x.2025.2539320" target="_blank">Exploring the impact of a high-fat diet on the serotonin signaling in gut-brain axis</a>,” was authored by Taylor Gray, Yewande O. Fasina, Scott H. Harrison, Evelyn M. Chang, Alex Y. Chang, Antoinette Maldonado-Devincci and Jian Han.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/oxytocin-boosts-creativity-but-only-for-approach-oriented-people/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Oxytocin boosts creativity, but only for approach-oriented people</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 29th 2025, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A new study published in <em><a href="https://doi.org/10.1093/scan/nsaf004" target="_blank">Social Cognitive and Affective Neuroscience</a></em> has found that oxytocin, often called the “love hormone,” can make people more creative—but only if they are naturally inclined to seek out rewards rather than avoid risks.</p>
<p>The ability to generate new and useful ideas, known as creativity, is vital for problem-solving and innovation. Scientists have long known that oxytocin plays a role in trust and social behavior, but its impact on higher-level thinking has remained unclear. Earlier research suggested oxytocin might encourage flexible thinking, but results were inconsistent, possibly because individual differences—especially regarding personality—were overlooked.</p>
<p>Led by Chen Yang, a team from Central China Normal University focused on two motivational styles: approach-oriented individuals, who are driven by rewards and opportunities, and avoidance-oriented individuals, who prioritize safety to minimize mistakes and avoid punishment. These tendencies shape how people think and respond to challenges.</p>
<p>The researchers conducted two experiments involving over 120 male college students. In the first, participants were grouped based on their natural motivation style. In the second, motivation was temporarily induced through a memory exercise. All participants completed a creativity test called the Alternative Uses Task, which asks people to think of unusual uses for everyday objects like a spoon or umbrella. Each person received either a nasal spray of oxytocin or a placebo.</p>
<p>The results were striking. Oxytocin significantly boosted creativity—measured by originality (novelty) and flexibility of ideas—in approach-oriented participants. Those who were avoidance-oriented showed no improvement.</p>
<p>Yang and colleagues also scrutinized brain network activity and discovered that the Default Mode Network and the Executive Control Network—which are important for creativity—worked together much more actively in the approach-oriented group.</p>
<p>“In contrast, avoidance-oriented individuals displayed no significant alterations in connectivity patterns,” the authors added.</p>
<p>Brain scan analyses also revealed more efficient brain communication in the approach-oriented group following oxytocin administration. The researchers noted “[an] increase in global efficiency and a concurrent reduction in the shortest path length within their brain networks, while the avoidance group displayed no substantial alterations.”</p>
<p>“These alterations effectively shortened information conduction pathways within networks, expedited transmission rates, and enhanced the efficiency of processing and integrating semantic concepts, which was conducive to the generation of creative thinking.”</p>
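<p>For readers unfamiliar with these network measures, the toy example below (written with the general-purpose networkx graph library, not the study's neuroimaging pipeline) illustrates what "global efficiency" and "shortest path length" capture: adding shortcut connections to a network shortens the average route between nodes and raises efficiency.</p>
<pre>
# Toy illustration of the graph metrics quoted above, using abstract networks.
# Brain-network analyses derive their graphs from fMRI connectivity matrices,
# but the metrics themselves are defined the same way.
import networkx as nx

ring = nx.watts_strogatz_graph(n=90, k=4, p=0.0, seed=1)                 # ring lattice, no shortcuts
shortcuts = nx.connected_watts_strogatz_graph(n=90, k=4, p=0.3, seed=1)  # some rewired long-range links

for name, g in [("no shortcuts", ring), ("with shortcuts", shortcuts)]:
    eff = nx.global_efficiency(g)              # higher = information spreads more easily
    spl = nx.average_shortest_path_length(g)   # lower = shorter paths between node pairs
    print(f"{name}: global efficiency = {eff:.3f}, mean shortest path = {spl:.2f}")
</pre>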
<p>Interestingly, the hormone did not affect mood, ruling out the possibility that happier feelings drove the creative boost. Instead, oxytocin seemed to enhance cognitive flexibility—the ability to shift perspectives and think broadly—especially in those already inclined toward exploration.</p>
<p>Notably, the research has limits. It only included men, used a single dose of oxytocin, and measured creativity through one type of task.</p>
<p>The study, “<a href="https://doi.org/10.1093/scan/nsaf004" target="_blank">Oxytocin enhances creativity specifically in approach-motivated individuals</a>,” was authored by Chen Yang, Zhaoyang Guo, and Liang Cheng.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/brain-folding-patterns-may-predict-adhd-treatment-success-in-adults/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Brain folding patterns may predict ADHD treatment success in adults</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 29th 2025, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>An experimental study of individuals with ADHD revealed that those with increased gyrification in frontal cortical regions of the brain generally responded better to treatment. The treatments combined either group psychotherapy or clinical management of symptoms with either methylphenidate or a placebo.</p>
<p>However, neither group psychotherapy nor methylphenidate was more effective overall than the control conditions. The paper was published in <a href="https://doi.org/10.1038/s41398-025-03681-0"><em>Translational Psychiatry</em></a>.</p>
<p>Cortical gyrification is the process by which the brain’s cerebral cortex folds into ridges (gyri) and grooves (sulci) during development, increasing its surface area without greatly expanding skull size.</p>
<p>These folds allow a much larger number of neurons and connections to fit within the limited space of the cranium. Gyrification begins prenatally and continues into early childhood as different brain regions mature at different rates.</p>
<p>Higher degrees of gyrification are generally associated with increased cognitive capacity because more cortical surface supports more complex neural processing.</p>
<p>However, the pattern of folds, not just their number, is important for efficient connectivity between brain regions. Abnormal gyrification—either too little or too much—has been linked to neurodevelopmental conditions such as autism, schizophrenia, and certain genetic disorders.</p>
<p>Study author Jonathan Laatsch and his colleagues wanted to explore whether the effects of treatment for attention-deficit/hyperactivity disorder (ADHD) symptoms depend on the degree of cortical gyrification in adults suffering from this disorder. Their general expectation was that individuals with higher levels of cortical gyrification would show a stronger response to treatments (i.e., stronger reductions in symptoms).</p>
<p>They conducted an experimental study (randomized controlled trial). While the parent study included 419 adults suffering from ADHD, the final sample for this specific neuroimaging analysis consisted of 121 participants. Their ages ranged between 19 and 58 years, with the average being 35 years. The number of males and females was roughly equal.</p>
<p>Study participants were randomly divided into four groups for different 12-week treatments. The first group underwent group psychotherapy while taking methylphenidate. The second group also underwent group psychotherapy but received a placebo (capsules identical in appearance to the methylphenidate capsules but containing no active ingredient). The third group underwent clinical management of symptoms and received methylphenidate, while the fourth group received clinical management and a placebo.</p>
<p>Group psychotherapy consisted of weekly sessions for 12 weeks, followed by 10 additional monthly sessions. Clinical management was an active control condition simulating routine psychiatric care, consisting of nondirective supportive counseling delivered in individual sessions lasting 15–20 minutes.</p>
<p>Methylphenidate is a stimulant medication used to improve attention, focus, and impulse control. It is commonly prescribed for ADHD. For participants receiving this medication, the dose was gradually increased over 6 weeks until 60 mg/day was reached. Participants did not know whether they were receiving methylphenidate or placebo, but were aware whether they were receiving group psychotherapy or clinical management.</p>
<p>Participants underwent magnetic resonance imaging (MRI) of their brains, allowing study authors to calculate the level of cortical gyrification. ADHD symptoms were rated by the researchers using the Conners’ Adult ADHD Rating Scale.</p>
<p>Results showed that clinical management was better than group psychotherapy in reducing the total number of ADHD symptoms. However, in participants undergoing group psychotherapy, lower gyrification in the right precuneus and the paracentral gyrus was associated with lower levels of inattention (one of the symptoms of ADHD) after treatment. Conversely, across the whole sample, higher gyrification was associated with stronger overall symptom reduction.</p>
<p>Similarly, among participants receiving methylphenidate, individuals with lower gyrification in the left rostral middle frontal gyrus tended to have lower levels of hyperactivity. However, methylphenidate was not more effective than placebo in reducing ADHD symptoms.</p>
<p>“Results revealed significant positive region-specific associations between cortical gyrification and treatment response across three symptom dimensions, with significant associations localized predominantly in frontal regions of the left hemisphere. Our findings emphasize that increased cortical gyrification in frontal cortical regions signifies enhanced treatment efficacy following a 12-week intervention,” the study authors concluded.</p>
<p>The study contributes to the scientific knowledge about methods for treating ADHD. However, neither of the main treatments used in this study (group psychotherapy and methylphenidate medication) was more effective than the corresponding control conditions.</p>
<p>Additionally, participants were aware whether they were undergoing group psychotherapy or the clinical management condition, leaving room for the Hawthorne effect to have affected the results. The Hawthorne effect happens when study participants change their behavior because they know they are being observed and that they are participating in a study.</p>
<p>The paper, “<a href="https://doi.org/10.1038/s41398-025-03681-0">Cortical gyrification predicts initial treatment response in adults with ADHD,</a>” was authored by Jonathan Laatsch, Frederike Stein, Simon Maier, Swantje Matthies, Esther Sobanski, Barbara Alm, Ludger Tebartz van Elst, Axel Krug, and Alexandra Philipsen.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/most-children-identified-as-gifted-at-age-7-do-not-maintain-high-cognitive-ability-by-adolescence/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Most children identified as gifted at age 7 do not maintain high cognitive ability by adolescence</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Nov 29th 2025, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research indicates that identifying children as having high cognitive ability at a young age is often an unreliable predictor of their future intellectual performance. The study found that intelligence scores fluctuate significantly during childhood and adolescence, suggesting that stable patterns of cognitive ability typically do not emerge until approximately age 12. These findings were published in the journal <em><a href="https://icajournal.scholasticahq.com/article/144062-developmental-changes-in-high-cognitive-ability-children-the-role-of-nature-and-nurture" target="_blank">Intelligence & Cognitive Abilities</a></em>.</p>
<p>The study was conducted by <a href="https://www.linkedin.com/in/angel-blanch-69815a124/" target="_blank">Angel Blanch</a>, <a href="https://sescorial.wordpress.com/blog/" target="_blank">Sergio Escorial</a>, and <a href="https://sites.google.com/site/colomresearch/Home" target="_blank">Roberto Colom</a>. These researchers are professors affiliated with the Universitat de Lleida, the Universidad Complutense de Madrid, and the Universidad Autónoma de Madrid. They sought to address a persistent question regarding the stability of human intelligence during the first two decades of life.</p>
<p>Previous research indicates that general cognitive ability becomes highly stable as individuals reach adulthood, but stability values tend to be much lower during early childhood.</p>
<p>The researchers aimed to determine which specific factors distinguish children who maintain early high scores from those whose scores decline over time. They also investigated whether environmental factors or genetic predispositions play a larger role in these developmental shifts.</p>
<p>“We sought to shed light on the issue of whether it is reliable to identify high cognitive ability children at premature stages in development. Given that <a href="https://psycnet.apa.org/doiLanding?doi=10.1037%2Fa0035893" target="_blank">measures show relatively low reliability at early ages</a>, we expected remarkable cognitive/intellectual changes across childhood and early adolescence,” the researchers told PsyPost.</p>
<p>For their study, the researchers analyzed data from <a href="https://www.kcl.ac.uk/research/teds-study" target="_blank">the Twins Early Development Study</a>. This is a large longitudinal project based in the United Kingdom that tracks the development of twins from birth through early adulthood. From the larger cohort, the researchers selected a dataset comprising 11,119 individuals to observe trajectories of cognitive development.</p>
<p>The researchers focused on two specific groups based on standardized test scores obtained at age 7. The first group consisted of 3,958 individuals with normative ability scores ranging between 99 and 115. The second group included 1,580 individuals with high ability scores exceeding 115.</p>
<p>The analysis utilized general cognitive ability scores that were collected at ages 4, 7, 12, 16, and 21. By using data points spanning 17 years, the team could observe long-term trajectories. They employed statistical techniques known as latent curve models. This method allowed them to estimate both the starting point of intelligence scores and the rate of change as the children aged.</p>
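<p>In generic form (a sketch of the standard linear specification, which may differ in detail from the model the authors fit), a latent curve model writes each child's score at each measurement occasion as an individual starting level plus an individual rate of change multiplied by a time code:</p>
<pre>
% Generic linear latent growth curve model (illustrative, not the paper's exact specification)
y_{it} = \eta_{0i} + \lambda_t \, \eta_{1i} + \varepsilon_{it}   % score of child i at occasion t
\eta_{0i} = \mu_0 + \zeta_{0i}                                   % latent intercept: starting level
\eta_{1i} = \mu_1 + \zeta_{1i}                                   % latent slope: rate of change
</pre>
<p>Predictors such as polygenic scores, socioeconomic status, or school engagement can then be entered as covariates of the intercept and slope factors, which is how a model of this kind relates them to individual differences in cognitive change.</p>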
<p>In addition to intelligence scores, the researchers examined potential predictors of change. These included measures of the home environment and behavioral problems, which were assessed at multiple time points. The researchers also incorporated time-invariant predictors such as socioeconomic status and school engagement.</p>
<p>The study also included polygenic scores in the analysis. These are genetic indicators derived from DNA analysis that estimate an individual’s genetic predisposition for certain traits. By including both environmental and genetic variables, the authors hoped to disentangle the influences of nature and nurture.</p>
<p>The analysis revealed substantial fluctuations in intelligence scores as the children aged. A majority of the participants did not maintain their relative rank within the distribution of scores. For children classified in the high ability group at age 7, only 16 percent maintained their high scores by age 16.</p>
<p>Movement occurred in the normative group as well, though it was less frequent. Approximately 8 percent of children with average scores at age 7 moved into the high ability range by age 16. The data indicates that retaining a high classification is more likely than achieving one from an average starting point, yet stability remains low for both groups during these years.</p>
<p>The researchers found that personal factors played a larger role than situational ones in predicting these changes. Polygenic scores and socioeconomic status consistently predicted cognitive trajectories. Children with higher genetic predispositions for intelligence tended to have more positive rates of change as they grew older.</p>
<p>This pattern aligns with a concept in behavioral genetics known as the Wilson effect. This effect describes how the influence of genetics on cognitive ability tends to increase with age, while the influence of the shared environment decreases. </p>
<p>Conversely, environmental factors such as home chaos or life events showed weak associations with cognitive changes in the high ability group. High ability children appeared largely unaffected by variations in their home environment or behavioral problems. They seemed to possess a level of resilience against situational stressors regarding their cognitive development.</p>
<p>“Polygenic scores (genetic potential) instead of situational factors such as home and school environments (behavior problems, home chaos, life events, etc.) predict the identified cognitive ability changes as children age,” the researchers told PsyPost. “This pattern is especially true for high cognitive ability children.”</p>
<p>For the normative group, however, the results were slightly different. Negative life events and behavioral problems were associated with decreases in cognitive scores for these children. This suggests that children with average cognitive ability may be more sensitive to environmental instability than their high-ability peers.</p>
<p>School engagement also emerged as a significant predictor. Higher levels of engagement at age 16 were associated with increases in general cognitive ability. This relationship held true for both the normative and high ability groups.</p>
<p>The researchers observed that the correlation between intelligence scores and the rate of change increased as the participants aged. At age 7, a child’s score had little relation to how much their score would change in the future. By age 21, the relationship between a person’s score and their developmental trajectory was much stronger.</p>
<p>These findings have practical implications for educational policy and the identification of gifted children. The high mobility of cognitive scores suggests that many children labeled gifted at a young age will not maintain that status.</p>
<p>“Premature identification of high cognitive ability children is not warranted,” the researchers explained. “Planned follow-ups are required because we found that most children are intellectually mobile as they age. Of those scoring one standard deviation above the mean at age 7, only a tiny minority preserve their high ability marks afterwards.”</p>
<p>There are limitations to this study that should be noted. The dataset did not include neuroimaging data, such as MRI scans. This means the researchers could not directly observe the structural brain changes that accompany these shifts in cognitive ability.</p>
<p>“The lack of brain data in the TEDS dataset precludes the analysis of the relationship between the identified cognitive ability changes and the brain changes <a href="https://pubmed.ncbi.nlm.nih.gov/30829509/" target="_blank">we already know occur until cognitive maturity</a> is achieved (16 yrs. on average),” the researchers noted. “Connecting genetic potential, brain development, and cognitive ability is crucial for a better understanding of the identified changes.”</p>
<p>Future research in this area aims to incorporate biological data. The researchers express a desire to obtain brain scans of individuals who maintained their high cognitive scores versus those who did not. Investigating structural and functional brain differences could provide a biological explanation for the cognitive mobility observed in this study.</p>
<p>“It would be great to obtain brain data of the individuals showing intellectual precocity that kept their high cognitive marks and those who lost their high scores,” the researchers said. “Do the former show better structural and functional brain features than the latter?”</p>
<p>“Available neuroscientific research findings allow stating predictions regarding potential differences at the brain level between these two cognitive ability profiles. We think getting MRI scans of these folks is highly worthwhile to enhance our current knowledge.”</p>
<p>The study, “<a href="https://icajournal.scholasticahq.com/article/144062-developmental-changes-in-high-cognitive-ability-children-the-role-of-nature-and-nurture" target="_blank">Developmental Changes in High Cognitive Ability Children: The Role of Nature and Nurture</a>,” was authored by Angel Blanch, Sergio Escorial, and Roberto Colom.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><s><small><a href="#" style="color:#ffffff;"><a href='https://blogtrottr.com/unsubscribe/565/DY9DKf'>unsubscribe from this feed</a></a></small></s></p>