<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-study-identifies-a-woke-counterpart-on-the-political-right-characterized-by-white-grievance/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New study identifies a “woke” counterpart on the political right characterized by white grievance</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 19th 2026, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research published in the <em><a href="https://doi.org/10.1111/sjop.70070" target="_blank" rel="noopener">Scandinavian Journal of Psychology</a></em> provides evidence that identity-based political attitudes, often described colloquially as “woke,” are not exclusive to the political left. The study suggests that a parallel ideology exists on the political right, characterized by a focus on white identity grievance and a desire to regulate speech in favor of conservative values. These findings indicate that while the specific contents of these belief systems differ, they share a structural similarity in how they view group dynamics and societal control.</p>
<p>Oskari Lahtinen, a senior researcher at the INVEST Research Flagship Centre at the University of Turku, conducted the new study to address a gap in political psychology. While the concept of “woke” or critical social justice attitudes has been widely discussed and measured, Lahtinen observed that a similar phenomenon appeared to be emerging among right-wing populations.</p>
<p>He noted that media outlets and cultural commentators have increasingly used terms like “right-wing woke” or “the identitarian right” to describe this trend. Lahtinen sought to create a psychometrically valid tool to measure these specific right-wing attitudes. He also aimed to refine an existing scale for measuring left-wing critical social justice attitudes to allow for a direct comparison between the two ideological extremes.</p>
<p>“For years, ‘wokeness’ has been discussed almost exclusively as a left-wing phenomenon,” Lahtinen told PsyPost. “More recently, similar identity-based and speech-regulating politics have been emerging on the right, as noted by outlets like The New York Times, The Atlantic, and The Economist. In 2024, I developed <a href="https://www.psypost.org/study-woke-attitudes-linked-to-anxiety-depression-and-a-lack-of-happiness/" target="_blank" rel="noopener">a measure for critical social justice</a> (often called ‘woke’) attitudes. In this study, I devised a psychometrically sound scale for critical right (‘woke right’) attitudes, to make it easier to empirically study current political polarization. I also revised the 2024 CSJAS scale for left-wing ‘woke’ attitudes.”</p>
<p>To conduct the research, Lahtinen recruited a sample of 626 Finnish adults through an online survey. The data collection occurred between March and June of 2025. The researcher used a convenience sampling method. To ensure enough participation from right-wing individuals, who might otherwise be underrepresented in academic surveys, the study link was distributed via the social media channels of politicians from the Finns Party and the National Coalition Party.</p>
<p>The participants completed two primary measures designed for this study. The first was the newly developed Critical Right Scale (CRS), which began as a pool of eighteen candidate items derived from right-wing political speeches, social media content, and forums.</p>
<p>Through statistical analysis, the researcher narrowed this down to a final five-item scale that showed high reliability. The items assessed beliefs regarding population replacement, the perception that society discriminates against white people, and the idea that conservative values should dictate permissible speech.</p>
<p>The second measure was the Revised Critical Social Justice Attitudes Scale (CSJAS-R). This was an updated version of a scale Lahtinen had developed in previous research. It consisted of six items designed to measure support for ideas such as structural racism, the necessity of safe spaces, and the inclusion of trans women in female sports categories. The survey also included questions about the participants’ political party affiliation, their self-placement on a left-right axis, and their views on the justification of political violence.</p>
<p>The results provided evidence that the Critical Right Scale and the Critical Social Justice Attitudes Scale measure two distinct, opposing constructs. The two scales had a strong negative correlation, meaning that individuals who scored high on one almost invariably scored low on the other. The statistical analysis showed that both scales were reliable and valid for use with both male and female participants.</p>
<p>“I was surprised by how neatly the two scales behaved psychometrically and how cleanly they divided into two constructs that share a strong negative correlation,” Lahtinen said.</p>
<p>Lahtinen found that high scores on the Critical Right Scale were strongly concentrated among voters for the Finns Party and the Christian Democrats. The specific beliefs driving these scores included the notion that a “great replacement” of the population is occurring and that a strong leader should break rules to protect national interests. These respondents also tended to agree that “regular people” know what is better for the country than experts do.</p>
<p>On the other side of the spectrum, high scores on the Critical Social Justice Attitudes Scale were found primarily among voters for the Left Alliance and the Greens. These participants endorsed views such as the idea that income disparities between white and black people are mostly due to racism. They also supported the concept that speech perceived as a “microaggression” should be actively challenged.</p>
<p>One significant finding related to the behavioral correlates of these attitudes. The study assessed whether participants believed that violence against “politically dangerous” people could be justified. The data showed that high scores on the left-wing Critical Social Justice scale had a small-to-moderate positive association with the justification of political violence. In contrast, the right-wing Critical Right Scale showed no meaningful statistical association with the approval of political violence in this specific dataset.</p>
<p>The research also explored the relationship between these political attitudes and personal well-being. Previous studies had suggested a link between critical social justice attitudes and lower mental well-being. This study found a very weak negative correlation between the left-wing scale and happiness, and no correlation between the right-wing scale and happiness.</p>
<p>The study also examined the psychological concept of “locus of control,” which refers to whether people feel they have control over their own lives or if outside forces dictate their outcomes. The results indicated that those with high critical social justice attitudes were much more likely to have an external locus of control, believing that structures or other people were responsible for their well-being. On the other hand, those with critical right attitudes tended to have an internal locus of control.</p>
<p>“The main takeaway is that identity-based ‘woke’ politics now exist on both the left and the right, but they take different forms,” Lahtinen told PsyPost. “On the left, ‘woke’ attitudes involved ideas like structural racism and the need for safe spaces, whereas on the right they were defined by beliefs about population replacement and perceived discrimination against white people. In this dataset, left-wing ‘woke’ attitudes showed a stronger association than before with the belief that violence against ‘politically dangerous’ people is justified.”</p>
<p>This research builds upon Lahtinen’s 2024 study, which initially attempted to operationalize “woke” attitudes. The current paper refines those original measures and expands the theoretical framework to include the political right.</p>
<p>The findings lend partial support to the “horseshoe theory” of politics, which suggests that the far-left and far-right may act in similar ways despite having opposite goals. Both groups in this study demonstrated a desire to regulate speech and emphasized identity-based grievances, though the specific identities they championed were diametrically opposed.</p>
<p>But as with all research, there are some limitations. The sample was not random, as it relied on self-selection and targeted recruitment through political channels. This means the results may not perfectly represent the general Finnish population. Additionally, the study was conducted solely in Finland. While American identity politics have influenced Northern Europe, the cultural context in Finland differs from that of the United States or the United Kingdom.</p>
<p>“The results should not be read as claims about entire populations or political parties,” Lahtinen noted. “The study does not say that ‘most people’ on the left or on the right support violence or conspiracy theories. It identifies patterns within a non-random sample and shows how certain attitudes cluster together.”</p>
<p>Future research could focus on validating the Critical Right Scale in other countries, particularly in the United States where these discourses often originate. It would also be beneficial to use larger, representative samples to confirm the prevalence of these attitudes in the general public. Longitudinal studies could help determine if these ideological clusters are stable over time or if they fluctuate with current events.</p>
<p>“The next step is replication in a larger sample and preferably in other countries,” Lahtinen said. “More broadly, my goal is to make novel political concepts measurable, even if they are culturally charged. I would also like to integrate novel methods into my research, like machine learning. Please see this study as an invitation for U.S. colleagues to pick this line of research up and validate the scale in the country where these attitudes originate from.”</p>
<p>The study, “<a href="https://doi.org/10.1111/sjop.70070" target="_blank" rel="noopener">Two Kinds of ‘Woke’ – Psychometric Validation of the Critical Right Scale and Revised Critical Social Justice Attitudes Scale</a>,” was authored by Oskari Lahtinen.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/common-supplement-cocktail-triggers-surprising-brain-changes-in-mouse-models-of-autism/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Common supplements, when combined, trigger surprising brain changes in mouse models of autism</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 19th 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study published in <em><a href="https://doi.org/10.1371/journal.pbio.3003231" target="_blank">PLOS Biology</a></em> suggests that a specific cocktail of dietary supplements may help alleviate behavioral challenges associated with autism spectrum disorder. The research provides evidence that mixing low doses of zinc, serine, and branched-chain amino acids can improve social behaviors and brain connectivity in mice. These findings imply that addressing metabolic needs could offer a potential therapeutic avenue for various forms of the condition.</p>
<p>Autism spectrum disorder is a complex neurodevelopmental condition characterized by difficulties in social interaction and communication. It arises from a combination of genetic variations and environmental influences that affect how the brain develops. A primary area of focus for scientists is the synapse, which is the connection point where neurons communicate with one another. When these connections fail to form or function correctly, it can lead to the altered neural connectivity seen in autism.</p>
<p>Nutrition is a significant environmental factor that plays a fundamental role in maintaining brain health. Previous scientific inquiries established that specific nutrients are essential for building and maintaining these synaptic connections. Zinc is highly concentrated in synaptic vesicles and helps regulate the signals sent between neurons. Serine is an amino acid that modulates receptors critical for learning and memory. Branched-chain amino acids serve as building blocks for protein synthesis in the brain.</p>
<p>Scientists have observed that individuals with autism often exhibit dietary biases or gastrointestinal issues. While supplements have been proposed as a treatment, taking high doses of single nutrients for long periods can cause side effects. For example, excessive zinc intake can interfere with copper absorption. High levels of specific amino acids can disrupt the body’s nitrogen balance or kidney function. The authors of this current study hypothesized that combining these nutrients at lower doses might offer a safer way to support brain function.</p>
<p>“Because autism spectrum disorder (ASD) arises from highly heterogeneous genetic and environmental causes, effective treatments remain limited. Rather than pursuing highly specific drugs tailored to individual autism subtypes, my lab aims to identify safer interventions with broad applicability across diverse etiologies,” said study author <a href="https://www.imb.sinica.edu.tw/en/faculty/profile/hsueh.html" target="_blank">Yi-Ping Hsueh</a>, a distinguished research fellow at the Institute of Molecular Biology at Academia Sinica.</p>
<p>To test their hypothesis, the research team employed three different mouse models, each representing a different genetic cause of autism. They primarily focused on mice with a mutation in the Tbr1 gene, which is a well-established model for the disorder. They also utilized mice with mutations in the Nf1 and Cttnbp2 genes. This approach allowed them to see if the treatment could be effective across different genetic backgrounds that share similar synaptic deficits.</p>
<p>The researchers began by analyzing the protein composition of the brains of Tbr1 mutant mice. They compared these profiles to those of typical wild-type mice to identify discrepancies. The analysis revealed that the mutant mice had significantly lower levels of specific proteins involved in synaptic transmission and structure. The team termed this cluster of affected proteins the “Black module.”</p>
<p>Following this discovery, the researchers administered a “cocktail” of supplements to the mice for one week. This mixture contained low doses of zinc, serine, and branched-chain amino acids. Afterward, they performed another analysis of the brain proteins. The results indicated that the supplementation increased the levels of the proteins in the Black module. This suggests that the nutrient mixture helped shift the protein profile of the mutant brains closer to a normal state.</p>
<p>The team then investigated how these molecular changes translated to actual brain activity. They focused on the basolateral amygdala, a brain region critical for processing social information. To do this, they injected a virus into the mouse brains that causes neurons to light up when they are active. They then implanted tiny microscopes into the skulls of the mice, allowing them to record neural activity while the animals moved freely.</p>
<p>The imaging data revealed that neurons in the basolateral amygdala of the Tbr1 mutant mice behaved abnormally during social interactions. Specifically, these neurons were hyperactive and overly connected to one another. This means the neurons tended to fire in a highly synchronized manner that is not typically seen in wild-type mice. This hyperconnectivity implies that the brain circuits were responding excessively to social stimuli.</p>
<p>The researchers found that treating the mice with the nutrient cocktail normalized this activity. The imaging showed that the supplement regimen reduced the excessive synchronization among the neurons. The functional connectivity of the basolateral amygdala in the treated mutant mice began to resemble that of the wild-type mice. This provides evidence that the nutrients can physically alter how brain circuits function in real-time.</p>
<p>To determine if these physiological changes resulted in observable benefits, the scientists conducted a series of behavioral tests. One of the primary assessments was the reciprocal social interaction test. In this experiment, a test mouse is placed in a cage with an unfamiliar mouse, and researchers measure how much time they spend interacting.</p>
<p>The researchers found that Tbr1 mutant mice typically spend less time interacting with strangers than wild-type mice do. They then tested the effects of the supplements individually and in combination, and discovered that low doses of zinc, serine, or branched-chain amino acids alone did not lead to significant behavioral improvements. However, when these low doses were combined into a single mixture, the social behavior of the mice improved notably.</p>
<p>The effectiveness of the mixture varied depending on the specific genetic mutation of the mouse. The Tbr1 mice showed significant improvement with a cocktail containing one-quarter of the standard dose. Mice with the Nf1 mutation required a mixture at half the standard concentration. Interestingly, the mice with the Cttnbp2 mutation were highly sensitive and responded to a cocktail containing only one-eighth of the standard dose. This variation suggests that different genetic conditions may result in different metabolic needs.</p>
<p>In addition to social interaction, the researchers assessed memory and sociability using other standard tests. In the three-chamber test, a mouse chooses between exploring a chamber with another mouse or a chamber with an object. Treated Tbr1 mice showed a preference for the chamber with the other mouse, comparable to healthy mice. The team also used a fear conditioning test to evaluate associative memory. The mutant mice treated with the cocktail demonstrated improved memory retention compared to untreated mutants.</p>
<p>“Nutrients play a fundamental role in maintaining brain health,” Hsueh said. “Our studies show that zinc, branched-chain amino acids, and serine can positively influence neuronal function and activity. When combined, these nutrients act synergistically to improve social behaviors in three different genetic mouse models of autism.”</p>
<p>“In principle, this approach may also be relevant to other autism conditions that share synaptic dysfunction as a common feature,” Hsueh told PsyPost. “Although most people can tolerate mild nutrient deficiencies, our findings reinforce the importance of a balanced and adequate diet for optimal nervous system function.”</p>
<p>The study included long-term monitoring to check for potential adverse effects. The researchers provided the supplement cocktail to the mice starting from weaning and continuing through adulthood. They monitored the body weight of the animals and conducted tests for anxiety and general locomotion. The data showed that the long-term supplementation did not negatively affect growth or increase anxiety levels.</p>
<p>But there are important limitations to consider regarding this research. The study was conducted entirely using mouse models. While mice share many biological pathways with humans, their brain architecture and social behaviors are much simpler. A treatment that works in a controlled animal study does not always translate to success in human clinical trials.</p>
<p>The use of animal models is necessary at this stage of research because it allows scientists to look directly at brain tissue and neural activity. It would be impossible to perform the same invasive protein analyses or deep-brain calcium imaging in living humans. These models provide the proof of concept needed to justify further investigation.</p>
<p>It is also important to recognize that this intervention is not a cure for the underlying genetic conditions. The supplements did not repair the mutated genes. Instead, they appeared to compensate for the functional deficits caused by those mutations. This suggests that any potential therapy based on these findings would likely need to be ongoing throughout a person’s life.</p>
<p>“Although dietary supplementation does not completely cure autism, it is readily accessible and can be implemented immediately,” Hsueh said. “Even modest improvements have the potential to meaningfully enhance daily functioning and quality of life for affected individuals and their families.”</p>
<p>“Nutrient deficiency in individuals with autism is not necessarily due to insufficient intake, but rather to increased physiological demand. In some patients, mutations in genes involved in nutrient absorption or metabolism exacerbate these deficiencies. More broadly, many autism conditions appear to require higher levels of specific nutrients to compensate for synaptic dysfunction. Although the underlying causes vary, dietary supplementation has the potential to alleviate neuronal impairments by supporting synaptic function.”</p>
<p>Future research will need to focus on how these findings might apply to human biology. Clinical trials would be required to determine the safe and effective dosages for people. Additionally, researchers will need to investigate whether this specific combination of nutrients is effective for the wide variety of genetic causes found in the human autism spectrum.</p>
<p>“The strategy of dietary supplementation for autism emerged from basic research that was not initially intended for clinical application,” Hsueh noted. “These findings once again highlight the power of fundamental research to uncover unexpected therapeutic potential. My laboratory will continue to focus on what we do best—basic research—to deepen our understanding of autism using mouse models and to identify more effective strategies for improvement.”</p>
<p>The study, “<a href="https://doi.org/10.1371/journal.pbio.3003231" target="_blank">Low-dose mixtures of dietary nutrients ameliorate behavioral deficits in multiple mouse models of autism</a>,” was authored by Tzyy-Nan Huang, Ming-Hui Lin, Tsan-Ting Hsu, Chen-Hsin Yu, and Yi-Ping Hsueh.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/viewing-nature-pictures-helps-adolescents-recover-from-social-exclusion/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Viewing nature pictures helps adolescents recover from social exclusion</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 18th 2026, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>An experimental study in Spain found that exposure to social ostracism reduced adolescents’ positive affect and perceived social competence. However, both were restored in adolescents who viewed pictures of nature scenes afterward. The paper was published in the <a href="https://doi.org/10.1016/j.jenvp.2025.102790"><em>Journal of Environmental Psychology</em></a>.</p>
<p>Social ostracism is the deliberate exclusion of an individual or group from social interactions, relationships, or participation in a community. It can take subtle forms, such as being ignored or left out, or more explicit forms like shunning, public rejection, or expulsion from a group.</p>
<p>Ostracism has been practiced throughout history in many societies, including in ancient city-states, religious communities, traditional villages, and modern organizations. In some cultures and groups, such as tightly knit religious sects or honor-based communities, ostracism is used as a mechanism of social control and norm enforcement. In contemporary settings, it can occur in workplaces, schools, online communities, and social media through exclusion, silence, or coordinated ignoring.</p>
<p>Social ostracism frustrates the basic human need for belonging, reduces self-esteem, and damages a sense of meaningful existence. Even short episodes of exclusion can produce strong emotional responses such as distress, anger, sadness, and anxiety. Prolonged or repeated ostracism is associated with depression, loneliness, reduced cognitive performance, and increased stress-related health problems. In some cases, ostracized individuals may withdraw socially, while in others they may respond with aggression or attempts to regain acceptance at any cost.</p>
<p>Study author Adrián Moll and his colleagues note that previous studies indicated that contact with nature can ameliorate the adverse consequences of daily difficulties: it was found to enhance a sense of being away from daily demands and to reduce stress and anxiety. They conducted the present study to explore whether exposure to nature could likewise buffer the adverse effects of social ostracism.</p>
<p>Study participants were 304 adolescents from two secondary schools in Spain. Approximately 47% of them were male. Their ages ranged between 12 and 18 years.</p>
<p>Study participants were divided into six groups, each of which underwent a different combination of experimental conditions. Three groups were exposed to social ostracism, while the other three experienced social inclusion (the opposite of ostracism). Within each set of three groups, one group viewed pictures of nature after the first treatment (ostracism or inclusion, depending on the group), one viewed pictures of urban areas, and the third viewed neutral pictures (e.g., an arrow, a pen, mathematical symbols).</p>
<p>The study was conducted in participants’ classes. At the start, each participating student was asked to write on a piece of paper the names of five classmates they would choose to work with in a group, and to hand the paper to their teacher. The teacher, who was cooperating with the researchers, left the classroom after collecting the papers, pretending to go outside to read them and match students with their classmates’ choices.</p>
<p>In reality, the feedback participating students received did not depend on those papers. When the teacher returned, students assigned to the ostracism group were simply informed that “Almost no one in the class wants to group with you to work on the class project: your name has been written by less than three people”. Students assigned to the social inclusion group were told that “Everybody wants to group with you for the class project: a lot of people have written down your name”. After reading this feedback, depending on the group, students viewed pictures assigned to their group on a PowerPoint presentation (10 seconds per picture, 14 pictures in total).</p>
<p>While the teacher was away, study participants completed assessments of affect (the PANAS scale), perceived social competence (“I find it easy to make friends among my classmates”), and directed attention (the Cancellation task). After receiving the ostracism or inclusion feedback, participants completed these assessments again, along with an assessment of social pain.</p>
<p>After viewing their assigned pictures, they completed the assessments of affect, perceived social competence, and directed attention once more, along with an assessment of psychological restoration. Directed attention was measured because stressful experiences (such as social ostracism) can be expected to deplete attentional resources; the study authors wanted to see whether this would happen and whether exposure to nature would restore them.</p>
<p>Results showed that being exposed to social ostracism reduced the positive affect and perceived social competence of these participants. However, it had no effects on directed attention. Exposure to nature restored both positive affect and perceived social competence. The other two types of pictures did not have this restorative effect. Directed attention improved regardless of the type of pictures participants viewed, and study authors attribute this to a learning effect.</p>
<p>“Altogether, these findings suggest that visual nature exposure can be a potential positive mechanism for adolescents to recover diminished resources due to social ostracism,” study authors concluded.</p>
<p>The study contributes to the scientific understanding of the psychological effects of contact with nature. However, the study participants were exclusively secondary school students. Effects on other demographic groups might differ.</p>
<p>Additionally, participants experienced just a single, brief, experimentally induced ostracism event (the feedback about being selected or not). This differs from real-world ostracism, in which people usually experience many different ostracism cues of different types over prolonged periods. The study authors also note that the experiment took place in class, so students may have picked up social cues from their classmates that sent a different message than the ostracism feedback did.</p>
<p>The paper, “<a href="https://doi.org/10.1016/j.jenvp.2025.102790">Exposure to nature scenes mitigates the adverse effects of adolescents’ social ostracism,</a>” was authored by Adrián Moll, Silvia Collado, Eleanor Ratcliffe, Miguel Ángel Sorrel, and José Antonio Corraliza.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/growing-up-near-busy-roads-linked-to-higher-risk-of-depression-and-anxiety/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Growing up near busy roads linked to higher risk of depression and anxiety</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 18th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>Living near busy roads may increase the risk of developing depression or anxiety, according to a large new study published in <a href="https://doi.org/10.1016/j.envres.2025.122443"><em>Environmental Research</em></a>.</p>
<p>Traffic noise is an unavoidable part of modern city life. Cars, buses, and trains generate constant background sound that many learn to tune out. But scientists have long suspected that this noise may still affect the body, even when people believe they have adapted to it.</p>
<p>Previous research has highlighted that traffic noise can disturb sleep, raise stress levels, and increase the risk of heart disease. In recent years, attention has turned to the effects on mental health.</p>
<p>However, most earlier studies on noise and mental health have focused on middle-aged or older adults. The researchers behind the new Finnish study therefore wanted to understand whether noise exposure during key developmental years might influence the risk of mental health disorders later on.</p>
<p>Led by Yiyan He from the University of Oulu, Finland, the research team accessed nationwide health and population registers. They analyzed data from 114,353 individuals born between 1987 and 1998 who were living in the Helsinki metropolitan area in 2007. At the start of the study, participants were between 8 and 21 years old. The researchers then followed them until 2016, tracking who received a diagnosis of depression or anxiety in specialist healthcare.</p>
<p>Traffic noise levels were estimated for each participant’s home address, including changes if they moved. The researchers focused mainly on road traffic noise and calculated average sound levels over the entire day, with extra weight given to evening and night-time noise, when individuals are more sensitive to sound. They also accounted for many other factors that could influence mental health, such as family background, parental mental illness, neighborhood disadvantage, air pollution, and access to green space.</p>
<p>Over the follow-up period, about one in ten participants received a diagnosis of depression or anxiety by early adulthood. The study found a clear pattern: higher traffic noise exposure was linked to higher risk. For every 10-decibel increase in road traffic noise, the risk of depression rose by about 5 percent, and the risk of anxiety rose by about 4 percent. While these increases may sound small, He and colleagues noted that noise exposure affects large numbers of people, making the overall public health impact potentially significant.</p>
<p>Importantly, the risk began to rise at around 53 decibels—a level close to the noise limits recommended by the World Health Organization for residential areas. Night-time noise showed similar effects, supporting the idea that disrupted sleep may play a role.</p>
<p>The study also found differences between groups. The link between noise and anxiety was stronger in males than in females. The association with anxiety was also more pronounced among those whose parents did not have diagnosed mental disorders, suggesting that environmental stressors may be especially influential when family risk is lower.</p>
<p>“Sleep disturbance and stress response have been proposed as key mechanisms underlying the association between traffic noise exposure and the risk of depression and anxiety. Traffic noise has been associated with insomnia symptoms … which are identified as risk factors for depression. In addition, traffic noise exposure has been shown to induce annoyance and heightened physiological stress responses,” He and colleagues explained.</p>
<p>The researchers caution that their study does not prove that traffic noise directly causes depression or anxiety; it only demonstrates an association. The study also focused on more severe cases diagnosed in specialist care, meaning milder cases treated by general practitioners were not included. In addition, the researchers could not measure noise exposure at schools or workplaces, or how well homes were insulated against sound.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.envres.2025.122443">Residential exposure to traffic noise and incidence of depression and anxiety from childhood through adulthood: a Finnish register study</a>,” was authored by Yiyan He, Marius Lahti-Pulkkinen, Johanna Metsälä, Jaana I. Halonen, Jouko Miettunen, Jules Kerckhoffs, Marko Kantomaa, Eero Kajantie, Sylvain Sebert, and Anna Pulakka.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/negative-facial-expressions-interfere-with-the-perception-of-cause-and-effect/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Negative facial expressions interfere with the perception of cause and effect</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 18th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>New research suggests that the emotional content of a facial expression influences how well observers can predict social outcomes. A series of experiments indicates that people have a harder time recognizing causal links between social cues when the faces involved display negative emotions, such as sadness, anger, or fear. The findings were published in the <em><a href="https://doi.org/10.1177/17470218251395043" target="_blank">Quarterly Journal of Experimental Psychology</a></em>.</p>
<p>Human interaction relies heavily on the ability to predict how one person will react to another. When a speaker smiles, an observer might expect the listener to smile in return. This predictive ability allows people to navigate complex social environments. Psychologists refer to this as contingency learning. It involves calculating the likelihood that a specific outcome will occur given a specific cue.</p>
<p>Researchers have debated how emotional faces fit into this learning process. Some theories propose that threatening or negative faces are evolutionarily important and should grab attention quickly. Other theories suggest that happy faces are easier to process because they are distinct and rewarding. To resolve this, a team of researchers led by Rahmi Saylik from Mus Alparslan University investigated whether specific emotional expressions help or hinder the ability to learn these statistical connections. The research team included Andre J. Szameitat and Adrian L. Williams from Brunel University London, and Robin A. Murphy from the University of Oxford.</p>
<p>The researchers aimed to understand whether the “valence” of an emotion—whether it is positive or negative—affects the computation of cause and effect. They asked whether people are better at learning patterns when the faces are happy than when they are sad. They also sought to determine whether this learning reflects genuine statistical evidence or simply how often two things occur together.</p>
<p>To test this, the investigators designed a computer-based task using a “streaming” procedure. Participants watched a rapid series of images flash on a screen. In the emotional conditions, they saw two faces. One face represented a “sender” and the other a “receiver.” The participants’ goal was to determine if the expression on the first face caused the expression on the second face.</p>
<p>In Experiment 1, the researchers recruited 107 participants. The participants viewed streams of images involving happy faces, sad faces, or geometric shapes. The shapes served as a control condition to measure learning without social or emotional content. The researchers manipulated the statistical strength of the relationships. In some blocks, the cue perfectly predicted the outcome. In others, there was no relationship at all.</p>
<p>The participants provided ratings on a scale from negative to positive to indicate how strong they felt the causal link was. The results showed that participants could generally distinguish between strong and weak relationships. However, the type of stimulus altered their judgment. Participants perceived a weaker causal connection when viewing sad faces than when viewing happy faces or geometric shapes, and their ratings for sad faces tracked the actual statistical evidence less accurately.</p>
<p>The researchers suspected that the visual differences between the photos and the simple shapes might have influenced the results. To address this, they conducted Experiment 2 with 82 new participants. They modified the stimuli to make the shapes and faces more visually comparable. They used black-and-white images and presented the faces through oval windows. They also created patterned shapes that mimicked the presence or absence of a feature, similar to how a face shows an emotion or remains neutral.</p>
<p>Despite these changes, the pattern of results remained the same. Participants consistently rated the causal relationships involving sad faces as weaker than those involving happy faces or the patterned shapes. There was no statistical difference between the ratings for happy faces and the neutral shapes. This suggested that happy faces did not necessarily boost performance, but rather that sad faces actively impaired the perception of causality.</p>
<p>A potential criticism of these findings is that participants might not be calculating complex statistics. They might simply be counting how often they see two emotional faces appear together. This is known as the “pairing hypothesis.” In Experiment 3, the researchers tested 90 participants to rule this out. They created specific conditions where the number of pairings was high, but the statistical predictive power was low. Conversely, they created conditions with few pairings but high predictive power.</p>
<p>The results from Experiment 3 confirmed that participants were indeed tracking the statistical contingency, not just the frequency of pairings. Even when the number of pairings was held constant, participants rated the stronger statistical connections higher. However, the emotional interference persisted. Sad faces continued to elicit lower ratings of causal strength compared to happy faces and shapes, regardless of how the statistics were presented.</p>
<p>In the final study, Experiment 4, the researchers expanded the scope to include other negative emotions. They wanted to see if the effect was specific to sadness or if it applied to aversive emotions in general. They recruited 51 participants and tested them using happy, angry, and fearful faces. The procedure mirrored the earlier experiments, asking participants to judge the strength of the relationship between the cues and outcomes.</p>
<p>The findings revealed that the interference effect was not unique to sadness. Participants perceived a weaker sense of causality when observing angry or fearful faces compared to happy ones. The ratings for the angry and fearful conditions were lower than those for the happy condition in scenarios where a positive relationship existed. This suggests that stimuli with negative valence generally disrupt the processing of contingency information.</p>
<p>The researchers interpreted these results through the lens of attention and cognitive resources. While threatening or negative faces are highly salient and grab attention quickly, they may also trigger task-irrelevant processing. For example, a sad or angry face might induce a state of worry or physiological arousal in the observer. This internal reaction could consume cognitive resources that would otherwise be used to track the statistical patterns in the environment.</p>
<p>Consequently, while the observer notices the face, they may have less mental bandwidth available to calculate the relationship between that face and the subsequent outcome. Happy faces, being pleasant and signaling safety, do not impose this cognitive tax. This allows the observer to focus on the structural relationship between the social cues. The study challenges the idea that “threat” enhances all forms of learning. It suggests that while threats are noticed quickly, they may hinder the analysis of the broader context.</p>
<p>There are limitations to the study that warrant mention. The experiments relied on static images presented on a computer screen, which is different from dynamic, real-world interactions. Additionally, while the researchers attempted to match the visual properties of the control shapes, non-emotional objects are inherently different from human faces. The study focused on neurotypical university students, so the results may not generalize to clinical populations with anxiety or depression.</p>
<p>Future research could investigate the speed of these judgments to understand the processing time required for different emotions. It would also be beneficial to use physiological measures to track arousal levels during the task. Understanding how negative emotions disrupt causal learning could have implications for understanding social misunderstandings. If negative expressions make social patterns harder to read, it could explain some difficulties in maintaining relationships during times of conflict or distress.</p>
<p>The study, “<a href="https://doi.org/10.1177/17470218251395043" target="_blank">Sad, Angry and Fearful Facial Expressions Interfere With Perception of Causal Outcomes</a>,” was authored by Rahmi Saylik, Andre J. Szameitat, Adrian L. Williams and Robin A. Murphy.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/study-links-unpredictable-childhoods-to-poorer-relationships-via-increased-mating-effort/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Study links unpredictable childhoods to poorer relationships via increased mating effort</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 18th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>People who grew up in harsher or more unpredictable environments tend to report poorer romantic relationships in adulthood, partly because they invest more effort in seeking new partners. The study was published in <a href="https://doi.org/10.1177/14747049251355861"><em>Evolutionary Psychology</em></a>.</p>
<p>Decades of psychological research demonstrates that early family environments shape adult romantic relationships. Individuals exposed to instability, conflict, or economic hardship in childhood are more likely to experience lower relationship satisfaction and higher conflict later in life. These links have traditionally been explained through attachment theory, which focuses on how early interactions with caregivers shape our expectations about closeness, trust, and emotional security in adult partnerships.</p>
<p>Monika Kwiek and colleagues sought to broaden this perspective by integrating attachment theory with life history theory, an evolutionary framework that emphasizes how early environments shape long-term strategies for mating and parenting. While attachment theory centers on emotional bonds, life history theory highlights how people allocate effort toward seeking partners (mating effort) versus investing in children and long-term family life (parenting effort).</p>
<p>The researchers recruited 332 Polish adults (average age 39) who had children. These participants were recruited through psychology students at Jagiellonian University. This middle-aged Eastern European sample allowed the researchers to test theories that are often examined primarily in North American student populations.</p>
<p>Participants completed a series of questionnaires assessing key aspects of their current romantic relationships, attachment orientations, mating and parenting tendencies, and childhood environments.</p>
<p>Romantic relationship satisfaction was measured via seven items that capture how well a relationship meets expectations and overall happiness with one’s partner (e.g., <em>“My relationship has met my original expectations”</em>). Relationship conflict was assessed by asking about the frequency and intensity of disagreements, particularly those involving mistrust or emotional reactivity (e.g., <em>“My partner and I often argue because I do not trust him/her”</em>).</p>
<p>Attachment styles were measured along two dimensions, attachment anxiety and attachment avoidance, reflecting tendencies toward fear of abandonment and discomfort with closeness, respectively (e.g., “<em>I need a lot of reassurance that I am loved by my partner”</em> and <em>“I want to get close to my partner, but I keep pulling back”</em>).</p>
<p>Mating effort was assessed through items capturing competitiveness with peers, flirting, and pursuit of new or unavailable partners, while parenting effort focused on caregiving, emotional support, and investment of resources in children and family life.</p>
<p>The researchers also collected retrospective reports of childhood harshness and unpredictability, such as financial instability, frequent moves, or parental absence, as well as perceptions of neighborhood safety and social cohesion during upbringing. Information about parental education was included as an additional indicator of early socioeconomic context. All measures were translated into Polish using validated procedures.</p>
<p>The results revealed that mating effort was a key pathway linking early environments to adult romantic relationship quality. Individuals who grew up in harsher or more unpredictable conditions reported higher mating effort in adulthood, which was associated with lower relationship satisfaction and greater conflict. These associations held even after accounting for attachment anxiety and avoidance, suggesting that mating effort uniquely contributed to relationship outcomes rather than simply reflecting insecure attachment.</p>
<p>Although early environments were related to later parenting investment, parenting effort itself was not associated with romantic relationship satisfaction or conflict. This suggests that orientations toward caregiving may be shaped by childhood conditions without directly influencing the quality of the romantic partnership, whereas mating-related behaviors appear more closely tied to couple dynamics.</p>
<p>Attachment styles were differentially related to life history dimensions. Higher attachment avoidance, but not attachment anxiety, was associated with lower parenting effort, while neither attachment dimension was meaningfully related to mating effort.</p>
<p>Together, these findings suggest that mating strategies and attachment orientations reflect partly independent pathways through which early experiences shape adult romantic relationships, with mating effort playing a more direct role in relationship satisfaction and conflict.</p>
<p>Of note, the sample was highly educated and predominantly female, which limits the generalizability of the findings and reflects a common bias in life history research toward populations associated with slower life history strategies.</p>
<p>The study, “<a href="https://doi.org/10.1177/14747049251355861">Life History, Attachment and Romantic Relationship Outcomes in an Eastern European Adult Sample</a>,” was authored by Monika Kwiek, Daniel J. Kruger, and Przemyslaw Piotrowski.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/a-common-side-effect-of-antidepressants-could-be-a-surprising-warning-sign/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">A common side effect of antidepressants could be a surprising warning sign</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 18th 2026, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new study suggests that for people treating depression with common medications, the frequency of their yawning might track with how well they are healing. Published in the <em><a href="https://doi.org/10.1016/j.jpsychires.2025.10.075" target="_blank">Journal of Psychiatric Research</a></em>, the findings indicate that while these drugs often increase yawning overall, persistent or worsening yawning could signal that the treatment is not effectively reducing depressive symptoms. This offers clinicians a potential physical marker to help distinguish between medication side effects and the lingering symptoms of the disorder itself.</p>
<p>Selective serotonin reuptake inhibitors, or SSRIs, are the most prescribed medications for treating depression. While they are generally considered safe, they come with a range of physical side effects that can affect a patient’s quality of life. One of the stranger and less understood reactions is excessive yawning. This is not the yawning of boredom or simple tiredness but a distinct physiological response to the medication.</p>
<p>The link between yawning and brain chemical activity is well established in animal models. Neurotransmitters like serotonin, dopamine, and acetylcholine play a role in triggering this reflex. However, few researchers have tracked this phenomenon systematically over time in a clinical psychiatric setting. Most existing information comes from isolated case reports rather than structured observation.</p>
<p>To bridge this gap, a team of researchers from the University of Health Sciences in Istanbul, Türkiye, designed a prospective study. Lead author Yusuf Ezel Yıldırım, a psychiatrist at the Bakirkoy Prof. Dr. Mazhar Osman Training and Research Hospital for Psychiatric, Neurologic and Neurosurgical Diseases, sought to understand if this yawning was merely a nuisance or if it held clinical meaning. The team wanted to see if the frequency of yawning correlated with the severity of a patient’s depression or their quality of sleep.</p>
<p>The researchers recruited 150 adults aged 18 to 65 who were diagnosed with major depressive disorder. A key requirement for participation was that none of these patients had taken SSRIs before the study began. This exclusion ensured that any changes observed could be attributed to the new medication regimen rather than past usage. The participants were prescribed standard SSRI treatments such as sertraline, escitalopram, or fluoxetine.</p>
<p>Before taking their first dose, patients completed several detailed questionnaires. These surveys measured the severity of their depression using the Beck Depression Inventory and the intensity of their insomnia using the Insomnia Severity Index. The team also used a scale specifically designed to rate the frequency and disruptiveness of yawning. This custom assessment asked patients to rank their yawning from nonexistent to a level that severely impacted daily activities.</p>
<p>One month after starting treatment, the researchers followed up with the participants to assess their progress. Of the original group, 110 patients completed this second phase of the study. The researchers compared the new scores against the baseline data gathered four weeks earlier. They looked for patterns connecting the physical side effects to the psychological outcomes.</p>
<p>The results showed a clear general trend regarding the physical reaction to the drugs. The severity of yawning increased for the group as a whole after starting the medication. The number of patients reporting “excessive yawning” that frequently disrupted their lives jumped from roughly 5 percent to over 15 percent. This confirmed that the medication was indeed driving the physical behavior in a substantial portion of the group.</p>
<p>However, the data revealed a more nuanced relationship when the researchers looked at individual treatment outcomes. At the start of the study, patients with higher depression scores tended to report more yawning. This connection persisted even after accounting for age and other factors. It suggests that the act of yawning is biologically linked to the depressive state itself, not just the drugs.</p>
<p>The most distinct finding emerged when the team divided the patients into two groups based on how well the medicine worked. One group consisted of “responders,” defined as those whose depression scores dropped by at least half. The other group was “non-responders,” whose condition did not improve as markedly.</p>
<p>Among the patients who responded well to the treatment, yawning severity decreased slightly or stayed the same. In contrast, the non-responders experienced a sharp rise in yawning severity. The statistical analysis showed that this increase was not a random occurrence. It implies that if a patient continues to yawn excessively or if the yawning worsens markedly, it might indicate that the depression is not lifting.</p>
<p>The study also examined whether sleep issues played a role in this phenomenon. While insomnia scores generally improved for everyone, the changes in yawning happened independently of how well the patients slept. The yawning was more closely tied to other physical side effects like nausea, sweating, or dry mouth. This points to a reaction in the autonomic nervous system rather than simple fatigue or drowsiness.</p>
<p>This distinction is important because yawning is often misinterpreted in clinical practice. When a patient on antidepressants reports constant yawning, doctors often assume the patient is sedated or lethargic. This can lead to the mistaken belief that the patient is experiencing “asthenia,” a state of physical weakness and lack of energy.</p>
<p>If a clinician misinterprets this yawning as a sign of worsening depression or fatigue, they might increase the medication dosage. Based on the study’s findings, increasing the dose in a non-responding patient who is already yawning excessively might not address the root issue. The yawning may be a red flag that the current treatment path is ineffective for that specific individual.</p>
<p>Despite these insights, the study has limitations that affect how the results should be interpreted. The follow-up period lasted only one month. It remains unclear if the yawning persists, worsens, or resolves over a longer timeframe such as six months or a year. Additionally, the data relied entirely on patients reporting their own symptoms, which can introduce bias.</p>
<p>Future research needs to include objective measures of yawning to verify self-reports. Longer studies could track the trajectory of this symptom over extended periods to see if the body eventually adapts. The researchers also note that cultural factors can influence how people perceive and report bodily functions, which may affect data collection in different regions.</p>
<p>For doctors, the immediate takeaway is practical and applicable to daily monitoring. Persistent yawning should not be automatically dismissed as a sign of tiredness or boredom. Instead, it might serve as a subtle biological signal that the current treatment plan requires adjustment. By paying attention to this overlooked symptom, psychiatrists may be able to identify patients who are not responding to treatment much earlier.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.jpsychires.2025.10.075" target="_blank">Prevalence of SSRI-Related yawning and relationship with clinical features in patients with major depressive disorder: A prospective study</a>,” was authored by Yusuf Ezel Yıldırım, Eray Yurtseven, Pınar Çetinay Aydın, and Mehmet Güven Günver.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/how-widespread-is-internet-gaming-disorder-among-young-adults/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">How widespread is Internet Gaming Disorder among young adults?</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Jan 18th 2026, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A new comprehensive analysis published in the journal <em><a href="https://doi.org/10.1016/j.addbeh.2025.108576" target="_blank">Addictive Behaviors</a></em> suggests that Internet Gaming Disorder is a significant mental health concern for young adults, affecting approximately six percent of this population globally. The findings indicate that problematic gaming is not limited to adolescence but continues to impact individuals well into their twenties and thirties. The systematic review also provides evidence that the methods used to diagnose the condition heavily influence how widespread it appears to be.</p>
<p>The scientific community and the general public have historically viewed video game addiction as a problem primarily affecting children and teenagers. Educational campaigns and parental concerns usually focus on screen time limits for minors. However, the period known as young adulthood brings its own unique set of psychological and social challenges.</p>
<p>Young adulthood, typically defined as the ages between 18 and 35, involves major life transitions. Individuals in this age group are often pursuing higher education, entering the workforce, or seeking financial independence. These milestones can create instability and stress.</p>
<p>Previous research indicates that young adults may turn to gaming as a coping mechanism to manage the anxieties associated with these life changes. Despite this, prevalence estimates for gaming disorders in this demographic have varied widely in past literature. The authors of the current study sought to resolve these inconsistencies by aggregating data from around the world.</p>
<p>“Most prevalence research on Internet Gaming Disorder (IGD) has focused on adolescents, while young adults are often treated as a residual or mixed group. However, this life stage is marked by significant transitions and vulnerabilities that may increase the risk of problematic gaming. We aimed to provide an updated, age-specific estimate of IGD prevalence in young adults and to clarify why prevalence figures vary so widely across studies,” said study author Claudio Longobardi, a professor at the University of Turin.</p>
<p>Internet Gaming Disorder is characterized by a pattern of persistent and recurrent gaming behavior that leads to clinically significant impairment or distress. Recognized as a condition for further study in the DSM-5-TR and an official diagnosis in the ICD-11, IGD is distinct from high-frequency gaming in that it involves a loss of control and negative life consequences. </p>
<p>Symptoms often mirror other addictive behaviors, including preoccupation with gaming, withdrawal symptoms like irritability when not playing, the development of tolerance requiring increased play time, and unsuccessful attempts to curb the behavior. For a behavior to be classified as IGD, it must typically result in the neglect of other interests, deception regarding the extent of gaming, or the jeopardizing of significant relationships, educational paths, or career opportunities.</p>
<p>The researchers aimed to provide a reliable estimate of how many young adults meet the criteria for Internet Gaming Disorder. They also sought to identify specific factors that might cause prevalence rates to fluctuate between different studies.</p>
<p>The research team conducted a systematic review and meta-analysis. This is a statistical method that combines the results of multiple independent studies to identify stronger patterns than any single study could show. They searched for relevant research articles published between 2015 and 2025.</p>
<p>The selection process ensured that only studies with original empirical data were included. The researchers focused specifically on samples of participants aged 18 to 35. They excluded studies that did not provide sufficient demographic details or those that focused solely on clinical populations already diagnosed with other conditions.</p>
<p>The final analysis incorporated data from 96 analytical samples found within 93 separate studies. The total number of participants across all these studies was 149,601. The average age of the individuals in these samples was approximately 23.5 years.</p>
<p>The gender distribution in the total pool of participants was relatively balanced. Women accounted for 51.22 percent of the sample. This allowed the researchers to examine potential gender differences in the prevalence of gaming problems.</p>
<p>The researchers used generalized linear mixed models to calculate the pooled prevalence of the disorder. This statistical approach helps to account for the variation and non-normal distribution of data often found in prevalence studies. It offers a more accurate estimate than simpler averaging methods.</p>
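<p>The pooling step can be illustrated with a simpler, standard alternative to the authors' mixed-model approach: an inverse-variance random-effects (DerSimonian–Laird) estimate on logit-transformed prevalences. The study counts below are invented for illustration and are not data from the meta-analysis, and this sketch is not the model the authors actually fitted.</p>

```python
import math

# Hypothetical per-study data: (cases meeting IGD criteria, sample size).
# These numbers are illustrative only, not from the published meta-analysis.
studies = [(61, 1000), (81, 1000), (55, 900), (70, 1200)]

def logit_prevalence(cases, n):
    """Logit-transformed prevalence and its approximate sampling variance."""
    p = cases / n
    return math.log(p / (1 - p)), 1 / cases + 1 / (n - cases)

ys, vs = zip(*(logit_prevalence(c, n) for c, n in studies))

# Fixed-effect (inverse-variance) estimate, needed for the Q heterogeneity statistic.
w = [1 / v for v in vs]
y_fe = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, ys))

# DerSimonian-Laird estimate of between-study variance (tau^2).
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)

# Random-effects pooled estimate, back-transformed from the logit scale.
w_re = [1 / (v + tau2) for v in vs]
y_re = sum(wi * yi for wi, yi in zip(w_re, ys)) / sum(w_re)
pooled = 1 / (1 + math.exp(-y_re))
print(f"Pooled prevalence: {pooled:.3%}")
```

<p>The logit transform keeps the pooled proportion between 0 and 1; generalized linear mixed models, as used in the study, handle the same problem while also modeling between-study variation directly rather than in a two-step correction.</p>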
<p>The study determined that the overall pooled prevalence of Internet Gaming Disorder among young adults is 6.1 percent. This suggests that roughly one in every sixteen young adults may experience significant impairment due to their gaming habits. This rate is higher than some estimates for the general population.</p>
<p>The analysis revealed a clear distinction based on how researchers recruited their participants. Some studies focused exclusively on individuals who self-identified as gamers. In these gamer-only samples, the prevalence rate rose to 8.1 percent.</p>
<p>Other studies utilized mixed samples that included both gamers and non-gamers from the general population. In these broader groups, the prevalence was lower, estimated at 5.47 percent. This difference highlights the importance of context when interpreting statistics about gaming addiction.</p>
<p>“From a public health perspective, even a prevalence of 5–8% translates into a large number of affected individuals worldwide,” Longobardi told PsyPost. “The wide variability across studies also shows that how we measure IGD matters greatly. This has practical implications for screening, prevention, and policy decisions.”</p>
<p>The researchers also found that the specific questionnaire used to screen for the disorder had a major impact on the results. Instruments based on the criteria from the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), tended to yield higher prevalence rates. Tools based on other frameworks, such as the ten-item Internet Gaming Disorder Test, resulted in lower estimates.</p>
<p>The DSM-5 criteria for the disorder include symptoms such as preoccupation with gaming and withdrawal symptoms when not playing. They also include the development of tolerance, meaning the need to play more to achieve the same satisfaction. Other signs involve unsuccessful attempts to quit and the loss of interest in other hobbies.</p>
<p>To receive a diagnosis under DSM-5 guidelines, an individual typically must meet five of nine specific criteria within a 12-month period. The severity of the condition is judged by how much it disrupts the person’s daily life. The variability in diagnostic tools suggests that the scientific community has not yet reached a complete consensus on how best to measure this condition.</p>
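<p>The five-of-nine scoring rule used by DSM-5-based screeners can be sketched as a simple threshold count. The criterion labels below are paraphrased for illustration; this is not an actual clinical instrument, and a positive screen is not a diagnosis.</p>

```python
# Paraphrased labels for the nine proposed DSM-5 criteria for IGD.
DSM5_IGD_CRITERIA = frozenset([
    "preoccupation", "withdrawal", "tolerance", "loss_of_control",
    "loss_of_interest", "continued_use_despite_problems",
    "deception", "escape_motive", "jeopardized_relationships_or_career",
])

def screens_positive(endorsed, threshold=5):
    """Return True if at least `threshold` of the nine criteria are endorsed
    over the past 12 months (the DSM-5 proposed cutoff is five)."""
    return len(set(endorsed) & DSM5_IGD_CRITERIA) >= threshold

# Four endorsed criteria fall below the five-criterion cutoff.
print(screens_positive({"preoccupation", "withdrawal", "tolerance", "deception"}))  # False
```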
<p>Another key finding related to the size of the sample used in a study. The analysis showed that larger studies tended to report lower prevalence rates of the disorder. Smaller studies were more likely to report higher figures, possibly due to statistical noise or selection biases.</p>
<p>The researchers assessed the quality of the included studies using a risk-of-bias checklist. This tool evaluates factors such as whether the sample was representative and if the statistical analysis was appropriate. In samples composed entirely of gamers, studies with a high risk of bias reported significantly higher prevalence rates.</p>
<p>“One striking finding was how strongly prevalence estimates depended on the diagnostic instrument and study quality,” Longobardi said. “Studies with higher risk of bias or smaller samples tended to report much higher prevalence, which suggests that some widely cited figures may be overestimates.”</p>
<p>There was also a temporal trend within the gamer-only samples: prevalence estimates in these groups have risen over time, suggesting that problematic gaming may be becoming more common among dedicated gamers.</p>
<p>This increase could be linked to the changing nature of video games themselves. Modern games often include mechanics that encourage prolonged play, such as loot boxes and complex reward systems. The rise of competitive gaming and esports might also contribute to more intensive engagement.</p>
<p>The study examined potential differences based on geography. The samples came from diverse regions, including Asia, Europe, and North America. However, the analysis did not find statistically significant differences in prevalence based on the continent of origin.</p>
<p>This lack of regional variation might suggest that video gaming has become a truly globalized phenomenon. The mechanics of games and the culture surrounding them are shared across borders. This could lead to similar patterns of behavior regardless of where a person lives.</p>
<p>Gender also appeared to play a role, though the evidence was less definitive. There was a trend suggesting that samples with a higher proportion of women had lower prevalence rates. This aligns with historical data suggesting men are more frequently diagnosed with gaming disorders.</p>
<p>“Our results suggest that IGD affects a non-negligible proportion of young adults (around 6% overall, and over 8% among gamers) making it more common than many people assume for this age group,” Longobardi explained. “Gaming itself is not inherently problematic, but for a minority it can become a source of significant impairment. Awareness and early identification are therefore important, especially in educational and clinical settings.”</p>
<p>Readers should, however, interpret these findings with certain limitations in mind. Most of the data in this meta-analysis comes from self-report surveys. These are screening tools and do not constitute a formal clinical diagnosis made by a mental health professional. Self-reported data can lead to overestimation: participants might misinterpret questions or exaggerate their symptoms, while others might downplay their behavior.</p>
<p>The study also noted a high degree of heterogeneity, or variability, between the results of the included studies. This means that even within the same category of people, different studies produced very different numbers. This variability suggests that unrecognized factors are still influencing the data.</p>
<p>There is also the potential for publication bias. The analysis detected asymmetry in the data that suggests studies with dramatic or positive results are more likely to be published. Studies finding no evidence of gaming disorder might remain in file drawers, skewing the public record.</p>
<p>“Our findings should not be read as implying that gaming is broadly harmful, but rather that a specific pattern of dysregulated gaming affects a vulnerable subgroup of young adults,” Longobardi noted. “Future work should focus on improving methodological consistency, aligning assessment tools with DSM-5-TR and ICD-11 criteria, and examining IGD in relation to comorbid mental health conditions. We are also interested in longitudinal studies to better understand developmental trajectories from adolescence into adulthood.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.addbeh.2025.108576" target="_blank">Prevalence of Internet gaming disorder in young adults: a systematic review and meta-analysis</a>,” was authored by Júlia Gisbert-Pérez, Claudio Longobardi, Manuel Martí-Vilar, Sofia Mastrokoukou, and Laura Badenes-Ribera.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers. </strong></p>