<table style="border:1px solid #adadad; background-color: #F3F1EC; color: #666666; padding:8px; -webkit-border-radius:4px; border-radius:4px; -moz-border-radius:4px; line-height:16px; margin-bottom:6px;" width="100%">
<tbody>
<tr>
<td><span style="font-family:Helvetica, sans-serif; font-size:20px;font-weight:bold;">PsyPost – Psychology News</span></td>
</tr>
<tr>
<td> </td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/intelligence-predicts-progressive-views-but-only-after-college-2026-03-20/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Intelligence predicts progressive views, but only after college</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 21st 2026, 10:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>People with higher childhood intelligence scores tend to express more socially progressive attitudes as adults, but this connection depends heavily on whether they attend college. A new study reveals that advanced education acts as a catalyst, prompting those with superior academic abilities to abandon conventional societal norms during their twenties. These findings were published in the <em><a href="https://doi.org/10.1177/01461672241273279">Personality and Social Psychology Bulletin</a></em>.</p>
<p>Past research has consistently linked enhanced cognitive ability with non-traditional social beliefs. Adults who score higher on intelligence tests generally show a greater willingness to question traditional social hierarchies. They also tend to resist dogmatism, which is the unyielding adherence to a strict set of beliefs without considering evidence or alternative opinions.</p>
<p>The developmental timeline of this relationship has remained unclear. Researchers wanted to know if highly intelligent children are inherently more open-minded from a young age. Alternatively, outside experiences during early adulthood might be responsible for broadening their perspectives over time.</p>
<p>University of South Alabama psychologist Joshua Isen led a research team to investigate how these attitudes evolve. Isen and his colleagues suspected that exposure to higher education might serve as a moderating factor. In statistics, a moderator is a variable that influences the strength or direction of the relationship between two other variables.</p>
<p>The investigators contrasted this idea against a mediating relationship. If education was simply a mediator, it would mean that intelligence causes people to go to college, and college then causes them to become more progressive. Under a moderation framework, the researchers proposed that intelligent individuals must be placed in a specific academic environment to fully realize their progressive leanings.</p>
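<p>In practical terms, moderation is usually tested by adding an interaction term to a regression model. The minimal sketch below simulates this with made-up data and illustrative variable names (it is not the authors&#8217; code or dataset):</p>
<pre>
# Toy moderation analysis: does the IQ-attitude slope depend on
# college attendance? Simulated data, illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
iq = rng.normal(100, 15, n)        # childhood intelligence score
college = rng.integers(0, 2, n)    # 1 = attended college
# Build in the hypothesized pattern: IQ predicts progressive
# attitudes only among the college-educated.
prog = 0.05 * (iq - 100) * college + rng.normal(0, 1, n)

df = pd.DataFrame({"iq": iq, "college": college, "prog": prog})
# "iq * college" expands to both main effects plus their product;
# a significant iq:college coefficient is the moderation effect.
fit = smf.ols("prog ~ iq * college", data=df).fit()
print(fit.summary().tables[1])
</pre>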
<p>The researchers hypothesized that college environments actively encourage students to critically evaluate the existing social order. Students with stronger cognitive skills might process these lessons more deeply, internalizing the material their professors present more fully than their peers. That deeper absorption could plausibly translate into a broader shift in worldview.</p>
<p>The researchers first looked at a cross-sectional sample of 3,291 middle-aged parents. These individuals were participating in the Minnesota Twin Family Study, a large project tracking the health and development of local families. The team gathered data on the parents’ intelligence scores, overall educational attainment, and adherence to conventional societal values.</p>
<p>The parents completed a questionnaire designed to measure their preference for strict moral standards and their respect for traditional authority. The questions asked participants about their views on obscenity, religious authority, and strict parental discipline. Higher scores on this assessment indicated a more conventional, rigid worldview. The researchers mapped these survey responses against the parents’ educational backgrounds and cognitive assessments.</p>
<p>In this older group, more years of schooling amplified the link between intelligence and progressive attitudes. Among parents who had attended college, higher cognitive ability strongly predicted a rejection of traditional norms. For parents who ended their education after high school, the connection between intelligence and their social attitudes was relatively weak.</p>
<p>To understand exactly how this ideological divergence happens, the team conducted a second study tracking 2,769 offspring from the same families. The researchers assessed the youth at age 17, before most had started college. They followed up with the participants twice more, at ages 24 and 29.</p>
<p>At each stage, the participants answered questions about their social beliefs and reported their educational progress. The researchers used statistical modeling to observe how each individual’s attitudes shifted across emerging adulthood. This strategy allowed them to capture actual developmental changes rather than relying on a single snapshot in time.</p>
<p>Tracking these developmental trajectories revealed that personal outcomes varied with educational pathways. Some individuals shifted ideologically within a few years, while others maintained steady beliefs throughout the entire testing period. This variation gave the researchers the leverage they needed to isolate the distinct impact of academic exposure.</p>
<p>To guarantee that the survey questions carried the exact same meaning for teenagers as they did for adults, the researchers analyzed the psychological structure of the survey responses across different age brackets. They found that the questionnaire reliably measured a consistent set of beliefs regarding moral strictness at every stage of the timeline. This statistical consistency gave the research team confidence that they were tracking genuine ideological shifts.</p>
<p>At age 17, the association between intelligence and progressive attitudes was completely absent. In fact, teenagers who eventually enrolled in a four-year university started out slightly more aligned with traditional values than their peers. The researchers suggested that conventional teenagers might be more willing to conform to the expectations of teachers and parents, smoothing their path to college admission.</p>
<p>Attitudes began to diverge sharply as the participants moved through their twenties. For those who never attended college, traditional beliefs actually increased slightly as they aged into full adulthood. Their childhood intelligence scores had no measurable impact on how their social views changed over time.</p>
<p>The developmental trajectory looked very different for the college-educated participants. Those who pursued higher education became progressively less traditional between the ages of 17 and 29. The size of this ideological shift was tightly linked to their cognitive ability.</p>
<p>Students with higher intelligence scores accounted for most of the decline in conventional attitudes during the college years. The researchers found that the combination of college exposure and high intelligence predicted a robust shift toward progressive ideology. This effect scaled with educational attainment. The phenomenon appeared strongest among those who went on to graduate or professional schools.</p>
<p>The researchers considered whether this change resulted from faculty instruction or peer influence. They reasoned that if peer conformity were the primary driver, highly intelligent students would be less likely to align with their classmates. Prior cognitive research indicates that individuals with stronger intellectual abilities tend to show more resistance to peer persuasion. Because the brightest students exhibited the largest shifts, the social environment created by professors and college curricula likely played a direct role.</p>
<p>Alternatively, the researchers noted an explanation based on cultural institutions. In modern academic settings, advocating for social change carries immense cultural prestige. Intelligent individuals might simply be better equipped to recognize these ascendant cultural norms and adjust their surface beliefs accordingly.</p>
<p>The observational nature of the research means that the results cannot definitively establish that college attendance causes progressive attitudes. Intelligence shapes the specific type of college environment a student experiences. Highly capable students might select themselves into more rigorous academic programs or attend institutions with a more pronounced progressive campus culture.</p>
<p>Other major life events occurring in a person’s twenties could also influence social attitudes. People who skip college often marry and have children at younger ages. These early family responsibilities might independently foster more conventional social values.</p>
<p>Another caveat concerns the wording of the testing materials. The assessment of traditionalism focused heavily on private conduct rather than public policy or state coercion. Because the questions evaluated personal rule-following rather than political hostility, the results might differ if future studies applied a different measure of ideological intolerance.</p>
<p>The study also focused heavily on a single region of the United States. The participants were predominantly white individuals from the Upper Midwest. Future investigations will need to replicate the findings using more ethnically and geographically diverse populations.</p>
<p>The researchers plan to explore how emotional abilities factor into ideological development during college. Traits like delayed gratification might help capable students engage more consistently with challenging coursework. Additional data on specific college majors could also help clarify which academic environments most effectively reshape social perspectives.</p>
<p>The study, “<a href="https://doi.org/10.1177/01461672241273279" target="_blank">Is Progressive Ideology on the Test? Education and Intelligence in the Development of Nontraditional Attitudes</a>,” was authored by Joshua D. Isen, Steven G. Ludeke, Timothy F. Bainbridge, Matt K. McGue, and William G. Iacono.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/primary-dysmenorrhea-severe-menstrual-pain-is-associated-with-lower-cognitive-and-daily-functioning/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Primary dysmenorrhea: Severe menstrual pain is associated with lower cognitive and daily functioning</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 21st 2026, 08:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>New research published in the <em><a href="https://doi.org/10.1016/j.ejogrb.2026.114965" target="_blank">European Journal of Obstetrics & Gynecology and Reproductive Biology</a></em> suggests that severe menstrual pain impacts much more than physical comfort. The findings provide evidence that women experiencing painful periods also face challenges with attention, thinking speed, self-esteem, and their ability to perform daily tasks. This research indicates that menstrual pain shapes how people function in their everyday lives, including at school or work.</p>
<p>Primary dysmenorrhea is a medical term for severe, recurring menstrual cramps that are not caused by an underlying disease or pelvic abnormality. The condition is extremely common among young women. It frequently causes lower abdominal pain that radiates to the thighs and peaks during the first two days of menstruation.</p>
<p>Scientists believe this intense discomfort is driven by an overproduction of prostaglandins. These are hormone-like chemicals that trigger strong uterine contractions. High levels of these chemicals can cause additional symptoms like nausea and fatigue alongside the primary abdominal pain.</p>
<p>Scientists noted that existing research on this topic tends to be highly fragmented. Past studies usually focused on single aspects of menstrual pain, such as the severity of the cramps or the psychological distress it causes. Few scientific efforts had evaluated how physical pain, mental skills, and emotional well-being interact over the course of a full menstrual cycle.</p>
<p>The researchers wanted to fill this gap by tracking these different factors together across distinct biological phases. They aimed to understand how variations in the menstrual cycle affect occupational performance. In occupational therapy, occupational performance refers to a person’s ability and satisfaction in carrying out meaningful daily roles, like studying, working, or socializing.</p>
<p>“Many young women say things like ‘I just can’t focus,’ or ‘I don’t feel like myself’ during certain days of their cycle, especially when they have menstrual pain,” said study author Gokcen Akyurek, an associate professor at Hacettepe University. “Despite how common this is, these experiences are often minimized or seen as ‘just part of being a woman.’ We wanted to understand whether these changes are real and measurable, and how they affect everyday life, not just physically but cognitively and emotionally.”</p>
<p>To examine these factors, the scientists recruited 138 young women between the ages of 17 and 25 years. The sample included 79 women who experienced primary dysmenorrhea and 59 asymptomatic women who did not experience menstrual pain. Physicians evaluated the participants to ensure their pain was not caused by secondary issues like endometriosis.</p>
<p>The researchers evaluated the participants at three distinct points in their menstrual cycles. These points included the first three days of bleeding, the mid-follicular phase roughly a week after menstruation begins, and the mid-luteal phase about a week before the next period starts.</p>
<p>During each session, participants completed several standardized questionnaires. These surveys measured pain intensity, body awareness, self-esteem, and attitudes toward menstruation. Participants also rated their own occupational performance and satisfaction to gauge how well they felt they were managing their daily routines.</p>
<p>In addition to the surveys, the researchers administered cognitive tests to measure mental sharpness. One was a Stroop-type color-word task, a well-known psychological assessment that measures selective attention and the ability to control impulsive responses. This test requires participants to look at color words printed in conflicting ink colors and correctly name the ink color rather than read the written word.</p>
<p>They also used a paced auditory serial addition task to evaluate how fast participants could process new information. In this test, participants listen to a series of numbers and must quickly add each new number to the one they heard just before it. This measures working memory, which is the brain’s ability to hold and manipulate information over short periods.</p>
<p>The results showed consistent differences between the two groups of women. Those with severe menstrual pain reported a lower body mass index, which is a measure of body fat based on height and weight. Researchers note that having adequate body fat is essential for hormonal regulation, and imbalances can increase the chemicals responsible for menstrual pain.</p>
<p>Women in the pain group also expressed more negative beliefs about menstruation, often viewing it as a debilitating event. They demonstrated lower self-esteem compared to the women without menstrual pain. This reduction in self-esteem and daily functioning was present across all phases of the menstrual cycle, not just during bleeding.</p>
<p>This persistent decline suggests that the psychological and functional toll of severe menstrual cramps extends beyond the days of actual physical pain. Mental performance also fluctuated depending on the timing of the menstrual cycle. For women with primary dysmenorrhea, attention and information processing speed declined significantly during the luteal phase.</p>
<p>The luteal phase occurs just before menstruation and involves significant hormonal changes, such as elevated progesterone. These hormonal shifts might interact with the anticipation of pain to cause temporary cognitive fatigue. By contrast, the women without menstrual pain did not experience these same mental declines.</p>
<p>Both groups reported that their occupational performance and body awareness hit their lowest points during the actual days of menstruation. Body awareness is the ability to recognize and understand internal physical sensations. When women experience high pain, they might detach from their bodily signals as a coping mechanism.</p>
<p>The women with severe cramps consistently rated their ability to manage daily life much lower than the asymptomatic women did. Statistical analysis showed that menstrual pain and negative attitudes were strong predictors of lowered self-esteem and reduced functionality. These findings highlight that primary dysmenorrhea creates a complex web of physical and emotional challenges.</p>
<p>“What surprised us most was how consistent and widespread the impact was,” Akyurek told PsyPost. “Women with menstrual pain didn’t just report feeling worse; they actually performed worse on cognitive tasks and reported lower confidence and daily functioning, especially during certain phases of the cycle.”</p>
<p>“Our findings show that it can affect attention, thinking speed, confidence, and even how well someone performs daily tasks like studying or working. In other words, it’s not just discomfort; it can shape how people function in their daily lives.”</p>
<p>While this research provides a broad view of menstrual health, it does have some limitations to keep in mind. The study relied on participants self-reporting their menstrual phases rather than using blood tests to verify exact hormone levels. The scientists also did not control for outside factors like sleep quality, nutrition, or physical activity, which can all influence pain and mental sharpness.</p>
<p>The participants were largely university students, meaning the results might not fully apply to women in different age groups or educational backgrounds. Because the study compared groups at specific points in time, it cannot definitively prove that menstrual cramps directly cause these cognitive and emotional changes. Other unmeasured factors, such as underlying anxiety or stress, might play a role in how pain is experienced.</p>
<p>Future research will likely explore practical strategies to help individuals manage these symptoms. The scientists hope to develop targeted interventions that assist young women with painful periods.</p>
<p>“Our next goal is to move beyond understanding the problem and start developing solutions, especially practical strategies to help individuals manage these challenges in daily life, school, and work,” Akyurek said.</p>
<p>“One important message is that these experiences are real and measurable. When we start to recognize menstrual health as something that affects daily functioning, not just pain, we can create more supportive environments in education, workplaces, and healthcare.”</p>
<p>The study, “<a href="https://doi.org/10.1016/j.ejogrb.2026.114965" target="_blank">Neurocognitive function, psychosocial characteristics, and occupational performance across menstrual phases in young adults with and without primary dysmenorrhea</a>,” was authored by Aysenur Karakus, Semanur Inanc, and Gokcen Akyurek.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/neuroscientists-just-upended-our-understanding-of-pavlovian-learning/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Neuroscientists just upended our understanding of Pavlovian learning</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 21st 2026, 06:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>A recent study published in <em><a href="https://doi.org/10.1038/s41593-026-02206-2" target="_blank" rel="noopener">Nature Neuroscience</a></em> suggests that the brain learns to associate a specific signal with a reward based on the amount of time that passes between rewards, rather than the sheer number of repetitions. This challenges a century-old assumption about conditioning, providing evidence that total learning over a given period depends entirely on timing. These findings could shift our understanding of both animal and human learning.</p>
<p>For over a hundred years, scientists have generally accepted that associative learning operates through trial and error. Associative learning is the process by which a human or animal learns to link a specific signal with a specific outcome, like a dog learning that a bell means dinner is ready. The prevailing thought has been that more practice leads to better learning.</p>
<p>Scientists previously developed a mathematical model suggesting that animals learn by looking backward in time to identify the causes of meaningful effects. In this framework, the brain does not try to predict the future effects of a cue, but rather works backward from a reward to figure out what predicted it. While testing this idea, scientists noticed that animals learned proportionally faster when the time between rewards was extended.</p>
<p>“We had realized soon after publishing this paper that this model predicts that animals will learn cue-reward associations proportionally faster when the trials are spaced out, which should mean that over a fixed duration, total learning is independent of the number of experienced cue-reward pairings,” said study author <a href="https://profiles.ucsf.edu/vijaymohan.knamboodiri" target="_blank" rel="noopener">Vijay Mohan K. Namboodiri</a>, an associate professor at UC San Francisco.</p>
<p>This observation prompted researchers to test whether a strict mathematical rule governs the rate of learning. They aimed to determine if learning speeds up proportionally in relation to the time elapsed between cue and reward experiences. They designed a series of experiments to measure both physical behavior and brain chemistry in real time.</p>
<p>“We set out to test whether there is a rule governing learning rate control and whether learning rate scales proportionally with the time between cue-reward experiences,” Namboodiri explained.</p>
<p>The researchers conducted their study using 101 adult male and female mice. They classically conditioned thirsty mice by playing a brief auditory tone followed by the delivery of sugar-sweetened water. The mice were physically held in a fixed position, ensuring the testing conditions were controlled and uniform across all subjects.</p>
<p>As the mice learned the association, they would begin licking the water spout as soon as they heard the tone, anticipating the sugar water. To measure the underlying brain activity, the researchers used a technique called fiber photometry. They injected a special fluorescent sensor into the nucleus accumbens core, a brain region heavily involved in processing rewards.</p>
<p>This sensor lit up when the brain released dopamine, a chemical messenger strongly linked to pleasure, motivation, and learning. This allowed the scientists to monitor exactly when the brain processed the tone and the reward. The researchers divided the mice into different groups based on how much time passed between the trials. Some mice experienced the tone and reward every 60 seconds, while others waited 600 seconds between pairings.</p>
<p>The mice that waited 600 seconds learned the association in about one-tenth the number of trials compared to the mice on the 60-second schedule. This indicates a proportional relationship where the rate of learning per trial increases as the time between rewards increases. As a result, both groups of mice learned the association in the exact same amount of total conditioning time, despite one group experiencing far fewer total tone and reward pairings.</p>
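<p>The arithmetic behind this pattern can be illustrated with a toy model (a sketch for intuition, not the authors&#8217; published model) in which the per-trial learning rate is assumed to be proportional to the inter-reward interval:</p>
<pre>
# Toy simulation of the proportional-scaling idea: per-trial gain
# alpha is assumed proportional to the inter-reward interval (IRI).
def trials_to_criterion(iri_seconds, k=0.0005, criterion=0.9):
    """Trials until associative strength reaches criterion, where
    each pairing adds alpha * (1 - strength) and alpha = k * IRI."""
    alpha = k * iri_seconds
    strength, trials = 0.0, 0
    while criterion > strength:
        strength += alpha * (1.0 - strength)
        trials += 1
    return trials

for iri in (60, 600):  # the two schedules used in the study
    n = trials_to_criterion(iri)
    print(f"IRI {iri} s: {n} trials, about {n * iri} s of conditioning")
# Ten times the spacing needs roughly a tenth of the trials,
# so total conditioning time comes out nearly the same.
</pre>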
<p>“The main finding of the study, that learning rate (how much is learned from each experience) scales proportionally with the time between rewards was very surprising,” Namboodiri told PsyPost. “While it was a prediction made by our retrospective learning model mentioned above, we expected our initial experiments to falsify that prediction and necessitate an update to the model.”</p>
<p>The dopamine measurements provided evidence matching the behavioral observations. In the mice with longer gaps between rewards, the brain required proportionally fewer experiences before it started releasing dopamine in response to the tone alone. The dopamine response actually emerged a few trials before the mice started physically licking the spout in anticipation.</p>
<p>“Trial by trial, we tracked how dopamine responses to cues evolved during learning under the same timing manipulations we used behaviorally,” Namboodiri said. “We found that dopamine signals followed the same learning rule: the rate and magnitude of changes in dopamine cue responses depended on the average time between rewards, not on the raw number of cue–reward pairings. This parallel between behavior and dopamine activity shows that the brain’s reward system implements a time‑based learning rule, revealing a simple biological underpinning for how animals learn from rewards.”</p>
<p>To ensure their results were not caused by other factors, the scientists ran several control experiments. They tested whether the mice simply learned faster because they received fewer rewards per day, which might make the sugar water seem more novel.</p>
<p>The researchers also tested whether spending more time in the testing chamber without hearing tones played a role. Even when controlling for these variables, the proportional scaling rule remained consistent. The time between rewards consistently dictated the speed of learning per trial.</p>
<p>The scientists then tested aversive learning by pairing a tone with a mild foot shock in freely moving mice. They observed the same proportional scaling rule in this scenario. Mice with longer times between shocks learned to freeze in response to the tone in proportionally fewer trials.</p>
<p>In another variation, researchers tested partial reinforcement. They played the tone every 60 seconds but only gave the sugar water 10 percent to 50 percent of the time. Because the actual rewards were spaced further apart in time, the mice learned the underlying dopamine association in far fewer rewarded trials than mice receiving rewards every single time.</p>
<p>Traditional theories of learning assume the brain calculates a prediction error on a moment-by-moment basis. A prediction error is the difference between the reward an animal expects and the reward it actually receives. The researchers compared these older models against their newer framework that calculates associations by looking backward in time only when a reward is received.</p>
<p>When running computer simulations of these different theories, the traditional models failed to match the behavior of the mice. The traditional models could not explain why learning rates scaled proportionally with the time between rewards. The newer backward-looking model naturally predicted this exact proportional scaling, providing strong theoretical support for the experimental findings.</p>
<p>“A key takeaway from our study is that what really drives reward‑based learning is how much time passes between rewards, not how many cue–reward pairings an animal experiences,” Namboodiri summarized. “In simple terms, we found that when rewards are spaced farther apart in time, each individual reward leads to proportionally greater learning. Thus, if rewards occur ten times farther apart, each reward leads to roughly ten times more learning.”</p>
<p>“As a result, when you look over a fixed amount of time, the total amount of learning ends up the same despite vastly different numbers of cue-reward experiences (over a 20-fold range). This previously unknown learning rule suggests that the total number of experiences is not the key determinant of learning, which challenges some longstanding assumptions in neuroscience and reinforcement learning. The field had known that spreading pairings out in time speeds up learning per pairing, but it was still assumed that the final level of learning depended on the total number of pairings. Our experiments showed that, instead, total learning is determined by time, not count.”</p>
<p>Readers might easily confuse these findings with the well-known spacing effect. The spacing effect is a broad educational concept suggesting that taking breaks between study sessions yields better learning than cramming. The new research points to something much more specific than a general benefit of taking breaks.</p>
<p>“We would like to highlight that our results are not simply a restatement of the spacing effect or its biological underpinnings, but instead that we have identified a previously unknown rule of learning,” Namboodiri told PsyPost. “The spacing effect can be summarized in broad terms as ‘spacing out experiences = better learning,’ which implies that when experiences are closer together in time, there are diminishing returns on their contribution to learning.”</p>
<p>“However, our findings that learning rate scales proportionally with the time between rewards (rewards, specifically) require a fundamental shift from the above perspective, because it necessitates (as we show) that over a fixed amount of time, the number of cue-reward experiences has NO impact on overall learning.”</p>
<p>One potential limitation is that the researchers tested this specific rule primarily in simple conditioning setups using mice. They also noted that the proportional scaling rule tends to break down at extreme intervals, such as when mice waited an entire hour between rewards.</p>
<p>Future research will explore where exactly in the brain this time duration is calculated. Scientists also plan to investigate whether this rule applies to drug rewards, which could offer insights into addiction and habit formation. Because nicotine patches deliver a constant stream of nicotine, for example, they might disrupt the brain’s association between the act of smoking and the reward, blunting the urge to smoke.</p>
<p>Applying these timing principles to artificial intelligence systems might also help machines learn much faster from fewer pieces of data. Current systems learn slowly because they make tiny refinements after billions of interactions. A model borrowing from these new biological findings could potentially accelerate artificial learning.</p>
<p>The study, “<a href="https://doi.org/10.1038/s41593-026-02206-2" target="_blank" rel="noopener">Duration between rewards controls the rate of behavioral and dopaminergic learning</a>,” was authored by Dennis A. Burke, Annie Taylor, Huijeong Jeong, SeulAh Lee, Leo Zsembik, Brenda Wu, Joseph R. Floeder, Gautam A. Naik, Ritchie Chen, and Vijay Mohan K. Namboodiri.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/poor-sleep-quality-not-duration-linked-to-slower-daily-brain-function-in-older-adults/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Poor sleep quality, not duration, linked to slower daily brain function in older adults</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 20th 2026, 22:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>An analysis of the Einstein Aging Study data showed that older adults who experience longer nighttime wakefulness tend to have slower processing speed, worse working memory, and worse visual memory binding. On an individual level, participants’ processing speed was slower after nights with greater-than-usual nighttime awakenings. The research was published in <a href="https://doi.org/10.1016/j.sleh.2025.11.010"><em>Sleep Health: Journal of the National Sleep Foundation</em></a>.</p>
<p>Sleep is essential for physical health, cognitive functioning, and emotional regulation. During sleep, the body repairs tissues, strengthens the immune system, and regulates hormones involved in appetite, stress, and growth. Adequate sleep supports memory consolidation and learning by helping the brain process and organize information. Poor or insufficient sleep is linked to increased risk of cardiovascular disease, obesity, diabetes, depression, and impaired immune function.</p>
<p>Sleep quality is determined by examining how well and how continuously a person sleeps. Key indicators of good sleep quality include short sleep onset latency (falling asleep easily), low wake after sleep onset (minimal time awake during the night), and high sleep efficiency (most time in bed spent asleep). Feeling rested and alert during the day is another important subjective indicator of good sleep. On the other hand, frequent awakenings, long periods of nighttime wakefulness, and excessive daytime sleepiness suggest fragmented or poor-quality sleep.</p>
<p>Study author Orfeu M. Buxton and his colleagues wanted to examine the short-term associations between sleep and daily cognitive performance in older individuals without dementia. They hypothesized that people experiencing longer nighttime wakefulness, later sleep midpoints, more napping, and shorter sleep durations would tend to have worse overall cognitive performance. On an individual level, the authors hypothesized that a person’s cognitive performance would be worse than average on the days following nights when their sleep was more fragmented (i.e., more nighttime awakenings).</p>
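<p>Analyses like this typically separate the two levels by splitting each sleep measure into a person&#8217;s study-long average (between-person) and each night&#8217;s deviation from that average (within-person), then fitting a multilevel model. Here is a minimal sketch of that decomposition with hypothetical file and column names; the paper&#8217;s exact model may differ:</p>
<pre>
# Within- vs. between-person decomposition (illustrative sketch).
import pandas as pd
import statsmodels.formula.api as smf

# One row per person-night: id, waso_minutes (wake after sleep
# onset), speed (next-day processing-speed score). Hypothetical file.
df = pd.read_csv("daily_sleep_cognition.csv")

# Person-mean centering: stable trait level vs. nightly fluctuation.
df["waso_between"] = df.groupby("id")["waso_minutes"].transform("mean")
df["waso_within"] = df["waso_minutes"] - df["waso_between"]

# Random intercept per person. The two coefficients answer different
# questions: do high-WASO people score lower overall, and do nights
# with more WASO than usual precede lower scores the next day?
model = smf.mixedlm("speed ~ waso_between + waso_within",
                    df, groups=df["id"])
print(model.fit().summary())
</pre>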
<p>The study authors analyzed data from the Einstein Aging Study. This study recruited participants from Bronx County, NY, who were community-residing, English-speaking, 70 years of age or older, free of dementia, and without significant hearing or vision loss or severe psychiatric symptoms that might interfere with cognitive testing. They also had no current alcohol or substance abuse issues or treatment for cancer within the last 12 months.</p>
<p>This analysis used data from 261 participants. Their average age was 77.2 years, and 67% were women. Over 16 days, these individuals wore wrist actigraphs and completed cognitive assessments six times daily using smartphones provided by the study. An actigraph is a small wearable device, usually worn on the wrist, that measures movement over time to estimate sleep–wake patterns and overall rest–activity cycles.</p>
<p>The cognitive assessments consisted of four different tasks. They were designed to assess visuospatial working memory, processing speed, visual working memory (item-location binding), and visual working memory (intra-item feature binding). On average, the tasks took a total of 4–5 minutes to complete. Participants also completed a single night of home pulse oximetry (i.e., blood oxygen saturation and pulse rate measurements conducted during the night).</p>
<p>Results showed that participants who generally experienced longer nighttime wakefulness (wake after sleep onset) tended to have slower processing speed, worse working memory, and worse visual memory binding (a worse ability to integrate visual features into the memory of an object or scene). On an individual level, after nights with greater-than-usual nighttime wakefulness, participants tended to have below-average processing speed the next day.</p>
<p>Sleep duration, timing, and naps were not associated with performance on the cognitive tests.</p>
<p>“The results demonstrate short-term effects of sleep fragmentation (WASO) on processing speed the next day in dementia-free older adults. Better understanding short-term effects might identify individuals who may benefit from early interventions to prevent long-term cognitive decline,” the study authors concluded.</p>
<p>The study contributes to the scientific understanding of the consequences of poor sleep quality for cognitive performance. However, the participants in this study were all older adults. Results in other age groups might differ.</p>
<p>The paper, “<a href="https://doi.org/10.1016/j.sleh.2025.11.010">Within- and between-person associations of sleep characteristics with daily cognitive performance in a community-based sample of older adults,</a>” was authored by Orfeu M. Buxton, Qi Gao, Jonathan G. Hakun, Linying Ji, Alyssa A. Gamaldo, Suzanne M. Bertisch, Martin J. Sliwinski, Cuiling Wang, and Carol A. Derby.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/happier-people-live-longer-even-in-cultures-that-value-emotional-restraint/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Happier people live longer, even in cultures that value emotional restraint</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 20th 2026, 20:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Happier Japanese adults live longer, according to a new study published in <em><a href="https://doi.org/10.1037/hea0001571" target="_blank" rel="noopener">Health Psychology</a></em>, which found that people who described themselves as unhappy faced a significantly higher risk of death over a seven‑year period.</p>
<p>Happiness has long been linked to better health, but most of the evidence comes from Western countries. Researchers have questioned whether the same patterns hold in cultures where emotional expression is more restrained and where happiness may be defined differently. In Japan, for example, happiness is often associated with calmness and social harmony rather than excitement or personal achievement. Understanding whether happiness predicts longevity in such contexts helps clarify whether the link is universal or culturally specific.</p>
<p>The research team, led by Akitomo Yasunaga of Aomori University of Health and Welfare, set out to determine whether happiness truly protects health—or whether the association disappears once factors like age, income, education, and physical health are taken into account. Previous studies have raised the possibility that unhappy people may simply be unhealthier to begin with, falsely suggesting that unhappiness shortens life when poor health is the real cause.</p>
<p>To investigate this, the researchers followed 3,187 adults (aged 20 and older) living in Minami‑Izu, a rural town in Japan, from 2016 to 2023. At the start of the study, participants answered a simple question: “How happy do you think of yourself at present?”</p>
<p>Participants originally answered on a four-point scale, but because very few people reported negative emotions, the researchers merged the bottom two categories. This placed participants into one of three final groups: happy (31.5%), somewhat happy (60.8%), or unhappy (7.7%). The team also collected information on education, marital status, economic situation, body mass index, and physical functioning. Over the next seven years, deaths were tracked using official city records.</p>
<p>By the end of the study, 277 participants had died. The researchers found a clear pattern: people who reported being unhappy at the beginning of the study were significantly more likely to die during the follow‑up period. Even after adjusting for age, sex, socioeconomic status, and health measures, the unhappy group had an 85 percent higher risk of death compared with those who said they were happy.</p>
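<p>Risk statements like this typically come from a proportional hazards model, where an 85 percent higher risk corresponds to a hazard ratio of about 1.85. A minimal sketch of that kind of analysis follows, with hypothetical column names; the paper&#8217;s exact model and covariates are assumptions here:</p>
<pre>
# Sketch of a Cox proportional hazards analysis (illustrative only).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical: one row per participant
# columns: years_followed, died (1 = died during follow-up),
#          unhappy (1 = reported being unhappy), age, sex, ...
cph = CoxPHFitter()
cph.fit(df[["years_followed", "died", "unhappy", "age", "sex"]],
        duration_col="years_followed", event_col="died")
cph.print_summary()  # exp(coef) for "unhappy" is the adjusted hazard ratio
</pre>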
<p>The findings remained consistent even when the researchers excluded participants who died within the first year, reducing the likelihood that pre‑existing terminal illnesses explained the results.</p>
<p>Yasunaga and his team concluded, “the consistency of our findings with the international literature suggests that, despite potential cultural nuances in how happiness is experienced or expressed, its protective association with mortality may reflect a more universal phenomenon.”</p>
<p>Still, the authors caution that the study has limitations. Happiness was measured with a single question, which cannot capture the full complexity of emotional well‑being. Additionally, health status was assessed using self‑reported measures, which may be less accurate than clinical evaluations. Crucially, the study did not control for lifestyle habits—such as smoking, alcohol intake, diet, and physical activity—which could influence both a person’s happiness and their risk of mortality.</p>
<p>The study, “<a href="https://doi.org/10.1037/hea0001571" target="_blank" rel="noopener">Association of State Happiness With Mortality: Evidence From a Prospective Cohort Study in Japan</a>,” was authored by Akitomo Yasunaga, Ai Shibata, Yoshino Hosokawa, Mohammad Javad Koohsari, Rina Miyawaki, Kuniko Araki, Kaori Ishii, and Koichiro Oka.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/why-a-widely-disliked-personality-trait-might-actually-protect-your-mental-health/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">Why a widely disliked personality trait might actually protect your mental health</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 20th 2026, 18:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p>Narcissism is often viewed purely as a toxic personality trait, but it actually contains different elements that can either protect or harm a person’s mental well-being. A recent review of hundreds of previous studies found that while certain insecure forms of narcissism are linked to anxiety and depression, the more confident and outgoing forms are associated with higher self-esteem and life satisfaction. The research, published in the <em><a href="https://doi.org/10.1111/jopy.70044" target="_blank" rel="noopener">Journal of Personality</a></em>, helps clarify how different types of self-centered traits impact psychological health.</p>
<p>Rongxia Hou, a psychology researcher at Hunan Normal University in China, led the investigation. The research team included colleagues from Hunan Normal University, the University of Georgia, and Purdue University. Previous investigations into narcissism and psychological health have produced confusing results. Some papers suggested the trait gave people mental toughness and life satisfaction, while other papers linked it to deep psychological distress, loneliness, and depression.</p>
<p>Hou and the team wanted to clear up this confusion. They suspected that previous broad generalizations might be hiding the true relationships between specific personality traits and distinct mental health outcomes. To separate these concepts, they relied on a dual-factor model of mental health. In the past, mental health was largely defined as the simple absence of psychiatric illness. Today, psychology researchers view mental well-being and psychological distress as related but separate dimensions, meaning that improving one aspect does not automatically reduce the other.</p>
<p>Positive mental health includes positive emotions, life satisfaction, and high self-esteem. Negative mental health involves inward-facing struggles. These are often called internalizing problems, which refer to emotional distress that people direct at themselves, such as anxiety, stress, and depression.</p>
<p>Narcissism is similarly divided into distinct categories. The most common framework splits the personality trait into grandiose and vulnerable dimensions. Grandiose narcissism involves outgoing, confident, showy, and sometimes aggressive behavior. People with this trait often believe they are inherently superior to others.</p>
<p>Vulnerable narcissism involves a very different presentation. This version of the trait is marked by deep insecurity, defensiveness, and a tendency to withdraw from social situations. Both versions share a core foundation of entitlement and self-absorption.</p>
<p>To figure out how these personality traits interact with mental health, Hou and the team performed a meta-analysis. A meta-analysis is a statistical technique that combines data from many independent studies to look for large-scale patterns. By pooling data, researchers can identify broad trends that might be hidden in smaller individual studies.</p>
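<p>The core move in a meta-analysis is weighting each study&#8217;s effect size by its precision before averaging. The minimal sketch below shows inverse-variance pooling of correlations with made-up numbers (a simple fixed-effect version for brevity; the study itself fit a more elaborate three-level model):</p>
<pre>
# Inverse-variance pooling of correlations (illustrative numbers,
# not data from the actual meta-analysis).
import numpy as np

r = np.array([0.21, 0.35, 0.10, 0.28])   # per-study correlations
n = np.array([120, 300, 85, 210])        # per-study sample sizes

z = np.arctanh(r)      # Fisher z-transform stabilizes the variance
var = 1.0 / (n - 3)    # sampling variance of each z
w = 1.0 / var          # weight = precision
z_pooled = (w * z).sum() / w.sum()
print(f"pooled r = {np.tanh(z_pooled):.3f}")  # back-transform to r
</pre>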
<p>The researchers gathered 229 published and unpublished studies spanning nearly four decades of research. This combined dataset included information from more than 185,000 participants. The participants ranged in age from young children to adults in their fifties.</p>
<p>The team separated the collected data based on the type of narcissism and the specific type of mental health outcome. They then used statistical models to estimate the overall associations between the personality traits and mental health indicators. They also checked if variables like age, national culture, or testing methods altered the results. To assess cultural impacts, they assigned each sample a national individualism score, which measures whether a society prioritizes personal success over group harmony.</p>
<p>The researchers found that grandiose narcissism was linked to better positive mental health. People scoring high in grandiose narcissism reported greater life satisfaction, more positive emotions, and higher self-esteem. They also exhibited greater personal resilience when dealing with stress.</p>
<p>When looking at inward-facing struggles like anxiety or depression, grandiose narcissism had no clear effect. The results were not statistically significant for most negative mental health categories. The only negative outcome firmly linked to grandiose narcissism was a higher rate of compulsive social media use. This specific outcome likely stems from a desire for social recognition and public self-presentation.</p>
<p>Vulnerable narcissism showed the exact opposite pattern. It was linked to lower levels of positive mental health across the board. It also had a strong connection to higher rates of depression, anxiety, loneliness, and stress.</p>
<p>To understand why grandiose narcissism seemed to offer psychological protection, the team split grandiose narcissism into two distinct behaviors known as admiration and rivalry. Admiration involves seeking praise through charm, striving for uniqueness, and showcasing success. Rivalry involves protecting the ego by putting others down, displaying hostility, and viewing other people as competitors.</p>
<p>The desire for admiration acted as an emotional shield, predicting higher happiness and lower distress. Rivalry was linked to lower positive mental health and higher negative distress.</p>
<p>The researchers also applied a modern three-factor model of narcissism to explain their results. This framework breaks narcissism into three components, which are agentic extraversion, antagonism, and neuroticism. Agentic extraversion refers to a person’s assertiveness, social boldness, and desire for leadership.</p>
<p>Antagonism captures hostility, entitlement, and a tendency to manipulate others. Neuroticism refers to emotional instability, hypersensitivity to rejection, and chronically low self-worth.</p>
<p>The team found that agentic extraversion was the primary source of healthy outcomes. This outgoing assertiveness supports psychological resilience and subjective well-being. Antagonism and neuroticism were the primary drivers of unhealthy outcomes.</p>
<p>Because grandiose narcissism is heavily defined by agentic extraversion, it often leads to positive mental health outcomes. Vulnerable narcissism is heavily defined by neuroticism and antagonism. The combination of intense emotional instability and social hostility explains why vulnerable narcissism consistently predicts poorer psychological outcomes. Highly vulnerable individuals tend to frequently recall unpleasant past events, fixate on emotional pain, and withdraw socially, which strips them of community support.</p>
<p>The study also revealed that age changed some of these patterns. The link between vulnerable narcissism and negative mental health grew stronger in older populations. The researchers suggest that as people with vulnerable narcissism age, their hypersensitivity and social insecurities might lead to repeated interpersonal failures.</p>
<p>These repeated relationship issues likely create an accumulating burden of anxiety and depression. A person’s environment and the type of survey they took also influenced the outcome. Global assessments of personality traits captured the broader adaptive components of narcissism, while highly specific surveys sometimes isolated hostile features. Cultural individualism did not alter the associations in a meaningful way.</p>
<p>The study does have some limitations. Most of the analyzed data came from self-reported surveys. This means the results rely on participants accurately assessing their own personalities and mental states. Self-perception is not always reliable, especially for people with highly self-centered traits.</p>
<p>The participants in the original studies were mostly from convenience samples. These included accessible groups like university students or online survey takers. This narrow sampling might limit how well the results apply to the general public across different demographics.</p>
<p>The investigation focused heavily on inward-facing mental health issues. It completely excluded outward-facing problems, which researchers call externalizing psychopathology. These outward problems include physical aggression, rule-breaking, and reckless behavior. Previous studies have shown that grandiose narcissism is strongly connected to these outward issues.</p>
<p>Future research should include a wider variety of mental health outcomes to capture a complete picture of psychological functioning. The authors suggest that coding outcomes across all domains of human behavior will clarify how these traits affect society. They also recommend using alternative testing methods, such as observing actual behavior or measuring physiological stress responses.</p>
<p>The study, “<a href="https://doi.org/10.1111/jopy.70044" target="_blank" rel="noopener">Weapon or Armor? Unpacking the Paradox of Narcissism and Self-Reported Mental Health Through a Three-Level Meta-Analysis</a>,” was authored by Rongxia Hou, Shuqin Li, Joshua D. Miller, Donald R. Lynam, and Yanhui Xiang.</p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-research-reveals-why-storytelling-works-better-than-bullet-points-in-online-dating/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New research reveals why storytelling works better than bullet points in online dating</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 20th 2026, 16:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><strong>You’ve been on the dating market for months, wondering how to catch the right partner’s attention. New research shows that how you tell your story may matter as much as what you say about yourself. If you feel like you’re doing everything “right” and still not getting traction, this might be why.</strong></p>
<p><em>“6’1. Outdoorsy. Ambitious. Sarcastic. Dog dad. Gym-ish. Traveler. Foodie. Fluent in banter. Good vibes only.”</em></p>
<p>When I was dating and scrolling through profiles, adjective lists like this were an instant turn-off. They felt strangely cold, like I was reading a car-for-sale listing instead of meeting a person. At the time, it was just a gut reaction. It took me several years (and three studies)<sup>1</sup> to understand why that “selling” strategy backfires and what works better instead.</p>
<p>Ironically, even marketing companies, including those selling cars, have long understood the limitations of listing features. They know that telling a story around their brand or product increases consumers’ curiosity, engagement, and emotional connection. You don’t just buy a speedy and safe car; you buy a legacy that protects your family on every journey.</p>
<p><strong>Why bullet points kill attraction</strong></p>
<p>From cradle to grave, we are enchanted by good stories. Think about the last time someone told you a really good one: you leaned in. For a moment, you forgot yourself. You were suddenly “there,” in their world, not yours. In that immersive state, you don’t just <em>understand</em> what’s happening, you <em>feel</em> it. You start caring about the characters, worrying about their fate, and wanting to know what happens next.<sup>2</sup> That “leaning in” feeling is exactly what bullet points rarely create.</p>
<p>Beyond being entertaining, stories also shape judgments.<sup>3</sup> When we identify with a character and feel part of the story’s world, we process information in a more experiential (rather than analytical) way. And that makes us far more receptive to what we’re hearing compared to a sequence of facts or bullet points alone.<sup>4</sup> Stories don’t just inform us; they make us <em>feel</em> something. And feelings drive decisions.</p>
<p>It therefore won’t surprise you to learn that storytelling has been adopted as a marketing tool. Research has backed these insights, showing that with the help of intriguing stories, brands can set themselves apart from others. In this way, they can easily capture potential customers’ attention and spark desire for the brand or its products, ultimately increasing the likelihood that people choose them over the alternatives.<sup>2</sup></p>
<p><strong>We wanted to see if what works in car markets works in the dating kingdom. </strong></p>
<p>So we asked a simple question: if stories sell cars, can they also make someone want to meet you?</p>
<p>In three studies<sup>1</sup>, we showed single participants dating profiles that presented either narrative or non-narrative self-presentations of a potential partner. After viewing the profile, participants reported their empathy for and romantic interest in the profiled individual (the potential partner).</p>
<p><strong>Study 1: Same facts, different <em>format</em></strong></p>
<p>In the first study, the narrative and non-narrative self-presentations contained identical basic information. The key difference wasn’t <em>what</em> the potential partner revealed; it was <em>how</em> they revealed it.</p>
<p><strong>Profile A: Non-narrative (just the facts) </strong></p>
<p>“Dan. I come from a world of art. I learned to play guitar from a young age. My first guitar was a gift from my grandfather. Even today, I play quite a lot. It’s my main hobby. After graduating from high school, I traveled to South America with friends. This trip was long and unforgettable. I tried many different types of sports, extreme activities, and also experienced various foods. At different points in my life, especially during the trip, I met many people with whom I had some meaningful conversations. As I was preparing to return from South America, I enrolled in studies for a bachelor’s degree in economics at the university. Today I am an economics student. At the same time, I work in the high-tech industry. In my free time, I run and hike with my friends and my family. At the end of the day, I drink a glass of wine and play the guitar. I would love to get to know you.”</p>
<p>However, in <em>the narrative condition</em>, this information was structured as a story, with a plot in which the potential partner experienced causally connected events over time.</p>
<p><strong>Profile B: Narrative (a life in motion)</strong></p>
<p>“Hi there, I’m Dan. For as long as I can remember, I’ve been breathing art. My grandfather, may his memory be blessed, believed that music connects people, and at age 7 he gave me a guitar that became an inseparable part of me. After graduating from high school, I flew with my friends on a trip to South America. I remember the incredible landscapes, the amazing local food, and the extreme activities. I must tell you, the deep conversations with the people I met there taught me to appreciate what’s important in life. Toward the end of the trip, I had to decide whether to stay in art or go in a new direction. I won’t leave you in suspense. Today, I’m an economics student working in high-tech. But don’t worry, the guitar is still part of me. If I’m not studying or working, you’ll find me hiking with friends and family, running, or playing music. But mostly, I love ending the day with a glass of wine and a guitar on the balcony. If you find yourself nodding with a smile, I’d love to get to know you.”</p>
<p><strong>Study 2: Photos can tell a story too</strong></p>
<p>In the second study, we used pictorial self-presentations, as pictures have a stronger impact on impression formation in online dating profiles compared with text.<sup>5</sup> Participants viewed a potential partner’s profile containing five photos that we varied in how much they “told a story.”</p>
<p>In the narrative condition, the photos depicted the potential partner across varied situations that unfold like a typical day, from morning through evening. Specifically, the photos captured early activities like exercising or studying, progressed through daily routines, such as cooking, and culminated in evening engagements like socializing or family interactions. Seen together, this chronological presentation of distinct facets of the potential partner’s life and relationships created a coherent “slice of life” that provided a sense of who this person is and what being with them might feel like.</p>
<p>In the non-narrative condition, photos showed the same potential partner in neutral settings, such as in a park or on a street, without that connective thread. </p>
<p><strong>Study 3: Real-life profiles</strong></p>
<p>In the third study, we examined whether a combined self-presentation of written and visual cues, which closely resembles real dating platforms, produces the most potent effects on empathy and interest toward the potential partner. Participants here viewed one of four profiles that included either (a) a narrative or non-narrative written self-presentation and (b) five self-descriptive photos that were either narratively interconnected or not.</p>
<p><strong>What did we find? </strong></p>
<p>Narrative self-presentations in online dating profiles intensified empathy for the profiled individual. This heightened empathy, in turn, predicted greater romantic interest in the potential partner.</p>
<p><strong>Why narratives work</strong></p>
<p>By humanizing profiles and encouraging genuine emotional engagement, storytelling actively counters the objectifying nature of online dating platforms. Such self-presentation motivates date seekers to view the profiled individuals as fellow human beings rather than mere commodities. In this way, storytelling creates a rewarding emotional experience that transcends detached evaluation or simple fact-gathering, paving the way for more meaningful initial interactions. Overall, narrative presentations foster a sense of connection in the otherwise detached medium of online dating. And they do so even before a single message is sent.</p>
<p><strong>Try this quick swap: Turn labels into lived examples</strong></p>
<p>Instead of: “Funny.”</p>
<p>Try: “I laugh at my own jokes first. It’s part of my charm.”</p>
<p>Instead of: “Outdoorsy.”</p>
<p>Try: “Most weekends I disappear into a trail, come back sunburned, and swear I’ll bring more water next time.”</p>
<p>Instead of: “Independent.”</p>
<p>Try: “I love my alone time. I also love choosing to share it with someone.”</p>
<p>Look at your last text or bio. Did you write a list, or did you tell a scene? Try changing one bullet point to a story today and see how it feels.</p>
<p><strong>The takeaway</strong></p>
<p>We are fascinated by stories, yet we keep writing our dating profiles like shopping lists. Eventually, it’s not the height or the ambition that makes someone fall for you. It’s your entire story. And the right person can’t feel that from bullet points.</p>
<p><strong>References:</strong></p>
<ol>
<li>Birnbaum, G. E., & Zholtack, K. (2026). Once upon a swipe: The impact of storytelling on dating profile appeal. <em>Psychology of Popular Media</em>. <a href="https://doi.org/10.1037/ppm0000661">https://doi.org/10.1037/ppm0000661</a></li>
<li>Green, M. C., & Brock, T. C. (2000). The role of transportation in the persuasiveness of public narratives. <em>Journal of Personality and Social Psychology, 79</em>(5), 701–721. <a href="https://doi.org/10.1037/0022-3514.79.5.701">https://doi.org/10.1037/0022-3514.79.5.701</a></li>
<li>Junior, J. R. D. O., Limongi, R., Lim, W. M., Eastman, J. K., & Kumar, S. (2023). A story to sell: The influence of storytelling on consumers’ purchasing behavior. <em>Psychology & Marketing, 40</em>(2), 239–261. <a href="https://doi.org/10.1002/mar.21758">https://doi.org/10.1002/mar.21758</a></li>
<li>Van Laer, T., Feiereisen, S., & Visconti, L. M. (2019). Storytelling in the digital era: A meta-analysis of relevant moderators of the narrative transportation effect. <em>Journal of Business Research, 96</em>, 135–146. <a href="https://doi.org/10.1016/j.jbusres.2018.10.053">https://doi.org/10.1016/j.jbusres.2018.10.053</a></li>
<li>Van der Zanden, T., Mos, M. B., Schouten, A. P., & Krahmer, E. J. (2022). What people look at in multimodal online dating profiles: How pictorial and textual cues affect impression formation. <em>Communication Research, 49</em>(6), 863–890. <a href="https://doi.org/10.1177/0093650221995316">https://doi.org/10.1177/0093650221995316</a></li>
</ol>
<div class="jeg_video_container jeg_video_content"></div>
<p> </p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/news-chatbots-that-present-multiple-viewpoints-tend-to-earn-the-trust-of-conspiracy-believers/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">News chatbots that present multiple viewpoints tend to earn the trust of conspiracy believers</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 20th 2026, 14:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>A recent study published in the journal <em><a href="https://doi.org/10.1016/j.chb.2026.108920" target="_blank" rel="noopener">Computers in Human Behavior</a></em> suggests that automated news chatbots programmed to deliver balanced viewpoints can earn the trust of people with varying ideological backgrounds. The research provides evidence that individuals who hold strong conspiracy beliefs tend to respond well to these chatbots, viewing them as useful tools for reading diverse news. These findings point to new ways technology might help pierce information bubbles and reduce societal division by exposing people to multiple perspectives.</p>
<p>In recent years, generative artificial intelligence has transformed how people interact with information online. Generative artificial intelligence refers to computer systems that can process massive amounts of text and generate human-like responses. News chatbots rely on similar technology to act as automated conversational agents. These programs allow users to browse topics, providing real-time text summaries of news articles in a chat window.</p>
<p>The authors behind the new study wanted to see if these chatbots could help solve a growing problem in the modern media landscape. People often engage in selective exposure, meaning they only click on news that matches their existing beliefs. Over time, this habit creates an echo chamber, which tends to increase political and social polarization.</p>
<p>When people are only exposed to one side of a story, they often become defensive or dismissive of alternative viewpoints. The scientists wanted to know if a neutral, automated chatbot could encourage people to step outside their comfort zones. They suspected that people might view a machine as more objective than a human journalist.</p>
<p>“People who believe in conspiracy theories tend to distrust mainstream media, seeing it as biased or agenda-driven,” said study author <a href="https://www.linkedin.com/in/sdubey03/" target="_blank" rel="noopener">Shreya Dubey</a> (<a href="https://www.threads.com/@sdubey03" target="_blank" rel="noopener">@sdubey03</a>), a postdoctoral researcher in the Amsterdam School of Communication Research at the University of Amsterdam.</p>
<p>“We wanted to test whether a chatbot, which may be perceived as more neutral than a traditional news outlet, might be better received by this group. We designed a chatbot that presented both mainstream and alternative news articles, then looked at whether conspiracy believers were more willing to trust and use it compared to people who don’t hold such beliefs.”</p>
<p>Specifically, the scientists developed a custom chatbot named Infobot. This program was designed to present users with eight different news headlines about climate change.</p>
<p>Four of the headlines represented mainstream scientific perspectives supporting climate action. The other four headlines represented alternative viewpoints, including arguments against climate action and narratives that framed climate change as a hoax. Users could scroll through the headlines and click on any article to read a brief summary generated by the chatbot.</p>
<p>After a user read a summary, that article disappeared, prompting them to choose another. The software tracked which articles users selected and how much time they spent reading them.</p>
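<p>As a rough illustration only, the interaction pattern described above (show headlines, reveal one summary at a time, remove each article once read, and log choices and reading time) could be sketched in Python as follows. This is not the authors' software; the headline labels, prompts, and function names are invented for illustration.</p>
<pre>
import time

# Hypothetical sketch (not the study's actual software). It mimics the loop
# described in the article: list headlines, open one chatbot summary at a
# time, remove each article once read, and record choices and reading time.
HEADLINES = ["Mainstream 1", "Mainstream 2", "Mainstream 3", "Mainstream 4",
             "Alternative 1", "Alternative 2", "Alternative 3", "Alternative 4"]
SUMMARIES = {h: f"(chatbot-generated summary of '{h}')" for h in HEADLINES}

def run_session() -> list[dict]:
    """Run one reading session and return a per-article reading log."""
    remaining = list(HEADLINES)
    log = []
    while remaining:
        for i, headline in enumerate(remaining):
            print(f"{i}: {headline}")
        choice = input("Headline number to read (or 'q' to finish): ").strip()
        if choice.lower() == "q":
            break
        picked = remaining.pop(int(choice))      # article disappears once read
        start = time.monotonic()
        print(SUMMARIES[picked])
        input("Press Enter when you have finished reading...")
        log.append({"headline": picked,
                    "seconds_read": round(time.monotonic() - start, 1)})
    return log

if __name__ == "__main__":
    for entry in run_session():
        print(entry)
</pre>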
<p>In the first study, the scientists recruited a sample of 177 adult residents of the United States. They split these participants into two groups based on their responses to a questionnaire about general conspiracy theories. The final sample included 93 individuals with low generic conspiracy beliefs and 84 individuals with high generic conspiracy beliefs. Participants were instructed to interact with Infobot and read at least four article summaries.</p>
<p>Afterward, they answered survey questions rating the chatbot on its ease of use, perceived usefulness, and potential risks. They also rated their overall trust in the program, their general attitude toward it, and their intention to use such a tool in the future. The data showed that participants who found the chatbot useful and trustworthy tended to have a positive attitude toward it.</p>
<p>This positive attitude directly predicted their intention to use news chatbots again. Unexpectedly, the scientists found that people with high generic conspiracy beliefs trusted the chatbot more than those with low conspiracy beliefs. The high-belief group also reported a more positive attitude and a greater intention to use the program in the future.</p>
<p>Both groups read a similar number of mainstream and alternative articles. However, the software revealed that individuals with higher conspiracy beliefs spent significantly less time actually reading the mainstream summaries compared to the alternative ones.</p>
<p>The researchers noticed a potential flaw in their first study: they had grouped people based on their belief in general conspiracies, rather than their specific beliefs about climate change. In fact, the two groups did not significantly differ in their actual belief in human-caused global warming. To fix this, the scientists conducted a second study.</p>
<p>For the second study, the researchers recruited 58 participants. This time, they specifically screened for beliefs about climate change. The sample included 35 individuals with low climate change conspiracy beliefs and 23 individuals with high climate change conspiracy beliefs.</p>
<p>The procedure was nearly identical to the first experiment. However, participants had to enter a special code from the chatbot to prove they had paid attention to the summaries. The second study replicated the findings of the first.</p>
<p>Once again, trust and perceived usefulness predicted a positive attitude toward the chatbot. Participants with high climate change conspiracy beliefs trusted the chatbot more and showed a greater intention to use it than those with low conspiracy beliefs. The scientists noted that both groups generally responded positively to the program, but the high-belief group was consistently more enthusiastic.</p>
<p>The researchers suspect this happens because individuals with strong conspiracy beliefs often feel that mainstream media is biased against them. Because the chatbot presented their alternative views on equal footing with mainstream science, they likely viewed the machine as a fair and unbiased source of information.</p>
<p>“Most of us, regardless of our beliefs, tend to think we’ve formed our opinions objectively and from good information,” Dubey told PsyPost. “Our findings suggest that a chatbot presenting multiple perspectives feels refreshingly balanced to people across the board, including those who distrust mainstream media.”</p>
<p>“But this raises an uncomfortable question: is balance always desirable? Climate change is not genuinely contested among scientists, yet our chatbot presented mainstream and alternative views side by side. While this approach made the tool widely accepted, it also risks creating a false equivalence. That is, giving fringe or misleading viewpoints the same weight as scientific consensus. The very feature that made our chatbot appealing could, if applied carelessly, end up legitimising misinformation.”</p>
<p>“So the real takeaway is a tension worth sitting with: tools that feel balanced and neutral may be our best shot at reaching people across ideological divides, but ‘balance’ on issues like climate change is not a neutral act in itself,” Dubey said.</p>
<p>While the findings offer hope for reducing polarization, the researchers noted several limitations. First, the studies only compared people at the extreme ends of the conspiracy belief spectrum. Individuals with moderate beliefs were excluded from the main analysis, which means the results might not represent the entire population. Second, the participants only interacted with the chatbot one time in a controlled survey environment.</p>
<p>It is unclear if their positive attitudes would persist after repeated use over weeks or months. It also remains to be seen if people would voluntarily choose to use a balanced news chatbot in the real world when they have access to highly personalized social media feeds.</p>
<p>Future research should investigate exactly which features of the chatbot make it appealing to different groups. Scientists could also explore whether giving users some control over the ratio of mainstream to alternative news might increase their willingness to engage with opposing viewpoints.</p>
<p>The study, “<a href="https://doi.org/10.1016/j.chb.2026.108920" target="_blank" rel="noopener">Investigating perceived trust and utility of balanced news chatbots among individuals with varying conspiracy beliefs</a>,” was authored by Shreya Dubey, Paul E. Ketelaar, Tilman Dingler, Hannah K. Peetz, and Hein T. van Schie.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<table style="font:13px Helvetica, sans-serif; border-radius:4px; -moz-border-radius:4px; -webkit-border-radius:4px; background-color:#fff; padding:8px; margin-bottom:6px; border:1px solid #adadad;" width="100%">
<tbody>
<tr>
<td><a href="https://www.psypost.org/new-study-finds-link-between-receptivity-to-corporate-bullshit-and-weaker-leadership-skills/" style="font-family:Helvetica, sans-serif; letter-spacing:-1px;margin:0;padding:0 0 2px;font-weight: bold;font-size: 19px;line-height: 20px;color:#222;">New study finds link between receptivity to “corporate bullshit” and weaker leadership skills</a>
<div style="font-family:Helvetica, sans-serif; text-align:left;color:#999;font-size:11px;font-weight:bold;line-height:15px;">Mar 20th 2026, 12:00</div>
<div style="font-family:Helvetica, sans-serif; color:#494949;text-align:justify;font-size:13px;">
<p><p>People who are more impressed by buzzword-heavy “corporate speak” tend to perform worse on measures of workplace leadership and decision-making, according to a new study published in <a href="https://doi.org/10.1016/j.paid.2026.113699"><em>Personality & Individual Differences</em></a>.</p>
<p>Many workplaces rely heavily on jargon-filled communication, with phrases such as “growth hacking” or “drilling down one more click.” Although such language may sound sophisticated, researchers have increasingly questioned whether it actually improves communication or instead obscures meaning.</p>
<p>Researchers studying “bullshit receptivity” define it as the tendency to evaluate vague or misleading statements as profound, insightful, or informative even when they contain little substance. Previous research has linked receptivity to various kinds of misleading or pseudo-profound language with weaker analytic thinking and poorer reasoning.</p>
<p><a href="https://shanelittrell.com/">Shane Littrell</a>, a postdoctoral research associate at Cornell University, set out to examine this phenomenon specifically in corporate environments. Littrell explained that the idea for the research grew from his own professional experience: “I used to work in a corporate environment and hated it so much that I eventually switched careers.”</p>
<p>As he described it, “One of the more frustrating aspects was the confusing way the ‘higher ups’ would talk to everyone.” For example, he recalled that “one of my bosses loved to use words like ‘synergizing,’ ‘derivation,’ and ‘optimal flow-through’ in ways that didn’t make coherent sense. He also loved to say that we would ‘download on’ a topic rather than ‘discuss’ or ‘talk about’ it.”</p>
<p>“It was so aggravating because I couldn’t understand why he didn’t just talk like a normal person,” Littrell said.</p>
<p>“The way executives often spoke probably sounded impressive (at least to them) but made actual communication much more difficult for everyone else.” Because so many people encounter this type of messaging in organizations, he argued that the topic deserves systematic scientific investigation. As he put it, “hundreds of millions of people have to deal with these types of organizations every day because they either work for them or consume their products or both.”</p>
<p>The research included four studies with a combined sample of 1,018 working adults from the United States and Canada, recruited through online research platforms. Participants evaluated a series of corporate-style statements. Some of these statements were genuine quotes from business leaders, while others were generated using an algorithm that assembled corporate buzzwords into sentences that sounded plausible but were essentially meaningless. Participants rated how much “business savvy” was expressed by each statement on a 5-point scale ranging from “No business savvy at all” to “A great deal of business savvy.”</p>
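<p>As a rough illustration of how such plausible-but-meaningless statements can be machine-assembled, a template-based generator might look like the following sketch. The paper's actual generator is not reproduced here; the word lists and sentence structure below are invented for illustration.</p>
<pre>
import random

# Hypothetical illustration (not the scale's published stimuli): assembling
# corporate buzzwords into grammatical but essentially meaningless statements.
SUBJECTS = ["Our core competencies", "Best-in-class synergies",
            "Cross-functional deliverables", "Scalable value propositions"]
VERBS = ["will empower", "proactively leverage", "seamlessly operationalize",
         "help incentivize"]
OBJECTS = ["blue-sky thinking", "mission-critical paradigms",
           "customer-differentiated growth", "end-to-end alignment"]

def corporate_statement() -> str:
    """Return a randomly assembled, plausible-sounding corporate statement."""
    return f"{random.choice(SUBJECTS)} {random.choice(VERBS)} {random.choice(OBJECTS)}."

if __name__ == "__main__":
    for _ in range(3):
        print(corporate_statement())
</pre>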
<p>Responses from the first study were used to create the Corporate Bullshit Receptivity Scale (CBSR), a new measure designed to capture how strongly people perceive jargon-heavy corporate statements as “business savvy.” In later studies, Littrell validated the scale by comparing it with a variety of other measures.</p>
<p>Participants completed tests of analytic thinking and reasoning, including the Cognitive Reflection Test and measures of fluid intelligence, as well as scales assessing open-minded thinking and receptivity to other forms of misleading language. They also completed measures relevant to workplace functioning, such as ratings of leadership qualities, job satisfaction, trust in supervisors, responses to corporate mission statements, and situational judgment tasks designed to measure decision-making in workplace scenarios.</p>
<p>Littrell emphasized that receptivity to misleading language can depend heavily on context. As he explained, “One of the most important issues that this work highlights is that ‘bullshit receptivity’ is highly contextual, so it’s inappropriate for researchers to use any scale in a ‘one size fits all’ kind of way. This makes sense when you consider how bullshit receptivity functions in the real world.</p>
<p>“For example, a business professional might think that New Age claims about ‘transcendent consciousness’ and ‘vibrational energies’ are really stupid (i.e., they have low receptivity to pseudo-profound BS). But, when they get to work the next day, they fall for all kinds of misleading claims that are wrapped up in buzzwords like ‘blue-sky-thinking’ and ‘customer differentiated value proposition’ (i.e., they have high receptivity to corporate BS).”</p>
<p>“The bottom line is that almost anyone can fall for bullshit when it’s packaged to appeal to their specific expertise, interests, or biases.”</p>
<p>Across the studies, Littrell found that individuals differed significantly in how impressed they were by corporate buzzword statements. Those with higher corporate-bullshit receptivity scores were more likely to view jargon-heavy statements as insightful or indicative of business expertise. They were also more likely to engage in persuasive “bullshitting” themselves, using exaggerated or misleading language to impress others.</p>
<p>At the same time, higher receptivity was associated with lower scores on measures of analytic thinking and fluid intelligence, suggesting that individuals who were more impressed by corporate jargon were also less likely to critically evaluate information.</p>
<p>The research also revealed important workplace implications. Individuals with higher corporate-bullshit receptivity were more likely to find corporate mission statements inspiring and to perceive their supervisors as charismatic or transformational leaders. However, these same individuals performed worse on situational judgment tasks designed to measure workplace leadership and decision-making ability. Corporate-bullshit receptivity was the strongest predictor of poorer performance on these decision-making tasks, even after accounting for other variables.</p>
<p>Littrell explained why this matters for organizations. “Confusing, buzzword-heavy ‘corporate speak’ isn’t merely annoying or frustrating, it can actually be harmful. Some workplace language that might sound smart and strategic, at least on a superficial level, is what I call ‘functionally misleading’ (it can mislead the audience regardless of the speaker’s intentions).”</p>
<p>Because people differ in how persuasive they find such language, these differences can have real consequences. “People who are more impressed – that is, they have higher corporate-bullshit receptivity – perform worse on measures of workplace leadership and decision-making. So, ‘corporate bullshit’ can backfire because, regardless of what the speaker intended, it can distort whether the audience clearly and accurately understands the goals, feedback, or decisions that are communicated to them. This not only can impair employee performance and career advancement but potentially lead to reputational and financial costs for organizations as well.”</p>
<p>The author cautions that although the Corporate Bullshit Receptivity Scale is a promising research tool, it is not yet ready for high-stakes uses such as employee screening or hiring decisions. Additional work is also needed to test the scale in different cultures and languages where the concept of “bullshit” may be interpreted differently.</p>
<p>“The next steps will focus on stronger tests of real-world validation like collecting data in specific companies and comparing employee receptivity scores to objective company metrics (e.g., sales goals, performance ratings from direct supervisors),” Littrell told PsyPost.</p>
<p>“We also need a deeper understanding of how context influences receptivity because things like speaker authority, status, and delivery style may amplify (or nullify) the effectiveness of bullshit and bullshitting in workplace settings.”</p>
<p>Overall, the findings suggest that impressive-sounding corporate jargon may do more than irritate employees—it might also influence how effectively people evaluate information and make decisions at work.</p>
<p>As Littrell advised, people should remain cautious when encountering impressive-sounding organizational language. “Depending on the context, almost any of us can be easily fooled by impressive-sounding language. It’s kind of a cognitive trap. So, if a message feels ‘smart’ or otherwise impressive but you can’t explain what evidence would support it – or even paraphrase it into a straightforward, concrete claim – you’re probably being influenced by the message’s packaging rather than its actual content.”</p>
<p>“So, whether you’re an employee or a consumer, if you come across any type of organizational messaging (e.g., from a company’s business leaders, public reports, mission statements, advertising, etc.), ask yourself, ‘What is the concrete claim here?’ ‘Does it make sense?’ and ‘Does the evidence actually support it?’ because impressive-sounding buzzwords and jargon are often red flags that you’re being misled.”</p>
<p>Littrell also writes about these topics on his Substack, <a href="https://bullshitology.substack.com/"><em>Bullshitology</em></a>.</p>
<p>The research “<a href="https://doi.org/10.1016/j.paid.2026.113699">The Corporate Bullshit Receptivity Scale: Development, validation, and associations with workplace outcomes</a>” was authored by Shane Littrell.</p></p>
</div>
<div style="font-family:Helvetica, sans-serif; font-size:13px; text-align: center; color: #666666; padding:4px; margin-bottom:2px;"></div>
</td>
</tr>
</tbody>
</table>
<p><strong>Forwarded by:<br />
Michael Reeder LCPC<br />
Baltimore, MD</strong></p>
<p><strong>This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated, unofficial redistribution of this freely provided content from the publishers. </strong></p>
<p> </p>
<p><small><a href="https://blogtrottr.com/unsubscribe/565/DY9DKf">unsubscribe from this feed</a></small></p>