Your daily digest for PsyPost – Psychology News Daily Digest (Unofficial)
Article Digests for Psychology & Social Work
article-digests at lists.clinicians-exchange.org
Tue Dec 24 06:35:25 PST 2024
PsyPost – Psychology News Daily Digest (Unofficial)
(https://www.psypost.org/psychology-christmas-can-be-stressful-for-many-people-heres-what-can-help-you-get-through-the-festive-season/) Christmas can be stressful for many people – here’s what can help you get through the festive season
Dec 24th 2024, 08:00
Christmas is a season of joy and togetherness. But for many, it’s also one of the most stressful times of the year.
Stress arises from an imbalance between the demands placed on us and our ability to cope with those demands. Psychologically, stress is linked to how we cope in situations – and whether we view them as (https://econtent.hogrefe.com/doi/pdf/10.1027%2F0269-8803%2Fa000301) challenging, threatening or manageable. The more challenging or threatening we see a situation to be, the more likely we are to (https://econtent.hogrefe.com/doi/pdf/10.1027%2F0269-8803%2Fa000301) feel stressed out.
It makes sense, then, that Christmas is such a stressful time of year for many.
The pressure to make the holidays “perfect”, the urge to spend more money than we perhaps should to fulfil expectations, and the struggle to balance work and study commitments with holiday shopping, decorating and socialising can all leave us feeling overwhelmed and exhausted.
For others, Christmas highlights feelings of loneliness, grief or estrangement from loved ones. The season can be a painful reminder of (https://www.proquest.com/docview/1667183191?sourcetype=Trade%20Journals) lost relationships, financial hardships, or unmet life goals – and this can amplify feelings of inadequacy or sadness.
Family visits can also bring tension as we’re forced to interact with relatives whose views or habits may clash – leading to conflicts or rehashing unresolved disputes.
But while some stress during the holidays is inevitable, there are many things you can do to cope – and even prevent this stress in the first place.
Plan ahead
When our brains know what to expect, they (https://pubmed.ncbi.nlm.nih.gov/26304203/) require less energy to find solutions. This makes it easier to navigate any challenges we may face. Planning or thinking ahead also allows us to take control of our thoughts and minimise potential stressors.
Before the holidays roll around, try spending time thinking about the things which tend to be sources of stress for you – and make a plan for how you’ll prevent this stress.
For instance, if cooking Christmas dinner is a source of stress for you, making a list of specific tasks you can delegate to certain family members may help take some of the pressure off you.
Set boundaries
It’s important to learn to say “no”, rather than agreeing to everything that might be asked of you. Understanding and (https://www.mentalhealth.org.uk/explore-mental-health/publications/how-manage-and-reduce-stress/) respecting your own boundaries will help you allocate your time and resources more effectively – reducing stress.
This skill takes time to develop but can significantly benefit your long-term wellbeing. The more confident we become in our abilities to manage the challenges we face, the better we become at (https://www.nhs.uk/mental-health/self-help/guides-tools-and-activities/tips-to-reduce-stress/) setting boundaries – ultimately becoming better at managing stress.
Some boundaries you might set at Christmas could include setting a budget limit for presents so you aren’t stressed about overspending, or limiting the number of social engagements you attend so you don’t get burnt out.
Manage expectations
It’s important to recognise that not everything is within your control. While there are many things you can plan and prepare for at Christmas, there are just as many things that are out of your hands. For example, you can’t control the way other people may behave at your Christmas dinner, or the way someone may react to a present you’ve bought them.
Setting realistic expectations for the holidays and accepting there are things you just (https://www.nhs.uk/mental-health/self-help/guides-tools-and-activities/tips-to-reduce-stress/) can’t control is key in managing stress levels.
Take time to reflect
Another helpful way to manage holiday stress is to pause and connect with your feelings.
Write down your thoughts on a piece of paper. Then pause and really think about how you feel. Giving your brain a moment to process what’s happening can help you moderate your feelings. Keeping a journal can help improve your (https://link.springer.com/article/10.1207/s15324796abm2403_10) thoughts and mood, offering a constructive outlet for emotions.
If you’re finding it difficult to get on with friends and family during the holidays, pause before reacting or saying something you might not mean. This will help you get your emotions (https://www.forbes.com/sites/tonygambill/2023/07/10/learning-to-pause-when-you-feel-triggered-by-negative-emotions-3-tips/) under control and may help to reduce your stress.
Coping after the holidays
Some people may experience low mood after the holidays – often termed the “post-festive blues” or (https://books.google.co.uk/books?hl=en&lr=&id=b4XSEAAAQBAJ&oi=fnd&pg=PA21&dq=post-holiday+blues&ots=HJ-0OOJYx9&sig=iEk81lHYS_Ptkkrn3HYiuojku0w&redir_esc=y#v=onepage&q=post-holiday%20blues&f=false) “post-holiday blues”.
The holiday season often brings a mix of joy and stress, creating emotional highs that leave our bodies (https://www.proquest.com/docview/2756736037?sourcetype=Wire%20Feeds) feeling drained and exhausted once it’s over. It’s important to recognise that these feelings are a natural response to the demands of the festive period – not a reflection of personal inadequacy. Taking the time to acknowledge and accept that our bodies and minds are simply recovering is a crucial step toward (https://www.apa.org/news/press/releases/2009/12/holiday-blues?utm_source=chatgpt.com) moving forward positively.
There are many strategies you can use to manage these post-holiday blues. Activities such as regular exercise, setting realistic and achievable goals, and reconnecting with others can significantly (https://www.tandfonline.com/doi/full/10.1080/02678373.2018.1427816) improve our mood and boost “happy hormones” such as endorphins.
By consciously planning ways to re-energise and stay connected, we can shift our focus from any lows we may have experienced over the holidays to a more balanced perspective as we step into the new year.
This article is republished from (https://theconversation.com) The Conversation under a Creative Commons license. Read the (https://theconversation.com/christmas-can-be-stressful-for-many-people-heres-what-can-help-you-get-through-the-festive-season-246097) original article.
(https://www.psypost.org/researchers-identify-two-psychological-traits-linked-to-heightened-nightmare-frequency/) Researchers identify two psychological traits linked to heightened nightmare frequency
Dec 24th 2024, 06:00
Why do some people experience frequent nightmares while others rarely do? A new study suggests that specific psychological traits, particularly thin mental boundaries and a predisposition called nightmare proneness, play a significant role. These findings, published in the journal (https://psycnet.apa.org/record/2025-41893-001?doi=1) Dreaming, provide a clearer picture of the psychological factors contributing to disturbing dreams.
Frequent nightmares are strongly linked to some mental health issues. For instance, research indicates that 50%–70% of individuals with post-traumatic stress disorder (PTSD) experience frequent nightmares, as these distressing dreams are a defining symptom of the condition. Similarly, nightmares are notably more common in people with anxiety disorders, depression, and other mood-related conditions, often reflecting heightened emotional distress and dysregulation.
However, nightmares are not limited to those with mental health challenges; they are also prevalent in the general population. Studies suggest that about 4% of individuals experience nightmares frequently, while approximately 40% report occasional nightmares.
“Nightmares are experienced, at least occasionally, by a relatively large number of individuals with and without mental health concerns. Yet, their causes remain mysterious. We have been attempting to understand what the key psychological dispositions are that seem to influence having nightmares,” said study author William Kelly, an associate professor at the University of the Incarnate Word.
The study involved 116 undergraduate psychology students from a university in the United States. Most participants were young adults, with an average age of 20.6 years. The majority identified as Latino (78.4%), while smaller groups identified as Asian, White, or other ethnicities. Participants completed an online survey that included several validated measures related to nightmares and personality traits.
The researchers examined the relationships between nightmare frequency and four dispositions: neuroticism, nightmare proneness, thin psychological boundaries, and sensory processing sensitivity. After accounting for sociodemographic factors, dream recall frequency, and the overlap among the traits, they found that only nightmare proneness and thin psychological boundaries were significant independent predictors of how often participants experienced nightmares.
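To make that analytic step concrete, here is a minimal sketch, in Python with statsmodels, of a regression in which all four traits are entered together alongside covariates. The data file and column names are hypothetical stand-ins, not the authors' actual code or variables.

    # Minimal sketch of a multiple regression like the one described above.
    # File name and column names are hypothetical, not the authors' code.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("nightmare_survey.csv")  # hypothetical data file

    # Entering the four traits together, with sociodemographics and dream
    # recall as covariates, means each coefficient reflects a trait's
    # independent contribution over and above its overlap with the others.
    model = smf.ols(
        "nightmare_freq ~ age + C(gender) + dream_recall"
        " + neuroticism + nightmare_proneness + thin_boundaries"
        " + sensory_sensitivity",
        data=df,
    ).fit()

    print(model.summary())  # only some traits may remain significant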
Nightmare proneness encompasses psychological factors like emotional instability, mood dysregulation, and heightened sensitivity to stress. The researchers suggested that people with high nightmare proneness might experience a process called “concretization,” in which unclear or abstract mental experiences take on more tangible forms, such as vivid and distressing dream imagery. This makes them more likely to transform waking emotional struggles into disturbing dreams. The study reinforced that nightmare proneness is distinct from other traits like neuroticism and thin boundaries, as it remained a significant predictor even when controlling for these factors.
Individuals with thinner boundaries, on the other hand, are characterized by a greater interconnectedness between their emotions, thoughts, and external stimuli. People with thin boundaries might have a heightened susceptibility to experiencing disturbing imagery and emotions during sleep, as they are less able to compartmentalize or filter out these influences. This finding supports the idea that thinner boundaries create a psychological environment where negative mental content can more easily surface as vivid and unsettling dreams.
“It is not ‘abnormal’ to have nightmares,” Kelly told PsyPost. “There do seem to be some dispositions that influence them. In our study, individuals who had nightmares more often also seemed more likely to have thinner divides between various mental experiences, on top of a tendency to more easily have negative emotions and experience them in various forms. It’s as if there is a tendency for an unpleasant mental event to spread across the mind in certain people, like a storm stirring disturbing imagery and emotions in dreams.”
Contrary to expectations, neuroticism—a personality trait often linked to a tendency to experience negative emotions and stress—did not significantly predict nightmare frequency. While neuroticism has been associated with frequent nightmares in prior studies, this relationship appears less robust when other factors like thin boundaries and nightmare proneness are taken into account.
Sensory processing sensitivity, a trait describing heightened responsiveness to internal and external stimuli, also did not independently predict nightmare frequency. Although previous studies have found links between sensory sensitivity and distressing dreams, this study did not replicate those results. The researchers proposed that the brief measure used in the study may have overlooked critical subcomponents of sensory sensitivity, such as low sensory thresholds, which have shown stronger connections to nightmares in past research.
“We were surprised that sensory processing sensitivity did not relate to nightmares as it did in previous studies, and it would seem to fit well with thin mental boundaries,” Kelly said. “We don’t understand this finding as yet.”
As with all research, there are some limitations. First, the sample consisted primarily of young college students, which may limit the generalizability of the findings to other age groups or populations. Second, the measures used to assess psychological boundaries and sensory processing sensitivity were relatively brief. Longer, more nuanced measures might provide a deeper understanding of how these traits influence nightmares.
Despite these limitations, the study contributes to a growing body of research exploring the psychological traits that predispose individuals to nightmares. By identifying thin psychological boundaries and nightmare proneness as significant predictors, the findings offer valuable insights into the mental processes that shape our dream experiences.
The researchers hope to build on this work by investigating how these traits interact with other mental functions and by examining their impact on different populations. Ultimately, this line of research could inform strategies to reduce nightmare frequency and improve sleep quality for individuals prone to distressing dreams.
“We want to extend these findings and better understand how thinner mental boundaries and the broad nightmare proneness variables are connected to nightmares,” Kelly said. “We are planning additional studies to untangle this.”
“These findings fit into a series of studies we have done to understand how mental functions and traits influence or allow the occurrence of nightmares among individuals who don’t necessarily have serious mental health concerns,” he added. “For instance, nightmares are related more to trait-like dispositions than temporary states and to ego functions, which are the ways the mind regulates itself.”
The study, “(https://psycnet.apa.org/doi/10.1037/drm0000294) An Empirical Comparison of Some Nightmare Dispositions: Neuroticism, Nightmare Proneness, Thin Psychological Boundaries, and Sensory Processing Sensitivity,” was authored by William E. Kelly and John R. Mathe.
(https://www.psypost.org/new-study-demonstrates-the-psychological-pull-of-christmas-cookies/) New study demonstrates the psychological pull of Christmas cookies
Dec 23rd 2024, 16:00
Do sugar content labels help us make healthier choices during the holidays? A study in (https://www.sciencedirect.com/science/article/pii/S0001691824000908) Acta Psychologica found that they might not. Using mobile eye-tracking glasses, researchers found that festive, sugar-rich foods are more visually captivating and desirable than their sugar-free counterparts, even when nutritional labels highlight their sugar content. These findings suggest that simply labeling food as “sugar-free” may not effectively curb cravings during this indulgent time of year.
The holiday season is a time of celebration, but it is also marked by overindulgence in sugary and high-calorie foods, leading to seasonal weight gain. With high-sugar foods prominently featured in festive traditions, people often struggle to resist their cravings. Nutritional labeling, intended to guide healthier food choices, has been widely adopted, but its effectiveness remains unclear. Previous studies suggest that sugar content labels may even increase cravings for some individuals.
The researchers aimed to explore whether sugar labels influence visual attention and preferences in a real-world setting. By focusing on Christmas-themed treats, the study also sought to understand whether the festive context amplifies the appeal of sugary foods, thereby making them harder to resist.
The study involved 58 participants aged 17 to 49 years, most of whom had a normal body mass index and celebrated Christmas. Participants wore mobile eye-tracking glasses while viewing a buffet table featuring six items: four cookies (with and without sugar, and with or without Christmas associations) and two non-food items (gift-wrapped presents labeled as Christmas or birthday presents). Each item was accompanied by a label indicating its sugar content or association with Christmas.
Participants viewed the buffet for two minutes while their gaze patterns were recorded. Afterward, they rated their liking and wanting of each item and provided information about their dietary preferences. At the end of the session, they were offered a choice between a high-calorie gingerbread cookie and a low-calorie clementine to assess their food preferences further.
Eye-tracking data analyzed included total fixation duration (how long participants looked at each item), mean fixation duration (the average time spent on specific details), and the number of fixations (how many details participants examined). The researchers also examined how health-conscious participants were based on their dietary preferences.
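For readers curious how such gaze metrics are derived, here is a minimal sketch in Python, assuming a hypothetical list of fixation events (item fixated, duration in milliseconds) exported from the eye tracker; it is illustrative only, not the study's analysis pipeline.

    # Minimal sketch: deriving total fixation duration, number of fixations,
    # and mean fixation duration from hypothetical fixation events.
    from collections import defaultdict

    fixations = [                        # (item fixated, duration in ms)
        ("christmas_cookie_sugar", 312),
        ("sugarfree_cookie", 141),
        ("christmas_cookie_sugar", 458),
        ("christmas_present", 275),
        ("sugarfree_cookie", 98),
    ]

    by_item = defaultdict(list)
    for item, ms in fixations:
        by_item[item].append(ms)

    for item, ms_list in by_item.items():
        total = sum(ms_list)             # total fixation duration
        count = len(ms_list)             # number of fixations
        mean = total / count             # mean fixation duration
        print(f"{item}: total={total} ms, n={count}, mean={mean:.0f} ms")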
The results showed that Christmas-associated items, both food and non-food, drew more attention than their non-Christmas counterparts. Participants spent more time looking at Christmas-themed cookies and presents, suggesting that festive associations increase the visual appeal of these items. Among the cookies, those labeled as containing sugar received longer gaze durations than their sugar-free counterparts.
Participants also rated sugar-containing cookies as more desirable than sugar-free alternatives. This preference was particularly strong for Christmas-themed cookies, which were rated higher in both liking and wanting compared to non-festive cookies. The sugar-free cookies were less favored, even when they had a Christmas association.
Eye-tracking data indicated that sugar-free cookies were viewed with a more critical inspection pattern, characterized by shorter but more frequent fixations. This pattern is often associated with evaluating negative or less-preferred items, suggesting that participants scrutinized sugar-free cookies more to assess their acceptability as substitutes for sugary treats.
Surprisingly, participants’ self-reported health consciousness did not significantly correlate with their gaze behavior or preferences. Even those who prioritized health in their dietary choices showed a strong preference for sugar-containing cookies over sugar-free alternatives.
Importantly, when given a choice between a high-calorie gingerbread cookie and a low-calorie clementine, many participants opted for the gingerbread cookie, reinforcing the findings that sugary, festive treats are more appealing despite health considerations.
While the study provides valuable insights, it has limitations. The small sample size and relatively homogenous group (predominantly young adults with normal body mass index) limit the generalizability of the findings. Additionally, the study only used a limited selection of items, such as cookies and presents. Expanding the range of stimuli to include other types of food and non-food items could provide a more comprehensive understanding of visual attention biases.
“Despite these limitations, it is worth highlighting that the present study represents the first investigation into the effects of sugar content information on gaze behavior when viewing real foods,” the researchers concluded. “This study serves as a valuable foundation for future research to build upon. Subsequent studies should involve larger and more diverse samples, as well as include a wider range of stimuli, to expand the understanding of real-world food perception.”
“In summary, particularly during the Christmas season, exclusively emphasizing the nutritional value of foods might yield outcomes contrary to the intended goals. Approaches aiming to prevent holiday-related weight gain should thus adopt a multifaceted perspective, avoiding exclusive fixation on the sugar content of Christmas treats.”
The study, “(https://doi.org/10.1016/j.actpsy.2024.104213) Cookie cravings – Examining the impact of sugar content information on Christmas treat preferences via mobile eye-tracking,” was authored by Jonas Potthoff, Christina Herrmann, and Anne Schienle.
(https://www.psypost.org/new-psychology-research-shows-self-beautifying-can-boost-prosocial-behavior-heres-why/) New psychology research shows self-beautifying can boost prosocial behavior — here’s why
Dec 23rd 2024, 14:00
A recent study published in the (https://doi.org/10.1016/j.ijresmar.2024.09.001) International Journal of Research in Marketing has found that improving one’s appearance—whether through physical changes or digital filters—may lead to more prosocial behavior, such as charitable donations or ethical purchasing. Across seven studies, researchers discovered that these beautifying efforts heightened public self-awareness, prompting individuals to align their actions with socially desirable norms.
The motivation behind this study stemmed from the researchers’ interest in understanding the broader societal implications of a behavior as pervasive and personal as appearance improvement. In modern consumer culture, individuals frequently engage in activities to enhance their physical appearance, whether by using beauty products, undergoing cosmetic treatments, or employing digital filters. These actions are often driven by the desire to feel more attractive, boost self-esteem, and gain social approval.
However, while prior research has extensively examined how appearance influences self-perception and social interactions, little was known about whether such improvements might extend beyond personal benefits to influence behaviors unrelated to beauty, such as prosocial actions.
“Trying to improve one’s appearance is incredibly prevalent, yet it’s often viewed negatively, associated with vanity or superficiality. We were curious whether this common behavior might have more positive implications than typically assumed. Specifically, we wanted to explore whether appearance improvement could extend beyond personal benefits and influence behaviors that positively impact others,” explained study author (https://www.linkedin.com/in/natalia-kononov-9566b1282/) Natalia Kononov, a Fulbright postdoctoral fellow at the Wharton School at the University of Pennsylvania.
The researchers conducted a series of seven studies, involving a total of 2,895 participants, to explore whether improvements in physical appearance could influence prosocial behaviors, such as donating to charity or choosing ethical products. These studies included a mix of laboratory experiments, online surveys, and a real-world field experiment.
In these studies, participants engaged in activities that either enhanced their appearance or involved unrelated actions. The researchers measured prosocial behaviors using various indicators, such as willingness to donate, actual monetary contributions, and preferences for socially responsible brands. They also assessed participants’ public self-awareness—the degree to which they felt their actions and appearance were visible to others—as a potential mechanism driving these behaviors.
In some experiments, participants were asked to recall moments when they had improved their appearance, such as styling their hair or wearing makeup. They were then presented with hypothetical scenarios, like deciding whether to donate to a UNICEF campaign or share a charitable link on social media. Compared to participants who recalled unrelated pleasant activities, those who reflected on appearance improvements consistently showed greater willingness to engage in prosocial behaviors.
Further analysis indicated that this effect was not simply due to an improved mood; instead, public self-awareness emerged as a key driver. Participants felt more conscious of how others might perceive them after beautifying actions.
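As a rough illustration of how a mediator such as public self-awareness can be tested statistically, here is a minimal Baron and Kenny-style sketch in Python; the data file and variable names are hypothetical, and the published analyses (for example, bootstrapped indirect effects) are more involved.

    # Minimal mediation sketch with hypothetical variables; illustration only.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("beautify_study.csv")  # hypothetical data file

    # Step 1: the beautifying condition (0/1) should predict prosociality.
    total_effect = smf.ols("prosocial ~ beautified", data=df).fit()

    # Step 2: the condition should predict the mediator.
    a_path = smf.ols("self_awareness ~ beautified", data=df).fit()

    # Step 3: with the mediator added, its coefficient should be significant
    # and the direct effect of the condition should shrink (mediation).
    b_path = smf.ols("prosocial ~ beautified + self_awareness", data=df).fit()

    for name, m in [("total", total_effect), ("a", a_path), ("b", b_path)]:
        print(name, dict(m.params))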
“It was very interesting to find yet another reinforcement of how these effects apply to men as well,” Kononov noted. “While men might approach appearance improvement differently than women, they too care about how they look and are influenced by these behaviors in similar ways, which challenges the common assumption that men are less concerned about their appearance.”
Other experiments involved digital manipulations, where participants took selfies and applied either flattering or neutral Instagram filters. Those who used beautifying filters not only reported feeling more attractive but also demonstrated higher levels of generosity in lab-based donation tasks. Interestingly, when participants applied unflattering filters or enhanced non-human objects, such as plants, these effects disappeared.
This finding underscored the importance of feeling personally beautified for triggering prosocial behavior. Additionally, the researchers found that the visibility of these improvements mattered; participants were more likely to act prosocially when their appearance changes were public or noticeable rather than private.
The field study extended these findings to a real-world context. Participants completed an online quiz designed to either boost their perception of their appearance or focus on unrelated topics, such as architectural preferences. After completing the quiz, participants were shown a banner for a donation campaign. Those who took the appearance-focused quiz were more likely to click on the donation banner, further supporting the idea that beautifying experiences can increase charitable actions. Although this effect was modest, it demonstrated the practical relevance of the study’s findings, particularly in digital and social media environments.
“The study shows that improving your appearance does more than just make you feel confident—it increases your awareness of how others see you,” Kononov told PsyPost. “This heightened awareness can lead to more prosocial actions, like donating to charity or choosing ethical brands. Interestingly, a behavior we often engage in because we care about how others perceive us ends up reinforcing this cycle, as the act of improving appearance makes us even more mindful of the impression we’re making.”
The study offers compelling evidence for the link between appearance improvement and prosocial behavior, though it is not without caveats. “One important limitation is that our research primarily focused on temporary appearance improvements, like applying makeup or using digital filters,” Kononov said. “It remains unclear how more permanent or drastic changes, such as cosmetic surgery, might influence prosocial behavior. These longer-lasting changes could potentially have different psychological effects, as individuals may adjust to their new appearance over time, possibly diminishing the initial impact on public self-awareness and prosociality.”
Nevertheless, the findings highlight an unexpected silver lining to society’s focus on appearance.
“Our findings are especially relevant in today’s world, where filters and selfies are a significant part of social media culture,” Kononov said. “We find that both physical and digital appearance changes can shape how we see ourselves and influence our behavior toward others. This has practical implications for nonprofits and marketers, who can use these insights to design campaigns that inspire positive actions, such as donating to charity or supporting ethical causes.”
The study, “(https://www.sciencedirect.com/science/article/abs/pii/S0167811624000831) Physical appearance improvements increase prosocial behavior,” was authored by Natalia Kononov, Danit Ein-Gar, and Stefano Puntoni.
(https://www.psypost.org/smaller-brains-fewer-friends-an-evolutionary-biologist-on-how-ai-might-change-humanitys-future/) Smaller brains? Fewer friends? An evolutionary biologist on how AI might change humanity’s future
Dec 23rd 2024, 12:00
What will humans be like generations from now in a world transformed by artificial intelligence (AI)? Plenty of thinkers have applied themselves to questions like this, considering how AI will alter lives – often for better, sometimes for worse.
They have conjured dramatic scenarios, like (https://www.bbc.com/news/uk-65746524) AI-driven extinction of humans (and many other species), or our (https://www.tlnt.com/articles/ais-assimilation-is-coming-dont-ignore-it-understand-it) assimilation into human-AI cyborgs. The predictions are generally grim, pitting the fate of all humans against a unitary (or unified) AI opponent.
What if the AI future doesn’t stretch to these sci-fi dystopias? For an evolutionary biologist, seeing AI technologies diversify into all manner of applications looks a lot like the proliferation of microbes, plants and animals in an ecological landscape.
Which led me to ask: how might human evolution be altered by interactions with a world of rich AI diversity? In a paper just published in (https://www.journals.uchicago.edu/doi/10.1086/733290) The Quarterly Review of Biology, I considered the many ways AI might alter physical, biological and social environments, and how that might influence natural selection.
Predicting evolution is a mug’s game
(https://www.amnh.org/exhibitions/darwin/evolution-today/natural-selection-vista#:~:text=Natural%20selection%20is%20a%20simple,%2C%20Selection%2C%20Time%20and%20Adaptation.) Natural selection – the mechanism behind evolution – is an inevitable consequence of genetic differences in reproduction among individuals.
Those differences arise as a result of interactions with physical features of the environment (like minimum temperatures), with other species (like predators or parasites) and with other members of the same species (like mates, allies or hostile outsiders).
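To see why such differences make selection inevitable, consider a toy simulation, a minimal sketch with made-up numbers rather than a model of any real trait: a heritable variant whose carriers leave slightly more offspring steadily rises in frequency.

    # Toy simulation of natural selection; all numbers are illustrative.
    import random

    freq = 0.01                          # starting frequency of the variant
    for generation in range(100):
        parents = ["variant" if random.random() < freq else "typical"
                   for _ in range(10_000)]
        offspring = []
        for p in parents:
            # Carriers average ~2% more offspring than non-carriers.
            n = 2 + (random.random() < (0.55 if p == "variant" else 0.50))
            offspring.extend([p] * n)
        freq = offspring.count("variant") / len(offspring)

    # A ~2% reproductive edge compounds: expect roughly a sevenfold rise.
    print(f"frequency after 100 generations: {freq:.2f}")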
When Asian gray wolves started hanging around humans around 30,000 years ago, the more reactive wolves were chased away or killed. This (https://www.pnas.org/doi/10.1073/pnas.0901586106) whittled away genes for skittishness and aggression, beginning the process of dog domestication. The inadvertent selection that turned wolves into dogs turns out to be instructive in how AI might inadvertently shape the evolution of human brains and behaviour.
“Trying to predict the future is a mug’s game,” (https://www.goodreads.com/quotes/7559131-trying-to-predict-the-future-is-a-mug-s-game-but) said English author Douglas Adams. This is especially true of technologies like AI.
But predicting evolution is, if anything, even more precarious. Combining the two involves considerable speculation, and the very strong possibility of being wrong.
At the risk of being wrong, my intention is to start a conversation about how human evolution, and traits that we most value in one another, might be altered by AI.
Mutual or parasitic?
It might be informative to think of the AI-human relationship as a (https://www.britannica.com/science/mutualism-biology) mutualism – two species each providing the other with something they need.
Computers are beasts of computational burden that benefit their human users. Those benefits will grow with developments in AI. There is already evidence that culturally shared knowledge and writing lightened the burden on individuals of remembering everything. As a result, (https://www.frontiersin.org/journals/ecology-and-evolution/articles/10.3389/fevo.2021.742639) human brains have shrunk over recent millennia.
Perhaps AI, online searchable knowledge and social media posts that “remember” who-did-what-to-whom will carry more of our memory burden. If so, perhaps human brains will evolve to become even smaller, with less stand-alone memory.
Don’t panic. The benefits of smaller brains include safer births for both mother and newborn. And with computers and AI holding ever-growing records and stores of knowledge, humanity will still be able to do remarkable intelligence-driven things… as long as it can access the AI.
However, mutualists can take another path. (https://www.nature.com/articles/s41579-021-00550-7) They can evolve into harmful parasites – organisms that live at the expense of another organism, their host.
You could think of social media platforms as parasitic. They started out providing useful ways to stay connected (mutualism) but so captured our attention that many users no longer have the time they need for (https://www.theatlantic.com/magazine/archive/2017/09/has-the-smartphone-destroyed-a-generation/534198/) human-human social interactions and sleep (parasitism).
If AI learns to capture user attention ever more effectively, stoking anger and fomenting social comparison, the consequences for who lives, dies and reproduces will affect evolution. In the best of a series of bleak scenarios, the ability to resist social media or remain unmoved by rage-bait might evolve to be stronger.
Intimacy with computers
Important as other species were to human evolution, (https://henrich.fas.harvard.edu/publications/cultural-brain-hypothesis-how-culture-drives-brain-expansion-sociality-and-life) interactions with other humans were even more formative. Now AIs are sliding into our social lives.
The growth of “(https://newsouthbooks.com.au/books/artificial-intimacy/) artificial intimacy” – technologies that emulate our social behaviours like making friends and forming intimate relationships – is among the most astounding areas of AI progress.
Humans haven’t evolved a social capacity for dealing with computers. So, we apply our “tools” for dealing with other humans (https://dl.acm.org/doi/10.1145/191666.191703) to machines. Especially when those machines converse with us via text, voice or video.
In our interactions with people, we keep an eye on the possibility that the other person is not being genuine. AI “virtual friends” don’t have feelings, but users (https://www.sciencedirect.com/science/article/pii/S0747563222001431) treat them as if they do.
Artificial intimacy could make us more wary of interactions over phones or screens. Or perhaps our descendants will feel less lonely without human company and humans will become more solitary creatures.
The question is not trivial
Speculating about genetic evolution might seem trivial compared with AI’s direct effects on individual lives. Brilliant (https://tobywalsh.ai/) AI researchers and (https://cathyoneil.org/) writers are already focused on the way AI will (https://people.eecs.berkeley.edu/~hendrycks/) improve or diminish the lives of people who are alive right now.
It’s not as immediate a concern, then, to worry about distant gene changes AI might influence many generations from now. But it certainly bears thinking about.
The pioneering ecologist (https://quotefancy.com/quote/1776863/Robert-MacArthur-There-are-worse-sins-for-a-scientist-than-to-be-wrong-One-is-to-be) Robert MacArthur said “there are worse sins for a scientist than to be wrong. One is to be trivial”.
Evolutionary changes over many generations could well change or even diminish some of the human traits we cherish most, including friendship, intimacy, communication, trust and intelligence, because these are the traits AI engages most profoundly.
In a non-trivial way, that could alter what it means to be human.
This article is republished from (https://theconversation.com) The Conversation under a Creative Commons license. Read the (https://theconversation.com/smaller-brains-fewer-friends-an-evolutionary-biologist-asks-how-ai-will-change-humanitys-future-244179) original article.
(https://www.psypost.org/ketamines-rapid-antidepressant-effects-traced-to-overlooked-brain-cells/) Ketamine’s rapid antidepressant effects traced to overlooked brain cells
Dec 23rd 2024, 10:00
A new study has uncovered a surprising player in ketamine’s rapid antidepressant effects: astrocytes, the star-shaped support cells of the brain. By studying larval zebrafish, researchers found that ketamine reduces behavioral passivity by altering astrocytic activity in response to futile conditions. Their findings have been published in the journal (https://doi.org/10.1016/j.neuron.2024.11.011) Neuron.
Ketamine is a medication traditionally used as an anesthetic, but in recent years, it has gained attention for its rapid and long-lasting antidepressant effects at low doses. Unlike conventional antidepressants, which often take weeks to produce noticeable results, ketamine can alleviate symptoms of depression within hours.
This fast-acting property makes it especially promising for conditions like treatment-resistant depression. However, the exact mechanisms behind ketamine’s antidepressant effects remain only partially understood, particularly its influence on non-neuronal brain cells such as astrocytes.
Researchers were interested in larval zebrafish as a model for studying ketamine because of the fish’s unique biological characteristics. Zebrafish are small, transparent, and genetically modifiable, allowing scientists to observe brain-wide activity in real time.
“We were originally studying a behavior in which larval zebrafish ‘gave up’ in response to their actions becoming futile and thought that this behavior had some similarities to rodent assays (e.g., forced swim task or tail suspension task) commonly used to test antidepressants,” said study author Alex B. Chen, a neuroscience graduate student at Harvard University and graduate research fellow at the Howard Hughes Medical Institute Janelia Research Campus.
“Because the larval zebrafish has unique advantages—it is transparent and small enough that the activity of all of its brain’s neurons can be simultaneously recorded during behavior—we sought to determine whether we could use it to investigate ketamine’s behavioral effects.”
The researchers used larval zebrafish aged 5 to 8 days post-fertilization. These fish were genetically modified to express calcium indicators in neurons or astrocytes, allowing researchers to monitor their activity during experiments. They also employed optogenetic and chemogenetic tools to manipulate specific brain regions and cell types, further investigating the mechanisms underlying observed behavioral changes.
The researchers exposed zebrafish to a transient dose of ketamine (200 micrograms per milliliter). The experimental setup involved a custom-designed virtual reality environment in which visual stimuli simulated forward movement when the fish swam. However, during the “open loop” phase, swimming no longer resulted in any apparent progress, creating a condition of futility.
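For readers unfamiliar with closed-loop paradigms, here is a minimal sketch of the feedback logic in Python; the function, gain values, and numbers are invented for illustration, and real rigs couple swim signals to visual flow through specialized hardware and software.

    # Minimal sketch of closed-loop vs. open-loop visual feedback.
    # All names and numbers are illustrative, not from the actual rig.

    def backward_slip(swim_power: float, closed_loop: bool) -> float:
        """Perceived backward slip of the fish (negative = forward progress).

        Closed loop: swimming moves the scene as if the fish advances.
        Open loop: swimming has no effect -- the condition of futility.
        """
        baseline_drift = 1.0                 # scene slides backward by default
        gain = 2.0 if closed_loop else 0.0   # open loop ignores swim effort
        return baseline_drift - gain * swim_power

    print(backward_slip(1.0, closed_loop=True))   # -1.0: effort yields progress
    print(backward_slip(1.0, closed_loop=False))  #  1.0: effort is futile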
Ketamine-treated zebrafish exhibited a marked reduction in passivity during the open loop phase compared to untreated controls. This effect was dose-dependent and persisted long after the drug had cleared from their systems. Importantly, ketamine did not affect the fish’s baseline locomotion during normal swimming conditions, suggesting that its influence was specific to behaviors related to futility.
Further analysis focused on the activity of astrocytes, star-shaped glial cells in the brain that support neurons, regulate neurotransmitter levels, and play a role in brain signaling and homeostasis. During futile swimming, astrocytes in the hindbrain typically show elevated calcium activity, integrating signals from neurons to suppress swimming. However, after ketamine exposure, astrocytes displayed reduced calcium responses during the open loop phase, indicating diminished engagement in the behavioral suppression process. This reduced activity in astrocytes correlated with the observed decrease in passivity, suggesting that ketamine might exert its effects by altering the astrocytic response to futility.
“Ketamine has gained popularity in recent years as a rapid-acting antidepressant, but the mechanisms through which it works remain poorly understood,” Chen told PsyPost. “We show that at least some of its antidepressant effects might occur due to its actions on a population of non-neuronal cells called astrocytes. Astrocytes have traditionally been seen as passive support cells in the brain, but more recently, they have been shown to play active roles in brain computations. We show that ketamine decreases astroglial responsiveness to futility, leading to increased resilience.”
This finding was particularly surprising, Chen said, because “previous studies have largely focused on ketamine’s effects on neurons, so we did not expect that it would affect astrocytes so much.”
The researchers found that the effects of ketamine on passivity and astrocytic activity were not unique to zebrafish. In complementary experiments using mice, they observed similar behavioral and cellular changes. In rodents subjected to the tail suspension test—a mammalian analog of futility-induced passivity—ketamine treatment reduced immobility. Astrocytes in the retrosplenial cortex of mice displayed a prolonged elevation in calcium activity following ketamine exposure, mirroring patterns observed in zebrafish.
The study also provided insights into how ketamine might trigger these changes. The researchers identified norepinephrine as a critical modulator in the process. Ketamine was shown to elevate norepinephrine levels, which in turn activated astrocytes and induced long-lasting changes in their response to futile signals. This hyperactivation during ketamine exposure appeared to desensitize astrocytes, reducing their responsiveness to future futile conditions and promoting behavioral perseverance.
The use of larval zebrafish in this study presents both significant strengths and notable limitations. As a model organism, zebrafish offer unique advantages due to the ability to monitor brain-wide activity in real time at a cellular level. Their genetic accessibility also allows for precise manipulation and visualization of specific cell types, such as astrocytes and neurons. These features make zebrafish an excellent model for investigating complex neural and behavioral phenomena.
However, the zebrafish model also has inherent weaknesses that limit the study’s broader applicability. For one, the simplicity of the zebrafish brain, while advantageous for certain types of experiments, may not fully capture the complexity of mammalian or human brain circuits.
“While zebrafish are vertebrates, they are still very different from humans, and it is hard to say whether fish can get depressed,” Chen noted. “Therefore, some of our findings remain to be validated in mammalian models like rodents or in humans themselves.”
Future research could explore the molecular and genetic changes underlying ketamine’s effects on astrocytes and neurons. Studies could also investigate how the observed mechanisms in zebrafish translate to more complex mammalian systems, particularly in brain regions relevant to human depression, such as the prefrontal cortex or hippocampus.
“Our goal is to continue using the larval zebrafish to examine ketamine’s mechanisms of action,” Chen said. “One question of particular interest is what molecular changes are happening in astrocytes to cause the changes in their physiology that we see following ketamine administration. Furthermore, we hope to use larval zebrafish to screen for other compounds that could be antidepressant.”
The study, “(https://www.cell.com/neuron/fulltext/S0896-6273(24)00836-5) Ketamine induces plasticity in a norepinephrine-astroglial circuit to promote behavioral perseverance,” was authored by Marc Duque, Alex B. Chen, Eric Hsu, Sujatha Narayan, Altyn Rymbek, Shahinoor Begum, Gesine Saher, Adam E. Cohen, David E. Olson, Yulong Li, David A. Prober, Dwight E. Bergles, Mark C. Fishman, Florian Engert, and Misha B. Ahrens.
Forwarded by:
Michael Reeder LCPC
Baltimore, MD
This information is taken from free public RSS feeds published by each organization for the purpose of public distribution. Readers are linked back to the article content on each organization's website. This email is an unaffiliated unofficial redistribution of this freely provided content from the publishers.