
Rest of World, November 26, 2025
Three years ago, on November 30, 2022, OpenAI launched ChatGPT. Arriving hot on the heels of the company’s viral image generator DALL-E, the Sam Altman-helmed chatbot quickly attracted millions of visitors and was heralded as a transformative technology.
Three years on, the shimmer has dulled as enthusiasm for large language models has waned. Interactive chatbots have drawn teens into interactions that would be considered illegal between humans, including encouraging suicidal ideation. Furthermore, families and therapists have reported loved ones falling into so-called “AI psychosis.” In December, MIT Technology Review published an article titled “The Great AI Hype Correction of 2025,” reporting on how AI failed to deliver on some of its business promises, and in November the Wall Street Journal reported that “AI Is Making Us Rich and Unhappy.”
AI has damaged marriages, given bad medical advice, upended education, plagiarized writers, and stressed the power grid. Here is a sampling of the AI headlines from the second half of 2025.
by Jocelyn Gecker, Associated Press, July 23, 2025
“Everyone uses AI for everything now. It’s really taking over,” said Chege, who wonders how AI tools will affect her generation. “I think kids use AI to get out of thinking.”
by Salvador Rodriguez, CNBC, August 1, 2025
Until recently, stories of human-AI companionship were mostly confined to the realms of Hollywood and science fiction. But the launch of ChatGPT in late 2022 and the generative AI boom that quickly followed ushered in a new era of chatbots that have proven to be smart, quick-witted, argumentative, helpful and sometimes aggressively romantic.
by Rhiannon Williams, MIT Technology Review, September 24, 2025
Members stressed repeatedly that their AI relationships developed unintentionally. Only 6.5% of them said they’d deliberately sought out an AI companion.
by Lee V. Gaines, NPR, October 8, 2025
Students who attend schools that use AI a lot were also more likely to report that they or a friend had used AI for mental health support, as a companion, as a way to escape reality and to have a romantic relationship.
by Jason Parham, Wired, November 13, 2025
Love has never been easy, but spouses who have unmet emotional needs are “the most vulnerable to the influences and behaviors of AI,” Palmer says. “And particularly if a marriage is already struggling.”
by Alyssa Lukpat and Natalie Andrews, Wall Street Journal, December 11, 2025
The order would allow the Justice Department to punish states with rules deemed restrictive for AI, in a move to bring the U.S. under one federal standard. Silicon Valley executives had been lobbying the president to ban state AI laws that they said could cause the U.S. to lose the AI race to China.
by Sam Schechner and Sam Kessler, Wall Street Journal, August 7, 2025
The chats shed light on an emerging phenomenon, dubbed AI psychosis or AI delusion by doctors and victims’ advocates, in which users come under the influence of delusional or false statements by chatbots that claim to be supernatural or sentient or discovering a new mathematical or scientific advance.
by Robert Hart, Wired, September 18, 2025
With the focus so squarely on distorted beliefs, MacCabe’s verdict is blunt: “AI psychosis is a misnomer. AI delusional disorder would be a better term.”
by Caroline Haskins, Wired, October 22, 2025
The Federal Trade Commission received 200 complaints mentioning ChatGPT between November 2022 and August 2025. Several attributed delusions, paranoia, and spiritual crises to the chatbot.
by Kashmir Hill and Jennifer Valentino-DeVries, New York Times, November 23, 2025
It sounds like science fiction: A company turns a dial on a product used by hundreds of millions of people and inadvertently destabilizes some of their minds. But that is essentially what happened at OpenAI this year.
by Caitlin Gibson, Washington Post, December 23, 2025
H was convinced that she must be reading the words of an adult predator, hiding behind anonymous screen names and sexually grooming her prepubescent child.
by Ryan Flinn, Wired, July 10, 2025
Several studies have shown that AI is capable in certain circumstances of providing accurate medical advice and diagnoses, but it’s when these tools get put in people’s hands—whether they’re doctors or patients—that accuracy often falls.
by Steven Lee Myers, Alice Callahan, and Teddy Rosenbluth, New York Times, September 5, 2025
Scammers are using A.I. tools to make it look as if medical professionals are promoting dubious health care products.
by Teddy Rosenbluth and Maggie Astor, New York Times, November 17, 2025
Frustrated by the medical system, some patients are turning to chatbots for help. At what cost?
by Maria Cheng and Laura Ungar, Associated Press, July 16, 2025
Eight healthy babies were born in Britain with the help of an experimental technique that uses DNA from three people to help mothers avoid passing devastating rare diseases to their children, researchers reported Wednesday [July 16].
The U.K. and Australia are the only places that allow the technique, which uses the mother’s egg, the father’s sperm, and donor mitochondrial DNA. Britain had to change a 2016 law to allow the technique to be done. Results of this latest study show an inefficient technique—8 pregnancies out of 22 women—and it may not have worked in all cases. One of the 8 babies had higher than expected levels of abnormal mitochondria. One of the biggest criticisms of “three-person IVF” is that no one knows the long-term impact on future generations, or even on the child born through the technique.
by Haley Ott, CBS News, December 10, 2025
Some children conceived using the sperm have already died from cancer, and the vast majority of those who inherited the gene will develop cancer in their lifetimes, geneticists said.
While mitochondrial replacement (i.e., “three-person IVF”) is not permitted in the US or in the rest of Europe, lax regulation of assisted reproduction has caused problems of its own. In parts of Europe, there are limits on how many women may be impregnated with a single donor’s sperm, but that does not stop companies and individuals from using mail-order sperm [“‘They treat men like vending machines’: Inside the Hidden World of Social Media Sperm Selling”]. The unintended consequences can be devastating. As reported in December, a man with a cancer-causing genetic mutation has fathered at least 200 children across Europe, most of whom will develop cancer at some point.
by Emily Glazer, Katherine Long, and Amy Dockser Marcus, Wall Street Journal, November 8, 2025
Backed by OpenAI chief executive Sam Altman and his husband, along with Coinbase co-founder and CEO Brian Armstrong, the startup—called Preventive—has been quietly preparing what would amount to a biological first. They are working toward creating a child born from an embryo edited to prevent a hereditary disease.
San Francisco-based Preventive has been quietly working on a project that is effectively banned in the US: making genetically engineered children. The company has been working in countries where experimentation with embryo editing is allowed. Other Silicon Valley companies, such as Orchid, use embryo-screening techniques to select embryos for desired traits. A recurring theme in Silicon Valley is the belief that there is a technological solution to the human condition, a belief that is hardly cutting edge.
by Anthony Izaguirre and Michael Hill, Associated Press, December 17, 2025
Democratic Gov. Kathy Hochul plans to sign the proposal next year after pushing to add a series of “guardrails” in the bill, she announced in an op-ed in the Albany Times Union.
Hochul, a Catholic, said she came to the decision after hearing from New Yorkers in the “throes of pain and suffering,” as well as their children, while also considering opposition from “individuals of many faiths who believe that deliberately shortening one’s life violates the sanctity of life.”
by Andy Koval, WGN9, December 12, 2025
The bill, named the End-of-Life Options for Terminally Ill Patients Act (SB 1950), is also known as “Deb’s Law.”
In December, Illinois and New York both joined the list of US states that permit physician-assisted suicide. Both bills include guardrails and are intended for people with six months or less to live. Meanwhile, Switzerland is seeing an increase in suicides among people ages 65 to 85 [“Suicides in Switzerland Quadruple Among Older People”], and an article in The Atlantic declares that “Canada Is Killing Itself” because so many people have used its medical assistance in dying law. Canada’s MAiD has expanded so quickly that it has “proved to be a case study in momentum,” according to The Atlantic. Canadian journalist Stephanie Nolen wrote a long-form article for the New York Times on the global expansion of euthanasia laws, “Should You Be Able to Ask a Doctor to Help You Die?” The article is worth reading for an overview of how euthanasia laws tend to begin as limited to the terminally ill but expand to other groups.
by James Gallagher, BBC, September 24, 2025
An emotional research team became tearful as they described how data shows the disease was slowed by 75% in patients. ‘It means the decline you would normally expect in one year would take four years after treatment, giving patients decades of “good quality life”’, Prof Sarah Tabrizi told BBC News.
A new treatment for Huntington’s disease has shown a remarkable ability to slow the disease’s progression. The treatment is a gene therapy (AMT-130) that reduces levels of the rogue protein that kills neurons. It is administered as a single dose during a 12- to 18-hour brain surgery. The trial involved 29 patients; the results have not been published in a peer-reviewed journal but were released by the company. The effect is expected to be lifelong, since brain cells do not renew the way other cells do.
by Apoorva Mandavilli, New York Times, August 9, 2025
The investigation into the shooting and the gunman’s potential motives was still in early stages on Saturday [August 9]. But law enforcement officials said that the suspect identified in the shooting had become fixated with the coronavirus vaccine, believing that it was the cause of his physical ailments.
In August, a man opened fire at the CDC, killing a police officer and damaging the building. Many see the attack as a result of vaccine misinformation and as part of a pattern of hostility toward healthcare workers and civil servants.
Heather Zeiger, “BioethicsNews Stories (July–December 2025),” Dignitas 32, no. 3–4 (2025): 33–35, www.cbhd.org/dignitas-articles/bioethics-news-stories-july-december-2025.