by Hannah Devlin, The Guardian, May 1, 2023
An AI-based decoder that can translate brain activity into a continuous stream of text has been developed, in a breakthrough that allows a person’s thoughts to be read non-invasively for the first time.
In May a research group at the University of Texas at Austin combined language-model technology similar to that behind ChatGPT with functional MRI to see what a person was thinking about or listening to. The technology could only describe the ideas or the “gist” of the story the person was experiencing; it could not transcribe thoughts word for word. The technique, so far, requires the person to spend hours training the system on their thought patterns before it can “read” their thoughts. Even though this is not true “mind reading,” the system worked better than any other non-invasive mind-reading tool, demonstrating, for the first time, the ability to read thought processes in real time. It has the potential to help people with disabilities who have trouble communicating. However, such neurotechnologies could also be used to monitor a person’s thoughts, raising ethical concerns over privacy, a topic that UNESCO took up last July (see “Ethics of Neurotechnology: UNESCO, Leaders and Top Experts Call for Solid Governance”).
by Lisa Bannon, Wall Street Journal, June 15, 2023
Artificial intelligence and other high-tech tools, though nascent in most hospitals, are raising difficult questions about who makes decisions in a crisis: the human or the machine?
Artificial intelligence is increasingly being used in medical settings. While AI has helped doctors detect abnormalities in imaging scans and transcribe notes for electronic medical records, in some instances it has also taken the place of human decision making. An oncology nurse at UC Davis Medical Center told the Wall Street Journal that when an algorithm says “Your patient looks septic,” she must check the patient even though the algorithm gives no reason for the alert. After fifteen years as an oncology nurse, she can usually tell intuitively whether a patient is septic and whether the risk (and cost) of drawing blood is worth it. Even though she is permitted to override the AI, she will be disciplined if she is wrong. She told the WSJ that she is not demonizing technology but that she feels moral distress when she knows the right thing to do and cannot do it.
by Karen Hao and Deepa Seetharaman, Wall Street Journal, July 24, 2023
In recent years, low-paid workers in East Africa engaged in an often-traumatizing effort to prevent chatbot technology from spitting out offensive or grotesque statements.
Before OpenAI launched ChatGPT, it hired hundreds of content moderators in Kenya to flag and categorize graphic text, which was then used to build a filter to keep ChatGPT from writing obscene and violent material. Several tech companies have used Kenyan workers as content moderators and AI trainers because Kenyans tend to be well educated, proficient in English, and poor. Rights organizations have said that this amounts to exploitation and an infringement of workers’ rights. The content moderators often suffer severe mental health problems and social isolation without adequate compensation or support.
Of course, if bots like ChatGPT learn from what is published online and AI is increasingly used to write online content, the system may become an “existential threat to itself” (“AI Is an Existential Threat to Itself,” The Atlantic).
The Federal Trade Commission opened an investigation into whether OpenAI, the maker of ChatGPT, broke consumer protection laws by using people’s online data to train its product. Several artists and authors, as well as Getty Images, have sued OpenAI, Stability AI, and Meta for copyright infringement (“How Judges, Not Politicians, Could Dictate America’s AI Rules,” MIT Technology Review). Additionally, artists, authors, and publishers are calling on AI companies to obtain permission from, and pay, the creators from whose work they profit (“Outcry against AI Companies Grows over Who Controls Internet’s Content,” Wall Street Journal).
The Summer 2023 issue of The New Atlantis, a journal on technology and society, included a symposium on AI with an article by TNA editor Ari Schulman, “Why This AI Moment May Be the Real Deal.”
by Caitlin Owens, Axios, July 31, 2023
Weight-loss drugs like Ozempic have quickly become a cultural phenomenon and a source of hope for people with obesity and related health conditions. But recent reports of potential side effects, including suicidal thoughts, have raised concerns.
Semaglutide-based weight-loss drugs have lost their luster. In May The Atlantic reported on the “Ozempic burp” (“Beware the Ozempic Burp”), a story that was among the top ten most-viewed posts on bioethics.com; the drug apparently causes a sulfurous-smelling burp in some people. The World Health Organization said these drugs are no “silver bullet” for the global obesity crisis (“Exclusive: Wegovy, Other Weight Loss Drugs ‘No Silver Bullet’, Says WHO amid Obesity Review,” Reuters). The European Medicines Agency is investigating reports of semaglutide-based drugs causing suicidal thoughts in some people (“Europe Is Probing Whether Ozempic Use Raises Risk of Suicidal Thoughts,” Wall Street Journal). And CNN reported that some people taking semaglutide drugs for weight loss are experiencing stomach paralysis and are unable to eat without vomiting (“They Took Blockbuster Drugs for Weight Loss and Diabetes. Now Their Stomachs Are Paralyzed”).
A Reuters exclusive showed that most people stop taking semaglutide drugs within a year of starting them (“Exclusive: Most Patients Using Weight-Loss Drugs Like Wegovy Stop Within a Year, Data Show”), partly because of the side effects and partly because of the exorbitant cost (at least $900 per month). Additionally, the drugs can cause people under anesthesia to regurgitate food, even if they fasted as directed. Many doctors are also concerned that the drugs cause muscle loss in people over 65 and that they may decrease the absorption of some medications.
by Olivia Goldhill, STAT News, July 3, 2023
Some researchers attribute a therapeutic effect to a combination of physical changes in the brain—though where, exactly, is still uncertain—plus personal experiences while high and in the days following psychedelic treatment. . . . Though there are plans to submit trials on MDMA for PTSD to the Food and Drug Administration for approval later this year, none of the medications has yet been definitively proven to work. And the question of how they work is very much up for debate.
Psychedelic drugs such as MDMA, psilocybin (magic mushrooms), ketamine, and LSD have gained popularity as potential treatments for depression and PTSD. Australia became the first country to allow psychedelics to be prescribed for these mental health conditions, and the FDA has laid out guidelines for psychedelic drug trials (“FDA Creates Path for Psychedelic Drug Trials,” Axios). However, these drugs work in ways that are not well understood, and they have caused some people to have horrifying experiences. Furthermore, such drugs often migrate from therapeutic to recreational use, as has been the case among Silicon Valley elites for many years (“Magic Mushrooms. LSD. Ketamine. The Drugs That Power Silicon Valley.” Wall Street Journal).
by Susan Pinker, Wall Street Journal, July 6, 2023
In the paper, published in the journal JAMA Psychiatry in May, Dr. Oskar Hougaard Jefsen of Aarhus University and colleagues showed that people who had previously been diagnosed with cannabis use disorder were almost twice as likely to be diagnosed later with clinical depression. . . . Even more dramatically, the paper also found that people with cannabis use disorder were up to four times as likely to be diagnosed later with bipolar disorder with psychotic symptoms.
A pivotal Danish study examined National Danish Health Registry records for 6.5 million people over the age of 16, covering the years 1995 to 2021, to determine whether cannabis use is linked to mental illness. The study shows that people with cannabis use disorder were almost twice as likely to be diagnosed later with clinical depression and up to four times as likely to be diagnosed with bipolar disorder with psychotic symptoms. Another Danish study showed that 30% of schizophrenia cases among men aged 21–30 could have been prevented in the absence of cannabis use disorder (“Young Men at Highest Risk of Schizophrenia Linked with Cannabis Use Disorder,” NIH). Additionally, another study found that cannabis use during pregnancy may harm the developing infant brain (“Even Early Cannabis Use during Pregnancy Might Shrink Babies,” Gizmodo). These findings are important to consider as cannabis continues to be legalized in more states.
by Hannah Devlin, The Guardian, June 14, 2023
Scientists have created synthetic human embryos using stem cells, in a groundbreaking advance that sidesteps the need for eggs or sperm. Scientists say these model embryos, which resemble those in the earliest stages of human development, could provide a crucial window on the impact of genetic disorders and the biological causes of recurrent miscarriage.
by Philip Ball, Wired, July 23, 2023
Even if it is currently a distant hypothetical prospect, some researchers see no reason why embryo models might not eventually have the potential to develop all the way into a baby. There is no clear scientific or medical reason to allow them to do that, and plenty of ethical and legal reasons not to. But even their use as experimental tools raises urgent questions about regulating them.
Two research groups, one at Cambridge University and the other at the Weizmann Institute of Science in Israel, made human embryo-like entities using embryonic stem cells and induced pluripotent stem cells. These entities were able to grow beyond eight days when given the proper environment and nutrients. While the embryo-like entities lack cells and structures important for further development (i.e., trophoblast cells and the yolk sac), the studies raise a pressing ethical question: how close to an embryo does an entity need to be before legal lines, such as the 14-Day Rule, come into effect?
Heather Zeiger, "Bioethics News Stories (Summer 2023)," Dignitas 30, no. 2 (2023): 25–28, www.cbhd.org/dignitas-articles/bioethics-news-stories-summer-2023.