A simple definition for transformation is “a thorough and dramatic change.”[1] When it comes to changes in medicine, Dickens’ opening line to his famous novel, A Tale of Two Cities, applies. “It was the best of times, it was the worst of times.” I do not need to tell you that healthcare has changed. When you are ill or have a symptom that causes concern, you want to see your personal physician—a wise, trusted, compassionate, brilliant clinician—who has cared for you over many years. He sits in his office chair facing you and asks a few questions about your family and how things are going in general. Then he carefully investigates your story and performs a thorough physical exam. His attention is focused on you alone. You can tell that he is willing to spend whatever time is necessary to get to the bottom of your problem. At least that is what you would like to see.
How was your last doctor’s appointment? For most of us, if we get an uninterrupted 10 minutes with our doctor we consider it a lengthy visit. Of course, instead of looking into the caring eyes of your physician, you get to stare at his back while he spends most of the visit pecking at the computer keyboard.
“So, what brings you in today, Mrs. Johnson?” your physician asks, while he squints at the small screen in front of him.
“My husband died last week, his business went bankrupt, and I am not sleeping well.”
“Mm hmmm. Let me just type that in here. All right. And how are you feeling?” He still has not turned to face you.
“Depressed, alone, like no one really cares or listens.”
“OK. Sure, sure,” he says, without looking to see your pained expression. “Oh, hmm. Look at that,” he says while staring intently at the screen.
“What? What is it?”
“It says here that you haven’t been in for a dental checkup in over 5 years. Is that true?”
“Well, I don’t know. I guess so.”
“All right. Be sure you get in to see your dentist. Now, what else can I help you with?”
Although this dialogue is a bit of an exaggeration, I wonder how many of my patients feel like Mrs. Johnson.
Some would like to return to the days of practice as depicted by the 1970s medical drama Marcus Welby, M.D. We could easily envision a patient noting something to the effect of: “I remember when I could get in to see my doctor anytime I wanted. And he didn’t rush me. He took all the time I needed. I sure wish it was like it used to be.”
Before we become too critical of our present situation, though, we should think a little bit more carefully about what it was really like in the past. We must take into account the progress that has been made in the treatment of all sorts of human ills. In many ways it has been the best of times. If we were to go back in time, where would we want to go?
How about the early 1800s? In Kentucky, in December of 1809, a woman named Jane Todd Crawford was suffering from the effects of a large mass in her pelvis. At that time, it was felt by the leading surgeons in the east that abdominal surgery was impossible. There was good reason for them to believe so. There had never been a successful one. Either the patient died during surgery or later from an abdominal infection.
A Kentucky physician, Ephraim McDowell, was called to help Mrs. Crawford, and she begged him to try surgery in order to keep her from dying a slow and painful death. McDowell had been educated under British and American physicians, but he had never been granted a true medical degree.
After unsuccessfully trying to talk Mrs. Crawford out of surgery, McDowell agreed to do the procedure if she would come to his home in Danville. There, on the kitchen table, using no anesthesia, no IV fluids, no antiseptics, and no sterile drapes or gloves, McDowell removed a 22-pound ovarian tumor. It would be another 40 years before anesthesia was available. Outside his house, a lynch mob waited to hang the poor doctor if he was unsuccessful. Mrs. Crawford sang hymns during the 25-minute procedure. Five days later she was up making the bed. She lived another 32 years. In accounting for the operation’s success, McDowell wrote, “I can only say that the blessing of God has rested on my efforts.”[2] His accomplishments stunned the medical establishment back in the east. He had a sterling reputation in Kentucky, but, of course, that made no difference to the Ivy League-trained physicians who regarded the skills of a country physician in Kentucky as not much better than those of a local barber. They would not accept that the first successful abdominal surgery was performed by this back-country doctor. In 1830 McDowell died from appendicitis. It would be another 60 years before appendectomies were considered appropriate treatment for an inflamed appendix.
OK, so maybe you do not want to go that far back. But how far back would you go? A hundred years? Life expectancy in 1900 was 47 years.[3] Today the average man lives to 77, and the average woman to 81.[4] In 1900, pneumonia, tuberculosis, and diarrhea caused one third of all deaths, and over 30% of all deaths occurred in children younger than 5. Now only 1.4% of deaths occur in small children.[5] With vaccines, the implementation of public health measures, and antibiotics, we are no longer under the domination of viruses and bacteria in this country.
So, maybe you would not want to go back to the early 1900s. How about the 1990s? Maybe you could go back just 20 years. A report was published a little over a year ago in which researchers analyzed data on just over 1 million patients in the U.S. diagnosed with cancer of the breast, colon or rectum, prostate, lung, liver, pancreas, or ovary from 1990 to 2009. They found that the odds of survival increased significantly for many patients. For example, consider patients 50 to 64 years old. Patients from this age group diagnosed with colon and rectal cancer from 2005–2009 had a 43 percent lower risk of death, compared with similar patients diagnosed from 1990–1994. The reduction in risk of death was 52 percent for breast cancer, 39 percent for liver cancer and 68 percent for prostate cancer in 2005–2009, compared to 1990–1994.[6] Those numbers are even better in 2016.
Look how far we have come and what is available today that previous generations knew nothing about: vaccines, antibiotics, dialysis, cancer treatment, angioplasty, coronary artery bypass, and organ transplants. What is my point? If it were not for the dramatic changes that have occurred in healthcare, many of our family members and friends would not be here with us now. In many ways we truly are in the best of times.
Unfortunately, the story is not just a positive one, is it? In many ways it has also been the worst of times. Just a brief review of the last century provides a stark reminder that the potential for cruelty and inhumane treatment of our fellow man is always lurking. The ghastly experiments and extermination of the Jews and others would not have occurred without the cooperation of the medical community in Germany. But, it was not just Germany. Forced sterilizations were carried out in this country, up until the 1980s, backed by an 8–1 decision of our Supreme Court in Buck v. Bell. And, in the 1930s, the Public Health Service began a study in which it observed and recorded the effects of syphilis on 400 African American men in rural Alabama over a period of 40 years. None of the men infected were ever told they had the disease, and none were treated with penicillin even after the antibiotic was proven to be successful in treating syphilis.[7]
In 2007, the Commonwealth Fund conducted a large survey comparing the healthcare attitudes and experiences of people across seven countries: Australia, New Zealand, the United Kingdom, Germany, the Netherlands, Canada and the United States. Of the seven countries, Americans were the least likely to report being “relatively satisfied” with their healthcare.[8] What are the biggest problems with healthcare?
Number 1: the cost is too high. A 2015 study by The Commonwealth Fund found that although the U.S. healthcare system is the most expensive in the world, it ranks last on most dimensions of performance when compared with 12 other leading industrial nations.[9]
I am not a healthcare finance expert, so I do not know if we are spending too much on healthcare. For instance, is 17% of the GDP too high a societal cost?[10] We could also debate whether or not healthcare in our country is truly worse than that of Norway. What I do know is that out-of-pocket costs for healthcare for a family of four can be as much as $12,000 per year.[11] Compare that to the median household income of $55,000 and you will quickly see the problem.[12] Nearly two-thirds of all bankruptcies are linked to inability to pay medical bills.[13]
Furthermore, hospital costs are almost impossible to understand. How many of you have ever been treated with IV fluids? A bag of saline contains a liter of sterile water and about two teaspoons of salt. It costs about 75 cents to produce. By the time it makes its way from the manufacturer to the IV pole and into your arm, that 75 cents magically converts into approximately $91 per liter.[14] Try getting someone to explain that to you.
But, it is not just the cost, it is access, too. Despite the Affordable Care Act, over 11% of the population remains uninsured, and that does not count the millions of illegal immigrants.[15] If you live in the inner city or in rural America, you will experience great difficulty finding a primary care physician. Specialists are even more scarce. As the population continues to age, more physicians and nurse practitioners will be needed to care for senior citizens. Severe physician shortages are predicted by 2025.[16] And, for the tens of millions who need mental health services, adequate insurance coverage is almost non-existent.
There are also healthcare disparities. The CDC reports that
residents in mostly minority communities continue to have lower socioeconomic status, greater barriers to health-care access, and greater risks for, and burden of, disease compared with the general population living in the same county or state. Both the 2012 National Healthcare Disparities Report and the 2012 National Healthcare Quality Report found that almost none of the disparities in access to care are improving.[17]
As a result, large numbers of our population are not seeing the benefits of modern healthcare that the rest of us experience. For example, cardiovascular disease is the leading cause of death in the United States. Non-Hispanic black adults are at least 50% more likely to die of heart disease or stroke prematurely than their non-Hispanic white counterparts.[18] Or, consider a second example. Infant mortality rates for non-Hispanic blacks are more than double the rate for non-Hispanic whites. Rates also vary geographically, with higher rates in the South and Midwest than in other parts of the country.[19] And this does not even address disparities in healthcare in developing countries. Indeed, it has not been the best of times for everyone.
It was the best of times. It was the worst of times. That is the status of healthcare today.
Robert Wachter, the former chair of the American Board of Internal Medicine, wrote The Digital Doctor, a fascinating account of the evolution of information technology in medicine. Based on his research and interviews with 100 experts in almost every field that touches on medicine, he paints a picture of what the future of medicine will look like.[20]
In the future, there will be far fewer hospitals, because most patients will get their care at home or in “less intensive community-based settings.”[21] The few remaining hospitals will organize under “major national brands,” where patients will go for major surgeries and receive treatment for critical illnesses. Each bed in these facilities will be designed with the necessary technology to care for critical patients, eliminating the need for a separate intensive care unit. The hospital rooms themselves will each have “wall-sized video screens” and high-resolution cameras capable of extreme close-ups, so that a physician can perform examinations and communicate with family members, whether they are in the room or joining from remote locations.[22]
Wachter goes on to suggest that no call button will be needed. “A patient will simply say, ‘Nurse, I’m in pain,’ and the nurse will appear on the screen, discuss the issue and increase the pain medicine if necessary. None of this will require the nurse to enter the room—a computer-entered order will adjust the IV infusion pump automatically.”[23] Pills will be delivered by robots.
He predicts that despite these technological changes, physicians will still make bedside rounds, but with the added benefit that whenever a nurse, physician, or technician walks into the room, that person’s name and credentials will immediately appear on the screen. Such changes will allow consultations to be arranged quickly and conducted by videoconference with the best available specialist, regardless of whether the consultant is in the same building or even the same state.
The electronic health record, too, will evolve in this new paradigm of medicine. Notes will be added primarily through speaking, rather than typing and clicking. And, rather than having each nurse, therapist, and consultant repeat the same information in independent entries in the record, the notes “will be a living document . . . more like a Wikipedia page” that will be collaborative and easily accessible.[24] To allow for that, billing requirements of course will change as well.
Most primary care will occur at home. A mom with a child who has an earache will be able to look in the child’s ear and beam the image to a nurse-practitioner or physician who will diagnose it and prescribe a treatment. Patients with chronic diseases will have multiple devices at home to monitor fluid status, vital signs, and blood tests. In fact, Wachter anticipates that many of today’s blood tests will actually be replaced by skin sensors.[25]
The technology will also help increase patient compliance. Throughout the day, the patient’s personal computer will deliver verbal reminders to take medications, follow the prescribed diet, and do the recommended exercises. Patients will be questioned at different times during the day to find out how they are doing. Rarely will a physician have to be directly involved, and most visits will be done remotely through video.[26]
With all of these developments, Wachter believes that finding new cures and treatments will occur more rapidly, because medical research itself will be transformed. Determining the best treatment for high blood pressure, high cholesterol, leukemia or any other medical illness will no longer require expensive and prolonged clinical trials involving only a few hundred subjects. Information technology will allow researchers to have access to vast amounts of data on millions of patients. Tests and treatments from around the world will be analyzed almost immediately, and those with the best outcomes will be identified and “fed back into the delivery system to influence guidelines and protocols.”[27]
How accurate are Wachter’s predictions? Time will tell. What appears to be inevitable is that medicine will continue to change, rapidly. There is no going back. But just as sure as the inevitability of change in medicine is this truth: If we are to prevent the mistakes of the past and ensure that the medicine of the future promotes human flourishing, some things must not change. Let me describe two fundamental principles that must not change.
First, we must hold fast to the moral foundation of our profession.
We must continue to ask, and correctly answer, the question: What is medicine? What is its purpose? Is medicine simply a commercial enterprise in which highly skilled technicians sell their services to their customers for an agreed-upon price? If that is the basis for the future practice of medicine, then the goal of the medical school admissions committee will simply be to admit only those individuals who have the highest GPAs and MCAT scores and the greatest hand-eye coordination. Intuitively, we know that medicine is about more than a contractual arrangement in the marketplace. To explain what we know in our gut we must go back to the beginning, to the oath developed by a small group of Greek physicians in the 4th century B.C.
Physicians who took the Hippocratic Oath swore to the gods that they would honor and care for their teachers, avoid prescribing poison or abortions to those who asked for them, and live and practice with integrity. Most physicians today regard the oath as an interesting artifact of history, but not much more. They no longer recite it other than for tradition’s sake. But to see the significance of the oath, you must look past some of the obscure wording and references to mythological beings, to the heart of the message. A bit of historical context helps.
We know that the Hippocratic Oath did not reflect the way most physicians practiced in ancient Greece. In fact, the oath was the work of a minority of physicians whose ethics stood in stark contrast to the practice of medicine in that day. They were part of a reform movement. As Allen Verhey said,
For centuries before the oath, ancient physicians had provided poison for those whom they could not heal, had counted abortifacients among the tools of their trade, and had been disposed to the use of the knife instead of the less invasive use of dietetics and pharmacology. Moreover, they had sometimes been guilty of injustice and mischief toward their patients, and sometimes quite shamelessly broken confidences.[28]
What began as a call for reform by a minority of Greek physicians spread throughout the ancient world, even before the rise of Christianity, and, whether we realize it or not, still forms the foundation for medical practice today. What was it about the oath that was so compelling that its principles eventually dominated the practice of Western medicine? It is this: the essence of medicine is a moral commitment, not a business deal and not merely an application of skillful techniques. A moral commitment to what? Nigel Cameron explains that the power of the oath lies in its conviction that the physician is a healer. The third paragraph of the Oath begins, “I will use treatment to help the sick according to my ability and judgment, but never with a view to injury and wrong-doing. Neither will I administer a poison to anybody when asked to do so, nor will I suggest such a course. Similarly, I will not give to a woman a pessary to cause abortion.”[29]
Although the principle of “first, do no harm” is not explicitly stated in the oath, it might as well have been, because the oath spells out the two fundamental harms of euthanasia and abortion as well as other more general harms. In fact, the “prohibition of the medical harms,” says Cameron, “more than all else, sets the practice of Hippocratism apart from that of any other kind of medicine.”[30] The physician “binds herself irrevocably to a medical practice which excludes participation in the taking of human life.”[31] The Hippocratic tradition is a healing tradition.
The moral foundation of medicine is the acceptance of the truth that a physician has an obligation to heal and not to harm. And that obligation to heal is derived from the simple fact that the sick need a physician. Edmund Pellegrino explains that without some significant measure of health, human beings cannot flourish, and that “Those who are ill . . . suffer insult to their whole being.”[32] The sick person is threatened by death or disability, pain and limitation, and finds herself in a completely vulnerable state, regardless of her political or social status. She is at the mercy of the integrity, competence, or motivation of others, most of whom are strangers. These undeniable facts about those who are sick are the basis for the physician’s duty as a healer.[33]
As medicine moves into the future, we must hold fast to the moral foundation of our profession. We are to heal and not harm.
Second, we must defend the view of the inherent dignity of human beings.
As is commonly known, the field of bioethics emerged from the debates over human dignity and human rights that occurred in the aftermath of World War II. Two years after the Nuremberg Doctors Trial, in 1948, the United Nations General Assembly adopted the Universal Declaration of Human Rights which proclaimed that “recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice and peace in the world.”[34] The principles of that document are based on a particular understanding of human dignity: inherent dignity—dignity that is present just by being human.
The concept of inherent human dignity, however, faces a serious battle in bioethics, and the outcome of that battle will determine the future direction of medicine and biotechnology. In appealing to the need to defend a proper view of human dignity, I want to frame my comments with a question: What makes human beings valuable? How we answer that question will determine whether we use biotechnology to treat human beings with dignity and full moral respect, or as creatures of relative value, ultimately dispensable for the benefit of the greater good. Recent history provides us with sobering examples of what happens when human beings are treated as less than human.
What makes human beings valuable? In general, we answer the question in one of two ways. Some of us hold that human beings are valuable simply because they are human beings. This view is consistent with the words of the UN Declaration and is referred to as the substance view of personhood. Others value human beings for what they can do. This is the functional view of personhood.
The substance view of personhood contends that a human being is intrinsically valuable just because of the kind of entity it is, not because it possesses any particular set of qualities or characteristics. To describe an organism’s substance is to discuss its nature or essence and to distinguish the kind of thing it is from the different qualities or traits it might possess accidentally. In their book Embryo, Robert George and Christopher Tollefsen use the example of a dog to help explain the substance view of persons.
Your family dog began to exist when that specific dog began to exist, and he will cease to exist when that specific dog dies. He did not begin to exist when he developed his teeth, or when he first moved into your home, or when he started responding to the name “Jake.” Although he will develop into a more mature dog over the years—he will become larger, run faster, eat more—he will never be more of a dog than he was when he first began to exist. His substance is “dog.” His accidental qualities are things like hair and teeth and the ability to run, and even if he never developed some of these accidental characteristics, he would still be the same dog. George and Tollefsen note that when a family speaks about their dog, they do not describe him as a “something” with hair and teeth, but as a dog. They recognize him as the same dog when he is fifteen years old as he was when he was two, even though much has changed about him in the intervening years.[35]
In the case of human beings, the implications are clear. As adults, we are the same entity as when we were embryos. We did not become human beings when we started to talk or perform arithmetic, but when we first began to exist. Since the moment we were conceived there has been no change in our substance, our essence. Over time, we develop into more mature human beings and acquire a large variety of accidental characteristics according to the kind of entity we are. We become aware of our surroundings, grow hair and teeth, and learn to communicate and reason, but we never become more human than we were as embryos. Our substance is “human,” and we remain the same human being from the point of conception until we die.
Supporters of the functional view of personhood claim that human beings become valuable once they acquire certain characteristics or abilities. Philosopher Mary Anne Warren typifies this perspective when she says that a person, as distinguished from a human being, must have the following traits: 1) consciousness; 2) the ability to reason; 3) self-motivated activity; 4) the capacity to communicate; 5) the presence of self-awareness. Since the fetus, even at eight months, lacks these qualities, Warren concludes:
that if the right to life of a fetus is to be based upon its resemblance to a person, then it cannot be said to have any more right to life than, let us say, a newborn guppy, and that a right of that magnitude could never override a woman’s right to obtain an abortion, at any stage of her pregnancy.[36]
There are many reasons to reject a functional view of personhood, but let me give you just one. The functional definition leads to a conclusion that most people simply cannot accept. Since infants lack most of the qualities said to be required of personhood, infanticide would not be inherently immoral, particularly if killing the infant leads to benefit for others. Such an inference, however repugnant, is logically consistent with the functional view. Philosopher Peter Singer writes that,
we should put aside feelings based on the small, helpless, and—sometimes—cute, appearance of human infants. To think that the lives of infants are of special value because infants are small and cute is on a par with thinking that a baby seal, with its soft white fur coat and large round eyes deserves greater protection than a gorilla, who lacks these attributes. . . . If we can put aside these emotionally moving but strictly irrelevant aspects of the killing of a baby, we can see that the grounds for not killing persons do not apply to newborn infants.[37]
The definition of personhood, while obviously important in the debates over abortion and embryonic stem cell research, is just as important in discussions over the morality of end-of-life issues like physician-assisted suicide. Advocates for physician-assisted suicide use the functional view of personhood to argue that helping a terminally ill human being commit suicide does not violate human dignity because the terminally ill individual is no longer a person.
To adopt a functional definition of personhood is to open the door to the concept that it is acceptable to use human beings as mere instruments for the greater good. To hold to a substance view, however, is to recognize that there is no distinction between a human being and a person. Human beings, as the UN Declaration says, are valuable, not because of what they can do, but because of who they are.
As we cautiously look forward to the benefits of the transformations in healthcare that will take place in the next 20 years, we must hold on to these foundational principles in order to keep us from losing our direction and being swept away by the promise of progress.
I close with the words of Thomas Sydenham, a physician in the late 1600s, whose methods of investigation into the sickness of his patients earned him the title of the English Hippocrates.
It becomes every man who purposes to give himself to the care of others, seriously to consider the four following things: — First, that he must one day give an account to the Supreme Judge of all the lives entrusted to his care. Secondly, that all his skill and knowledge and energy as they have been given him by God, so they should be exercised for His glory and the good of mankind, and not for mere gain or ambition. Thirdly, and not more beautifully than truly, let him reflect that he has undertaken the care of no mean creature, for, in order that he may estimate the value, the greatness of the human race, the only begotten Son of God became himself a man, and thus ennobled it with His divine dignity, and, far more than this, died to redeem it. And, fourthly, that the doctor, being himself a mortal man, should be diligent and tender in relieving his suffering patients, inasmuch as he himself must one day be a like sufferer.[38]
[1] English Oxford Living Dictionaries, “Transformation,” https://en.oxforddictionaries.com/definition/us/transformation (accessed January 19, 2017).
[2] James Thomas Flexner, Doctors on Horseback: Pioneers of American Medicine (New York: Fordham University Press, 1992 [1937]), 158.
[3] Rebecca Tippett, “Mortality and Cause of Death, 1900 v. 2010,” Carolina Demography, Carolina Population Center, UNC-Chapel Hill, June 16, 2014, http://demography.cpc.unc.edu/2014/06/16/mortality-and-cause-of-death-1900-v-2010/ (accessed January 17, 2017).
[4] Sherry L. Murphy et al., “Mortality in the United States, 2014,” NCHS Data Brief 229 (December 2015), https://www.cdc.gov/nchs/data/databriefs/db229.pdf (accessed January 17, 2017).
[5] Institute of Medicine Committee on Palliative and End-of-Life Care for Children and Their Families, When Children Die: Improving Palliative and End-of-Life Care for Children and Their Families, ed. Marilyn J. Field and Richard E. Behrman (Washington, D.C.: National Academies Press, 2003), 42.
[6] Chenjie Zeng et al., “Disparities by Race, Age, and Sex in the Improvement of Survival for Major Cancers: Results From the National Cancer Institute Surveillance, Epidemiology, and End Results (SEER) Program in the United States, 1990 to 2010,” JAMA Oncology 1, no. 1 (2015): 88–96.
[7] CDC, “U.S. Public Health Service Syphilis Study at Tuskegee,” December 30, 2013, https://www.cdc.gov/tuskegee/timeline.htm (accessed January 19, 2017).
[8] Cathy Schoen et al., “Toward Higher-Performance Health Systems: Adults’ Health Care Experiences in Seven Countries, 2007,” Health Affairs 26, no. 6 (2007): w717–w734.
[9] David Squires and Chloe Anderson, “U.S. Health Care from a Global Perspective: Spending, Use of Services, Prices, and Health in 13 Countries,” The Commonwealth Fund, October 2015, http://www.commonwealthfund.org/publications/issue-briefs/2015/oct/us-health-care-from-a-global-perspective (accessed January 19, 2017).
[10] Ibid.
[11] Gary Claxton, Matthew Rae, and Nirmita Panchal, “Consumer Assets and Patient Cost Sharing,” The Kaiser Family Foundation, March 11, 2015, http://kff.org/private-insurance/issuebrief/consumer-assets-and-patient-cost-sharing/ (accessed January 19, 2017).
[12] Kirby G. Posey, “Household Income: 2015,” American Community Survey Briefs, September 2016, https://www.census.gov/content/dam/Census/library/publications/2016/demo/acsbr15-02.pdf (accessed January 16, 2017).
[13] David Himmelstein et al., “Medical Bankruptcy in the United States, 2007: Results of a National Study,” The American Journal of Medicine 122, no. 8 (2009), doi:10.1016/j.amjmed.2009.04.012.
[14] Nina Bernstein, “How to Charge $546 for Six Liters of Saltwater,” The New York Times, August 25, 2013, http://www.nytimes.com/2013/08/27/health/exploring-salines-secret-costs.html (accessed January 16, 2017).
[15] Steven P. Wallace et al., “Undocumented and Uninsured: Barriers to Affordable Care for Immigrant Populations,” UCLA Center for Health Policy Research, August 2013, http://www.commonwealthfund.org/~/media/Files/Publications/Fund%20Report/2013/Aug/1699_Wallace_undocumented_uninsured_barriers_immigrants_v2.pdf (accessed January 16, 2017).
[16] Lenny Bernstein, “U.S. Faces 90,000 Doctor Shortage by 2025, Medical School Association Warns,” The Washington Post, March 3, 2015, https://www.washingtonpost.com/news/to-your-health/wp/2015/03/03/u-s-faces-90000-doctor-shortage-by-2025-medical-school-association-warns/?utm_term=.dd302925c6a0 (accessed January 16, 2017).
[17] Pamela A. Meyer, Paula W. Yoon, and Rachel B. Kaufman, “Introduction: CDC Health Disparities and Inequalities Report,” Morbidity and Mortality Weekly Report 62, Supplement 3 (2013): 3–5, https://www.cdc.gov/mmwr/preview/mmwrhtml/su6203a2.htm (accessed January 23, 2017).
[18] CDC, “Coronary Heart Disease and Stroke Deaths—United States, 2009,” in: “CDC Health Disparities and Inequalities report—United States, 2013,” Morbidity and Mortality Weekly Report 62, Supplement 3 (2013): 155–158.
[19] CDC, “Infant Deaths—United States, 2005–2008,” in: “CDC Health Disparities and Inequalities Report—United States, 2013,” Morbidity and Mortality Weekly Report 62, Supplement 3 (2013): 169–172.
[20] Robert Wachter, The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age (New York: McGraw-Hill, 2015), 257ff.
[21] Ibid., 258.
[22] Ibid., 258–259.
[23] Ibid., 259.
[24] Ibid., 260.
[25] Ibid., 261.
[26] Ibid., 262–263.
[27] Ibid., 263–264.
[28] Allen Verhey, “The Doctor’s Oath—And a Christian Swearing It,” in On Moral Medicine: Theological Perspectives in Medical Ethics, 3rd ed., ed. M. Therese Lysaught and Joseph J. Kotva, Jr. (Grand Rapids: Eerdmans, 2012), 225.
[29] Hippocrates of Cos, “The Oath,” Loeb Classical Library 147 (1923): 298–299, doi:10.4159/DLCL.hippocrates_cos-oath.1923 (accessed January 17, 2017).
[30] Nigel M. de S. Cameron, The New Medicine: Life and Death After Hippocrates (Wheaton, IL: Crossway Books, 1991), 60.
[31] Ibid.
[32] Edmund D. Pellegrino, The Philosophy of Medicine Reborn: A Pellegrino Reader, ed. H. Tristam Engelhardt, Jr. and Fabrice Jotterand, (Notre Dame, IN: University of Notre Dame Press, 2008), 94.
[33] Ibid., 94–97.
[34] UN General Assembly, “Universal Declaration of Human Rights,” 217 (III) A (Paris, 1948), http://www.un.org/en/universal-declaration-human-rights/ (accessed January 10, 2017).
[35] Robert P. George and Christopher Tollefsen, Embryo: A Defense of Human Life (New York: Doubleday, 2008), 59–60.
[36] Mary Ann Warren, “Mary Ann Warren Argues that Fetuses Don’t Qualify as Persons with a Right to Life,” in Contemporary Moral and Social Issues: An Introduction through Original Fiction, Discussion, and Readings, ed. Thomas D. Davis (West Sussex: John Wiley and Sons, 2014), 264.
[37] Peter Singer, Practical Ethics, 3rd ed. (Cambridge: Cambridge University Press, 2011), 152.
[38] Thomas Sydenham, “The Doctor,” in On Moral Medicine: Theological Perspectives in Medical Ethics, 2nd ed., ed. Stephen E. Lammers and Allen Verhey (Grand Rapids: Eerdmans, 1998), 145.
Cheyn D. Onarecker, “Transformations in Care,” Dignitas 23, no. 4 (2016): 1, 4–9.