Primum Non Nocere and the Hippocratic Oath

Unless you’ve been under a rock or on a boat in the middle of the ocean,[1] you’re aware that the United States is in the middle of a measles outbreak that has, so far, infected over 100 people and has been traced back to December visits to Disneyland.

There’s been a lot of chatter lately about encouraging vaccine adherence, lawsuits,[2] and so on. And, in the ways of the world, in the last 24 hours people have suddenly shifted to discussing what the Hippocratic Oath says, whether primum non nocere (“first, do no harm”) is part of the Oath, and what that means for doctors who peddle anti-vaccine beliefs (in particular, charming Arizona cardiologist and vaccine refuser Jack Wolfson).

As I mentioned on Twitter this morning, this would be a really convenient time to have around someone with a piece of paper saying they have a degree in medical history. (Hi.) So, a quick summary and expansion of this morning’s question and answer:

Is “primum non nocere” part of the Hippocratic Oath?

No, not in the original versions of the Oath that we have. This isn’t to say that the idea behind what we would now call the principle of non-maleficence isn’t written into even the earliest examples of the Oath, merely that the particular phrasing doesn’t show up. What the early versions of the Oath do contain are phrases like “abstain from harm,” which is pretty close. The phrase “do good and do no harm” does occur in another part of the Hippocratic Collection, the Epidemics.

So what’s the origin of the phrase primum non nocere?

Good question, and one that many people have made dissertations and other research projects out of. The last I read about this (admittedly a few years ago), the general consensus seemed to be that the specific phrase first entered the American medical lexicon in the mid-1800s, in reference to an earlier medical textbook.

What’s important here, though, at least in terms of talking about contemporary non-maleficence and beneficence, is that the concept behind “do no harm” (regardless of phrasing) has been a part of medicine for a very long time. This is one of the reasons the concept of “not cutting for stone” is in the Hippocratic Oath: removal of bladder stones (the stone being cut) in men used to be a rather brutal, bloody, and deadly procedure, and thus was left to the barber-surgeons rather than the more refined doctors.

That said, I’d also say it’s equally important not to place too much emphasis on the Hippocratic Oath. While it is an incredibly important piece of medical history, it also banned surgery (not just cutting for stone), providing abortions, and providing deadly medications. Those trained in medicine were expected to train their own sons in medicine, as well as the sons of their teacher – and tuition? Not a thing. Oh, and don’t forget swearing fealty to Apollo. (I wouldn’t want anyone who is anti-choice or anti-euthanasia for religious reasons to get too excited here.)

And of course, all of this ties into the last, and most common, question about the Oath: is the Hippocratic Oath actually a legally binding oath? At least in America, no.

What the Hippocratic Oath is, in many ways, is another living document, one that is frequently revised to reflect contemporary views – which is why the bits about leaving surgery to the professionals have been taken out – and that still contains elements considered essential to the art/techne of medicine for roughly 2,500 years. It is a wonderful part of the history and lineage of medicine, connecting what was to what is. What it is not is a place to look for legalistic or even moral answers to contemporary medico-social issues.

  1. True story: I’ve known major news stories to break while people were on a research cave trip, and while they were on a no-internet-except-for-work research cruise in the middle of the ocean, so apparently this happens more often than you’d think.
  2. I highly recommend Dorit Rubinstein Reiss’s paper on this, and am indebted to J.H. for pointing me to it.

Screening vs Diagnostic – Differentiating Difficulties Lead to Tragedies

I’ve been a relatively vocal critic of unregulated over-the-counter (OTC) and direct-to-consumer (DTC) screening kits for years, and more so in the last few, as 23andMe flirted with the DTC genetic screening market. I felt (and still believe) that yanking the 23andMe kits was necessary because they’d not been validated and had no oversight or FDA approval. Perhaps not surprisingly, the most common pushback I received on this[1] was that no one would actually use an OTC, DTC, or otherwise unregulated test to make decisions.

This Boston Globe story, by Beth Daley at the New England Center for Investigative Reporting, helps prove my point: people do make life-changing decisions based on the results of screenings and unregulated tests, rather than on diagnostic tests. That is a very big problem, and, as the math sketched after the excerpt shows, often an incredibly tragic one:

Now, evidence is building that some women are terminating pregnancies based on the screening tests alone. A recent study … found that 6.2 percent of women who received test results showing their fetus at high risk for a chromosomal condition terminated pregnancies without getting a diagnostic test such as an amniocentesis.

And at Stanford University, there have been at least three cases of women aborting healthy fetuses that had received a high-risk screen result. …

In one of the three Stanford cases, the woman actually obtained a confirmatory test and was told the fetus was fine, but aborted anyway because of her faith in the screening company’s accuracy claims. “She felt it couldn’t be wrong.”
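To see why a “high risk” screen result so often turns out to be wrong, here’s a minimal Bayes’ theorem sketch in Python. The sensitivity, specificity, and prevalence numbers are hypothetical placeholders chosen for illustration, not figures from any actual screening test’s validation data:

```python
# Why a "high risk" screening result is not a diagnosis: a Bayes'
# theorem sketch. The sensitivity, specificity, and prevalence used
# below are illustrative placeholders, not any real test's numbers.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability the condition is actually present, given a positive screen."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical screen: 99% sensitive, 99.9% specific, for a condition
# affecting 1 in 2,000 pregnancies.
ppv = positive_predictive_value(0.99, 0.999, 1 / 2000)
print(f"Chance a 'high risk' result is a true positive: {ppv:.0%}")
# -> roughly 33%: about two of every three positives are false alarms,
#    which is exactly why a diagnostic test like amniocentesis matters.
```

The rarer the condition being screened for, the worse the positive predictive value gets, no matter how impressive “99% accurate” sounds in the marketing copy. That, in a nutshell, is the difference between a screening result and a diagnosis.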

And no, these screening kits aren’t subject to regulation, because yay, loopholes. Expect those to be closed in, oh, nine years, give or take.

It’s always nice to have another point of data to support an argument.

And yes, possibly I’m humming a revised version of a song from West Side Story as I idly think about sending this link to the people who told me there was just no way anyone would make life-changing choices without doctor feedback or approval. I feel petty, oh so petty, I feel petty and witty and bright…

  1. Well, possibly second-most. I did receive a lot of “it’s my DNA and I’ll do what I want with it” retorts, too.

OutbreakChat: A Livetweet of a Movie That Gives People Nightmares…

…and probably not for the reason you think. Outbreak is one of those movies people seem to either love or hate (or possibly love to hate); almost everyone I know who has anything to do with public health, infectious diseases, or virology tends to swear up a blue storm when the movie comes up.

So naturally, a group of us are going to watch it in real time tonight, drinking and live-tweeting our thoughts on Twitter. This will include fact-checks, snark, and almost certainly questions and answers from the crowd at large. Who’s doing this? Well, you might remember David Shiffman (@whysharksmatter) from my Virtually Speaking Science interview a few months ago; while he might seem like an odd choice to organize this, remember that he has significant experience with pop culture and movie portrayals of sharks, mermaids, and other scientifically incorrect visions of the ocean.

Tara Haelle (@tarahaelle) is a freelance journalist probably best known for her excellent article that debunks flu myths. She’s written extensively on science and the need for accuracy in media imagery and discussion.

Nicholas Evans (@neva9257) is a post-doctoral bioethicist at the University of Pennsylvania’s Department of Medical Ethics and Health Policy, based in the Perelman School of Medicine. He specializes in biosecurity, bioterrorism, and the ethics of pandemic preparedness, and recently wrote a piece for Slate explaining why Ebola is not a bioweapon, despite media myths. (He’s also my husband.)

And what am I (@rocza) doing involved in this? Well, aside from spending much of the last couple of months educating Twitter about Ebola, blogging extensively about Ebola, and appearing on Justice Putnam’s “The Morning After” radio show to talk about the ethics of science journalism and Ebola coverage, I once upon a time was pursuing a PhD in bioethics and philosophy, looking at how popular media portrayals of medical issues affect our medical decision-making (a continuation of my undergraduate thesis on autonomy and medical ethics). I’ve taught courses through pop culture (Stargate and Applied Ethics), and one of my most popular and invited lectures was on why we watch reality TV. I also have a weird affinity for Ebola; I once intended to become a virus hunter, and I’ve been studying Ebola, outbreaks, and the related research for going on 20 years.

We are, of course, hoping more people will join in the viewing party – experts and laypeople alike. So pop some popcorn, grab your favourite beverage, and join us at 8pm ET tonight (#OutbreakChat) to see firsthand what set the foundations for the Ebolanoia that has raced through the world these past few months.

Edited to add: Bingo cards are available on Twitter.

Help Stop Ebola with this One Simple Trick!*

I mean, other than donating to aid organizations that desperately need help, that is.

See, yesterday it was revealed that yet another Western patient is being treated with ZMapp. Yep, that experimental drug the world supposedly ran out of last week. Except, apparently, when there’s a Briton involved, in which case someone checked behind the couch cushions, the NIH thought to look in an unused cold-storage closet, or who knows – because that’s the problem. The world now knows that British nurse Will Pooley received at least one dose of ZMapp and will receive more, and no one has explained how the Royal Free Hospital happened to stumble across these doses that theoretically didn’t exist. In fact, all they’re saying is:

[T]he team treating the nurse had sourced the drug through its clinical networks with the help of international colleagues.

Well, that’s not at all suspicious. Clinical networks? International colleagues? Sure, that doesn’t sound at all sketchy.

See, the thing is, this comes back to risk communication, international relations, and the people who are dying en masse in affected countries after being told that there is just no drug left. When you say “nope, sorry, no drugs left, we are completely out of ZMapp” and then manage to suddenly find some when a white British guy needs it, you foster a climate of mistrust – something that’s already a huge issue, and one that doesn’t need more fuel on the fire.

Which is why, at this point, when these random, unaccounted-for surprise stores of ZMapp are discovered, there needs to be transparency about where they came from, why we didn’t know about them, and why they were suddenly found. Because otherwise, it sure looks like the double standard of treatment for Westerners versus native West Africans is continuing.

(*How does this help to actually stop Ebola? Right now, one of the bigger issues in countries like Liberia and Sierra Leone is a complete lack of trust in the Western health care workers who are trying to help. Reinforcing the idea that there is a cure for Westerners, when people in Liberia, Sierra Leone, and Guinea have been repeatedly told there isn’t a cure for them, only emphasizes this lack of reason to trust – and trust is an extremely crucial step toward all of the very basic things that need to be done to stop this outbreak from spreading any further. At this point, I’m leaning pretty hard toward it being unethical for doctors or journalists to report on ZMapp use without also identifying the source of the drug.)

If I’m Gonna Drop Anything, It’ll Be Bricks, Not Names

I really hate having to justify myself. I hate having to roll out “credentials” and be constantly challenged on whether or not I have the “right” to discuss philosophy or ethics, on why I am actually offering a bit more than an “opinion,” or, the recent favourite, on whether I’m just talking about these things because my husband is a postdoc at Penn.

I hate it even more when I see how people treat Nick – even before his affiliations were made public, no one asked him to justify his credentials. No one asked if he had the right to offer opinions; in fact, few took what he said as opinions. Oh sure, he gets the MY SCIENCE FACTS crowd, but that’s the crowd arguing the validity of ethics as a field, not the validity of Nick discussing ethics.

No one has suggested that he writes about ethics, or thinks he’s able to do so, because of who he is married to.

Some people have suggested that it’s because I don’t specifically call myself an ethicist or bioethicist in my Twitter profile, which is true. I have some issues there; in particular, I don’t want people to make the mistake of assuming I have a PhD, because I don’t.[1]

But that doesn’t mean I don’t have an education, because I do. I started off studying human psychology and comparative religions, and got about halfway through a dual degree when I had to relocate to another state, putting my education on hold. When I went back to school, it was with an eye towards either communication or epidemiology; I ended up in a strange interdisciplinary department at the University of Washington, the Comparative History of Ideas. My mentor had a degree in the History and Philosophy of Science, and I studied that, with a heavy emphasis on continental philosophy and anthropology, as well as medical history and ethics, in what was, at the time, the Department of Medical History and Ethics. They only offered a minor for undergraduates, but because of my major and my interest, I was allowed to take as many courses as I could, which ended up being the equivalent of what the Master’s students took.

During that time, I also started writing about pop culture and ethics for “the school newspaper” – which happened to be the third-largest paper in Seattle at the time. I started guest blogging, then actually writing for other bioethics-related blogs, and I started giving invited talks on subjects I’d written about.

My thesis, which neared the length of a dissertation, was required for graduating with honors (which I did, at both the department and university level). Relying heavily on continental philosophers you’ve never heard of, I made an argument against the primacy of autonomy and proposed an affect-centered ethic to take its place.

I went to graduate school, where I ended up writing for yet another bioethics blog. I worked in a bioethics research institute as a research assistant. I learned how to edit academic papers while working at an academic journal, where I also learned how to run an academic journal. I learned how to talk to the media, how to give interviews, how to evaluate timely and relevant topics. I learned how to write about complicated and serious issues in an accessible manner.

I also taught; I started teaching as an undergraduate and continued into my graduate years. I taught basic general topics, I taught applied ethics, I taught bioethics. I taught Merleau-Ponty to freshmen and I taught medical ethics to graduate students.

Is that enough hitting over the head, or do I need to start name-dropping? After all, I learned a lot, from a lot of people, many of whom were, or are, considered the best in their fields.

No, through circumstances mostly out of my control, I don’t have a PhD to hit you over the head with when you question my credentials or my ability to talk about ethics in 140 characters. And that’s why, if you want to talk to “an ethicist” for a paper or publication, I’m happy to give you suggestions on who I think is accessible and able to talk on the subject at hand; I do understand the power of a PhD and the ability to cite an institutional affiliation. Do I wish I had that? Of course. But I also understand reality.

It’s not just academia where you find this “treat a couple in the same field differently” bias; Emma Stone has spoken quite pointedly on it.

Just as I understand the reality of why you question me and my ability to talk about ethics when it doesn’t even cross your mind to do the same with Nick. And it has nothing to do with his PhD, or my lack of one.

Unfortunately, the fact that I even had to write that tells me that too many people don’t understand this, or the dynamics we’re working in, at all. Too many people don’t see that they will automatically accept a man as an authority while automatically doubting that a woman can have any knowledge at all. So a situation is created where women are on constant defense, constantly justifying their ability to have more than an opinion.[2]

There is a difference between “let’s discuss” and “prove it,” one that rests not on tone or language but on the implicit assumptions at work: a discussion happens between people with differing understandings, ideas, and knowledge, whereas someone being told to “prove it” has to meet some unknown, hidden bar of justification just to earn the possibility of discussion – and the person making the demand presumes they have the qualifications to judge whether that bar has been met.

And while there are situations in which “prove it” is appropriate, they do not include “when the topic is ethics and your background, degree, and career are nowhere near ethics,” because then you don’t have the ability to accurately judge my knowledge of my field.

You know who does?

The people I’ve never once been challenged by,[3] in my last decade and change of being publicly involved in philosophical and bioethical issues: other ethicists.

  1. Look at my CV. Look at Google. Piece it together.
  2. And yes, my irritation and my experience are a small fraction of what minorities, both male and female, have to deal with in academic and professional fields.
  3. Which is not to say there have never been loud and feisty disagreements. But see the difference between “let’s discuss” and “prove it”: I have never once felt as though I’ve had to prove my right, or otherwise justify my ability, to discuss ethics with other people in philosophy, ethics, and bioethics – and we’re not talking about a giant happy-go-lucky field here, but one where civility is often strained, at best.