We’ve all become used to warning stickers like “Do not try to insert fishhook into your eyeball” or owner’s manuals for cars chock-full of helpful tips like “Do not allow children to play on roof of vehicle while vehicle is in motion” or “Do not attempt to jump-start the battery while you are soaking in a bathtub.” And you know every one of those tips resulted from some moron somewhere doing just that, then suing the manufacturer for not warning them against it.
But can’t we assume that readers of scientific journals and textbooks are smart enough not to do bone-headed things? Well, okay, obviously not. But can’t we assume that any trial judge would assume we’re smart enough, and that any bone-headed thing we do is our own fault and not the fault of a lawyer who forgot to warn us off in six-point type?
I just saw a copy of John K. Kruschke’s Doing Bayesian Data Analysis: A Tutorial with R and BUGS, published by Academic Press, now part of the digestive system of the Elsevier leviathan. Looks like a great book, but enough about that. Today’s rant is about two paragraphs I happened to notice on the book’s copyright page.
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.
95% of people reading any Elsevier publication don’t need to be told this — they’ve devoted their careers to making it a reality. The other 5%, the deadwood dinosaurs who think they learned everything they’ll ever need when they went to school in 1870, are not going to be cured by a sentence on a copyright page.
Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.
Oh, thanks for reminding me. And here I was about to be mindful of the safety of everyone except those I have a professional responsibility for. And it’s a damned good thing you imposed that legal obligation on me on your copyright page, since not a single legislature anywhere in the world has ever thought of making rules about professional responsibility.
Okay, this clod of legalese is clearly aimed at covering Elsevier’s butt from physicians: “If you try to repeat something reported in one of our medical journals and your patient dies, it’s your own fault, so don’t sue us.” And that could actually be a useful warning label for Elsevier publications — while they have published some earth-shattering scientific breakthroughs that redefine the way we conceive of humanity’s place in the universe (like doi:10.1016/j.wocn.2007.10.003, for example), they’ve also been known to take money from pharmaceutical companies to print shameless drug marketing campaigns masquerading as peer-reviewed journals. So they probably should have a blinking neon “Caveat lector” sticker on every front cover.
But this is a statistics textbook, for crying out loud! I should be mindful of my own safety while learning statistics? Is it more dangerous than I thought it was? Does thinking too hard about conjugate priors lead to brain aneurysms, cardiovascular disease, post-traumatic stress disorder, demonic possession? Does it lead to those things more often than reading legalese does?
Anyways, here’s some free advice for scientific publishers who insist on lawyering up: At least hire lawyers who passed junior-high science. Otherwise they’ll leave gaping loopholes like the one you just left here: you’re so busy disclaiming any responsibility for my use of compounds that you forget to mention elements. Ha, Elsevier legal department! Ha! If any of my patients die after I inject them with arsenic or plutonium, we’re coming after you!
- ↑: Granted, the book probably has somewhere the obligatory illustration of Bayes’ Rule about the fictitious blood test for a fictitious disease, where if the disease’s incidence in the population is low enough and the test’s false-positive rate is high enough, even getting a positive test still means you probably don’t have the disease. But where in the world are the physicians stupid enough to think, “Despite everything I’ve ever heard from my med school profs and my malpractice insurers, I vaguely remember something from my statistics class saying I could ignore all positive tests,” then try to sue the textbook publisher when their patients start dying?
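(For the curious: the arithmetic behind that classic illustration takes only a few lines. The prevalence and error rates below are made up for illustration, not taken from the book.)

```python
# Classic Bayes'-Rule blood-test illustration, with hypothetical numbers:
# a rare disease (1 in 1,000) and a test with a 5% false-positive rate.
prevalence = 0.001        # P(disease)
sensitivity = 0.99        # P(positive | disease), assumed near-perfect
false_positive = 0.05     # P(positive | no disease)

# P(positive) by the law of total probability
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Bayes' Rule: P(disease | positive)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.3f}")
```

Even with a nearly perfect test, a positive result here leaves only about a 2% chance of actually having the disease — the point the textbook illustration is making.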