Saturday, February 13, 2010

Best practices?

Blackstone suggested that changes to a complex system (specifically English law) should be made only when necessary, because the effect of even a small change is unclear. He went so far as to say that just because we cannot see a reason why a law is as it is does not mean the law should be changed -- our failure to perceive why something exists does not mean there is no reason for its existence.

Put otherwise, as a priest once told me, "tradition works, even if we don't know why."

In this context, the bewildering plethora of "best practices" being established in professional fields is problematic.

Sometimes it seems that "best practices" are established more to allow ambitious professionals to show off their knowledge than to improve the profession. Sometimes "best practices" are drafted without an appreciation of how improving one step in a process harms a subsequent step -- in legal matters this is common enough: an attempt to improve, say, documentary disclosure has the effect of making lawsuits financially impossible for all but large businesses.

Dr. Jerome Groopman's piece in this week's NYR, "Health Care: Who Knows Best", is illustrative:

http://www.nybooks.com/articles/23590

But once we depart from such mechanical procedures and impose a single "best practice" on a complex malady, our treatment is too often inadequate. Ironically, the failure of experts to recognize when they overreach can be explained by insights from behavioral economics. I know, because I contributed to a misconceived "best practice."

My early research involved so-called growth factors: proteins that stimulate the bone marrow to produce blood cells. I participated in the development of erythropoietin, the red cell growth factor, as a treatment for anemic cancer patients. Erythropoietin appeared to reduce the anemia, lessening the frequency of transfusion. With other experts, I performed a "meta-analysis," i.e., a study bringing together data from multiple clinical trials. We concluded that erythropoietin significantly improved the health of cancer patients and we recommended it to them as their default option. But our analysis and guidelines were wrong. The benefits ultimately were shown to be minor and the risks of treatment sometimes severe, including stroke and heart attack.

After this failure, I came to realize that I had suffered from a "Pygmalion complex." I had fallen in love with my own work and analytical skills. In behavioral economics, this is called "overconfidence bias," by which we overestimate our ability to analyze information, make accurate estimates, and project outcomes. Experts become intoxicated with their past success and fail to be sufficiently self-critical.

A second flaw in formulating "best practices" is also explained by behavioral economics—"confirmation bias." This is the tendency to discount contradictory data, staying wed to assumptions despite conflicting evidence. Inconsistent findings are rationalized as being "outliers." There were, indeed, other experts who questioned our anemia analysis, arguing that we had hastily come to a conclusion, neglecting findings that conflicted with our position. Those skeptics were right.

Yet a third powerful bias identified in behavioral economics can plague expert panels: this is the "focusing illusion," which occurs when, basing our predictions on a single change in the status quo, we mistakenly forecast dramatic effects on an overall condition. "If only I moved from the Midwest to sunny California, I would be so much happier" is a classical statement of a focusing illusion, proven to be such by studies of people who have actually moved across the country. Another such illusion was the prescription of estrogen as the single remedy to restore feminine youth and prevent heart disease, dementia, and other complications of the complex biology of aging. Such claims turned out to be seriously flawed.

James Morton
1100-5255 Yonge Street
Toronto, Ontario
M2N 6P4

416 225 2777

www.jmortonmusings.blogspot.com
