OMB's evidence memo deserves praise
Did you even know there was an evidence agenda? Steve Kelman argues that its obscurity doesn't undercut the importance of the associated memo.
By coincidence, only a few days before I wrote my most recent blog post on the use of evidence to improve outcomes in emergency rooms after the Boston Marathon bombings and to reduce binge drinking at Dartmouth College, a memo came out from OMB Director Sylvia Mathews Burwell and three others in the White House about next steps in the evidence and innovation agenda. (Raise your hand if you didn't know there was an evidence agenda.)
Regardless of whether the "evidence agenda" in the White House is well-known, this memo is good news in two respects. First, evidence is a good thing to have when making decisions. This statement is not necessarily as uncontroversial as it might sound. In political debates, Republicans have often been averse to evidence about phenomena in the natural world (climate change or evolution), while Democrats have often been averse to evidence about government programs, out of a worry that gathering evidence might show a lack of impact.
There is a second piece of good news in the memo as well. Traditionally, those promoting the use of evidence in government have often insisted, somewhat dogmatically, on expensive, long-duration, hard-to-execute randomized controlled trials to test the impact of government programs. (In such a trial, results for people randomly assigned to receive a given government intervention are compared with results for those who do not receive it. In another form, people who planned to participate but didn't are compared with people who planned to participate and did.)
The new memo, though it still seems to assign special status to such forms of evidence, also notes that companies are increasingly using "quick experiments" that test much smaller interventions quickly and inexpensively – the classic example is trying different wordings on websites or in letters to see which produces a better response. These are particularly suitable for testing management interventions in workplace practices. They use the same principle of random assignment, but at lower cost and in less time.
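For readers who want to see the principle in miniature, here is a minimal sketch of such a quick experiment: recipients of a letter are randomly assigned to one of two wordings, and the observed response rates are compared. The group names, sample size, and response probabilities are made up for illustration; in a real experiment the responses would come from the field rather than a simulation.

```python
# A minimal sketch of a "quick experiment" built on random assignment.
# All numbers below are hypothetical and exist only to illustrate the idea.
import random

random.seed(42)

N = 1000  # hypothetical number of letter recipients

# Random assignment: each recipient has an equal chance of getting either wording.
groups = [random.choice(["wording_a", "wording_b"]) for _ in range(N)]

# Assumed underlying response rates, standing in for real-world behavior.
true_rates = {"wording_a": 0.10, "wording_b": 0.13}
responded = [random.random() < true_rates[g] for g in groups]

# Compare observed response rates between the two randomly assigned groups.
for wording in ("wording_a", "wording_b"):
    n = sum(1 for g in groups if g == wording)
    r = sum(1 for g, resp in zip(groups, responded) if g == wording and resp)
    print(f"{wording}: {r}/{n} responded ({r / n:.1%})")
```

Because assignment is random, any sizable gap between the two groups' response rates can reasonably be attributed to the wording itself, which is the same logic a full randomized controlled trial relies on, just applied to a smaller, cheaper question.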
But econometric analyses, which use existing data rather than organizing new experiments, can also provide valuable evidence, though they raise more methodological worries than randomized experiments do, so their results are more open to debate among researchers. Even the kinds of benchmarking studies organizations already use can be helpful, though smaller sample sizes and the difficulty of pinning down what makes one unit a good performer and another a poor one make benchmarking conclusions more uncertain.
People who don't like the conclusions that evidence-based studies reach often prefer to shoot the messenger. Yes, there are poorly designed studies sponsored by advocates, intended to produce an outcome favorable to the product or idea being promoted. Yes, there are often disagreements even among scholars about conclusions from studies, though these occur less often with experimental studies, whether randomized controlled trials or quick experiments. Yes, sometimes it is hard to gather data. And of course there are still values questions left in policy and management choices – is a program with a small impact for a fairly large cost worth it or not?
But surely using data to help inform these decisions is better than the alternatives, which dominate our political and governmental choices. So the OMB memo is to be welcomed.