
Promoting policies that work: Six steps for the Commission on Evidence-Based Policymaking

On March 30, 2016, in a rare moment of bipartisanship, Congress passed and the President signed an important bill creating the Commission on Evidence-Based Policymaking. The 15-person Commission, chaired by economics professor Katharine Abraham, was given a mandate to study the use of data in evaluating the effectiveness of federal programs and expenditures.

Among the Commission’s most pressing tasks is to identify methods for incorporating randomized evaluations into the design of government programs. Much as randomized controlled trials revolutionized modern medicine in the 20th century, there is an exciting movement in government and philanthropy centered on the use of rigorous evaluation to determine what works — and what does not — in social policy.

A key tool for evaluating the efficacy of government programs through randomized controlled trials is administrative data. Hospitals, governments, school systems, and other institutions gather a wealth of information on individuals for purposes other than research. As the White House recently highlighted, this data can be an excellent source of information for research when appropriate privacy safeguards are in place. It can also reduce research costs, create more possibilities for long-term follow-up, and improve the accuracy of findings. For researchers looking to conduct policy-relevant studies on issues like health care, housing, and education, access to this data can be transformational.

Several deep dives into administrative data have already transformed what we know about social policy and helped policymakers make better decisions. Economists Raj Chetty, Nathaniel Hendren, and Lawrence Katz tracked the long-term outcomes of families who left high-poverty areas in 1994 through the housing choice voucher program. By matching these families to income records from nearly 20 years later, they found that young children from families that moved earned significantly higher incomes, attained more education, and became single parents at lower rates than their peers who stayed. Citing this research, the Department of Housing and Urban Development overhauled the formula it had used for four decades to calculate rental assistance, increasing opportunities for families to move to low-poverty areas.

In health care, economists Katherine Baicker and Amy Finkelstein used a randomized evaluation to track individuals who gained access to Medicaid through a 2008 lottery in Oregon. They found that Medicaid led people to use more health care services across the board, including preventive care, hospitalizations, emergency department visits, doctor's office visits, and prescription drugs, while reducing financial strain and rates of depression. This study provided hard evidence in the politically charged Medicaid expansion debate and informed federal and state decisions on offering public insurance.

While different levels of government are slowly increasing access to this valuable data, navigating the complex processes to obtain and use administrative data remains a huge challenge. As the Commission holds its next meeting on Nov. 4, key issues related to evaluation will be front and center.

Here are six concrete steps the Commission can take to institutionalize the use of administrative data to support policy-relevant research and evidence-informed policymaking.

The new Administration should appoint or detail at least one individual to the U.S. Census Bureau charged with implementing the Commission’s directive to increase access to administrative data for researchers and facilitate policy-relevant randomized evaluations using this data. This individual should coordinate across the statistical agencies and with the Chief Data Scientist to ensure the widespread adoption of best practices.

Researchers often waste hours of valuable time in lengthy correspondence about whether particular data exist, discouraging studies even before they get off the ground. Streamlining data request procedures can free up capacity for meaningful research.

Linking administrative records from different government agencies enabled the influential reanalysis of the housing voucher lottery discussed above. Researchers conducting similar randomized evaluations need to match individuals at the start of a program (e.g., housing authority data) with their respective outcomes (e.g., income records), as in the sketch below.
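To make that matching step concrete, here is a minimal sketch of how a researcher might link baseline program records to later outcome records. The file names, column names, and shared identifier are illustrative assumptions, not details from any study above; in practice, linkages like this are typically done on de-identified records inside a secure research environment.

```python
# Minimal sketch: link program enrollment records to later outcome records.
# File names, column names, and the shared identifier are hypothetical.
import pandas as pd

# Baseline records from the program agency (e.g., who entered the voucher lottery)
baseline = pd.read_csv("housing_lottery_baseline.csv")   # person_id, treated, age_at_entry
# Outcome records collected years later by a different agency (e.g., earnings)
outcomes = pd.read_csv("earnings_records.csv")           # person_id, year, earnings

# Keep each person's most recent earnings observation
latest = (outcomes.sort_values("year")
                  .groupby("person_id", as_index=False)
                  .last())

# Join on the shared identifier; a left join keeps everyone randomized at
# baseline, including people with no matched outcome record.
linked = baseline.merge(latest, on="person_id", how="left")

# Simple unadjusted comparison of mean earnings by treatment status
print(linked.groupby("treated")["earnings"].mean())
```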

Currently, researchers may request data that is only tangentially relevant to their analysis, without realizing that it captures sensitive information that could unnecessarily jeopardize the entire request. Knowing which data can be linked and which are sensitive can direct researchers toward questions they can feasibly answer.

Individual-level data (known as microdata) enable researchers to control for individual characteristics – such as educational attainment or race – to better determine a program’s impact on specific subpopulations. Researchers can use microdata to validate and adjust their analysis as they learn from the data in real time.
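As an illustration of how microdata supports this kind of analysis, the following sketch regresses an outcome on treatment status, a subgroup indicator, and individual controls. The data file and variable names (treated, young_child, earnings, education, race) are assumptions chosen for the example, not drawn from the studies discussed above.

```python
# Minimal sketch: subgroup analysis on individual-level (micro) data.
# The file and variable names are hypothetical placeholders; the point is that
# microdata let researchers control for individual characteristics and
# estimate how a program's effect differs across subpopulations.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("linked_microdata.csv")  # one row per person

# Regress earnings on treatment, a subgroup indicator, their interaction,
# and individual controls such as education and race.
model = smf.ols(
    "earnings ~ treated * young_child + education + C(race)",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

# The treated:young_child coefficient captures how the estimated effect
# differs for children who were young when their family moved.
print(model.summary())
```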

The Commission is expressly tasked with determining whether to launch a central clearinghouse where data from various agencies can be organized, linked, and accessed. While this would be a boon to researchers, data security is a top concern. The clearinghouse should be developed in a staged rollout to allow constant feedback from researchers. Following the example set by the Department of Defense’s Hack the Pentagon Pilot Program, in which hackers identified 138 vulnerabilities in Pentagon websites, the Commission could invite hackers to test the clearinghouse’s security.

Analyzing administrative data to adjust government programs may seem like dry, behind-the-scenes work, but when scaled up to redirect national policies, it can have a significant impact on millions of Americans.

Quentin Palfrey is the Executive Director of J-PAL North America at MIT and a former Senior Advisor at the White House Office of Science & Technology Policy.


The views expressed by authors are their own and not the views of The Hill.