As the first major effort to use a behavioral economics lens to examine human services programs that serve poor and vulnerable families in the United States, the BIAS project demonstrated the value of applying behavioral insights to improve the efficacy of human services programs.
A Primer for Researchers Working with Education Data
Predictive modeling estimates individuals’ probabilities of future outcomes by building and testing a model using data on similar individuals whose outcomes are already known. The method offers benefits for continuous improvement efforts and efficient allocation of resources. This paper explains MDRC’s framework for using predictive modeling in education.
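The core idea can be sketched in a few lines. The sketch below is a minimal illustration, not MDRC's method: it fits a simple logistic regression (by gradient descent, in plain Python) on hypothetical students whose graduation outcomes are known, then estimates the outcome probability for a new student. The features (GPA, attendance rate) and all data are invented for illustration.

```python
# Minimal sketch of predictive modeling: learn from individuals with known
# outcomes, then score a new individual. All data here are hypothetical.
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit a logistic regression by gradient descent; returns (weights, bias)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - yi  # gradient of log-loss for this observation
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict_proba(w, b, x):
    """Estimated probability of the outcome for one individual."""
    return 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))

# Hypothetical training data: [GPA, attendance rate] -> 1 if the student graduated.
X_known = [[3.8, 0.95], [2.1, 0.60], [3.2, 0.85], [1.9, 0.55], [3.5, 0.90], [2.4, 0.65]]
y_known = [1, 0, 1, 0, 1, 0]

w, b = fit_logistic(X_known, y_known)
# Probability estimate for a new student whose outcome is not yet known.
p_new = predict_proba(w, b, [3.0, 0.80])
```

In practice such scores would feed the continuous-improvement and resource-allocation uses the abstract describes, e.g., flagging students predicted to be at risk for extra support.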
WorkAdvance connects low-income job seekers to high-demand sectors that offer quality jobs with strong career pathways. This infographic describes the program model and its implementation in four locations and presents encouraging evidence that WorkAdvance boosts earnings.

Jobs-Plus – a “place-based,” workforce-development model proven to help public housing residents find employment – is about to be replicated across the country. This infographic depicts the program model, its effects on earnings, and the history of its development over the past 20 years.
A Guide for Researchers
Conducting multiple statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) counteract this problem but can substantially change statistical power. This paper presents methods for estimating power under multiple definitions, along with empirical findings on how power is affected by the use of MTPs.
This infographic describes the Improving Contraceptive Options Now (ICON) demonstration, which is helping primary care health clinics better serve patients’ family planning needs by offering women a broader range of effective contraceptive options, including long-acting reversible contraception.
Howard Bloom’s Remarks on Accepting the Peter H. Rossi Award
In a speech before the Association for Public Policy Analysis and Management Conference on November 5, 2010, Howard Bloom, MDRC’s Chief Social Scientist, accepted the Peter H. Rossi Award for Contributions to the Theory or Practice of Program Evaluation.
Strategies for Interpreting and Reporting Intervention Effects on Subgroups
This revised paper examines strategies for interpreting and reporting estimates of intervention effects for subgroups of a study sample. Specifically, the paper considers why and how subgroup findings are important for applied research, why subgroups should be prespecified before analyses are conducted, and how existing theory and prior research can be used to distinguish subgroups for which study findings are confirmatory from those for which they are exploratory.
This paper is the first step in a study of instrumental variables analysis with randomized trials to estimate the effects of settings on individuals. The goal of the study is to examine the strengths and weaknesses of the approach and present them in ways that are broadly accessible to applied quantitative social scientists.
In some experimental evaluations of classroom- or school-level interventions, random assignment is conducted at the student level while the program is delivered at the higher level. This paper clarifies the correct causal interpretation of “program impacts” under this design and discusses its implications and limitations. A real example illustrates the paper’s key points.