Using Data to Identify and Solve Common Program Challenges

By Emily Brennan, Michelle S. Manno, Samantha Steimle

There is a growing push for social services and educational institutions to use their data to improve services and outcomes for participants. Program data from staff and participants—drawn from management information systems, surveys, feedback forms, and observations—can help answer many questions about your program.

We can think about data as a flashlight used to illuminate dark corners: the aspects of program services and processes that might not be working as well as we want but are impossible to see without the right data.

For many programs and agencies, collecting data is not the main challenge. There is often a great deal of available data, but many programs lack the time and resources to use it to inform their decision making. This post shares some simple approaches for using data as a program improvement technique, referred to at MDRC as Learn-Do-Reflect:

  • Learn: Use data to identify and define a problem in your service flow, and brainstorm possible solutions. 
  • Do: Implement solutions, collecting data along the way to understand what happens and how the people involved feel about it.
  • Reflect: Review data, examine trends and why some approaches are working or not, and decide on course corrections.

We’ll look at some examples of how programs in the Building Bridges and Bonds (B3) evaluation used data as part of their Learn-Do-Reflect framework. B3 is a partnership of six organizations providing responsible fatherhood services, the MDRC-led study team, and the funder. The study’s infographic, Using Data to Understand Your Program, highlights how these programs used data to improve outcomes for participating fathers.

The examples of how programs in B3 used the Learn-Do-Reflect model are from programs implementing Cognitive Behavioral Intervention for Justice Involved Individuals Seeking Employment (CBI-Emp), one model evaluated in the B3 study. CBI-Emp is a series of interactive workshops designed for individuals who have been involved with the justice system to develop interpersonal skills for the workplace.

Learn: Identify a specific problem

Some common questions programs ask include, “Who is participating?”; “What services are delivered?”; and “How many services did participants receive?” These questions help programs understand current trends and help managers and staff brainstorm potential paths to improvement. All programs and agencies experience client drop-off, though the reasons for decreased participation vary. Data can help pinpoint the aspects of a program that need to change to increase retention.

How did the B3 programs implementing CBI-Emp tackle this challenge?

Like many other programs and agencies, the programs implementing CBI-Emp faced the challenge of client drop-off.

At the start of the study, participating programs integrated CBI-Emp into their usual services and tracked attendance to see where drop-off occurred. One program noticed early on that its clients were not completing the curriculum. Using data from the program’s management information system, the program manager found that most clients stopped attending after two weeks.
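
As an illustration only (not drawn from the B3 study’s actual systems), a quick pass over an attendance export can show where attendance stops. The file and column names below (attendance.csv with client_id, week, attended) are hypothetical stand-ins for whatever your management information system exports.

```python
# Illustrative sketch: find the week at which clients stop attending.
# Assumes a hypothetical export with one row per client per scheduled week.
import pandas as pd

attendance = pd.read_csv("attendance.csv")  # columns: client_id, week, attended (1 = present)

# For each client, find the last week they actually attended.
last_week_attended = (
    attendance[attendance["attended"] == 1]
    .groupby("client_id")["week"]
    .max()
)

# Count how many clients stopped after each week of the program.
drop_off_by_week = last_week_attended.value_counts().sort_index()
print(drop_off_by_week)
```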

Some barriers might be obvious. For example, if fathers who are employed are dropping out at a higher rate than fathers who are not employed, it might be that workshop schedules are not compatible with their working hours. Consider recording employment status in your management information system.

Other barriers may require more digging to understand. For example, some fathers who participated in CBI-Emp were on probation at the time of the study. Suppose fathers on probation are dropping out at a higher rate. In that scenario, caseworkers might consider calling that group to check in and ask what the program might do to make participation easier.

Often, asking the narrowest question possible is the most helpful. For example, when looking at your management information system data, instead of asking, “Why do most fathers complete the first workshop but not attend the second?” consider asking, “Are there groups of fathers who are more likely to drop out after the first session?” For example, are fathers who are currently on probation more or less engaged than fathers who are not?
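
Here is a hedged sketch of what that narrower question might look like against the same kind of export, comparing drop-off after the first session for fathers who are and are not on probation. The probation flag, file names, and column names are assumptions made for illustration.

```python
# Illustrative sketch: compare drop-off after session 1 by probation status.
# File and column names are hypothetical.
import pandas as pd

attendance = pd.read_csv("attendance.csv")   # client_id, session, attended (1 = present)
enrollment = pd.read_csv("enrollment.csv")   # client_id, on_probation (True/False)

# Clients who attended the first session, and whether they returned for the second.
attended_first = attendance.query("session == 1 and attended == 1")[["client_id"]].copy()
attended_second = attendance.query("session == 2 and attended == 1")["client_id"]
attended_first["dropped_after_first"] = ~attended_first["client_id"].isin(attended_second)

# Drop-off rate for each group.
drop_off_rates = (
    attended_first.merge(enrollment, on="client_id")
    .groupby("on_probation")["dropped_after_first"]
    .mean()
)
print(drop_off_rates)
```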

Do: Implement solutions

Once the problem is defined, programs can start testing possible solutions. Programs can test many different solutions to address drops in participation, aligning them with the specific barriers fathers identified during the “learn” phase. Some examples include:

  • Offer breakfast at morning sessions, dinner at evening sessions.
  • Pay for transportation.
  • Increase text and email reminders.

As programs start testing solutions, set up a framework to help evaluate outcomes: 

  • Define when to start implementing the solution and a period of time to observe it in action.
  • Train staff on new solutions.
  • Track data that will help measure outcomes (a simple tracking sketch follows the list).
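
For the last point, one lightweight option is a running log of each session and the supports offered that day, so the “reflect” review has something concrete to examine. This is only a sketch under assumed field names, not the B3 programs’ actual data system.

```python
# Illustrative sketch: append one row per client per session to a simple CSV log.
# Field names are hypothetical; a real program would use its own data system.
import csv
import os
from datetime import date

LOG_FILE = "do_phase_log.csv"
FIELDS = ["date", "client_id", "session", "attended",
          "meal_offered", "transport_paid", "reminder_sent"]

def log_session(row: dict) -> None:
    """Append one record, writing the header the first time the file is created."""
    is_new_file = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new_file:
            writer.writeheader()
        writer.writerow(row)

log_session({
    "date": date.today().isoformat(),
    "client_id": "A102",        # hypothetical ID
    "session": 3,
    "attended": 1,
    "meal_offered": 1,
    "transport_paid": 0,
    "reminder_sent": 1,
})
```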

What solutions did a B3 program implement to increase retention?

One of the programs implementing CBI-Emp saw drop-off occurring for all fathers, not just a specific group. Before the B3 study began, the program ran for two weeks, but it extended to three weeks when it implemented CBI-Emp. During the “learn” phase, the program discovered that some participants thought there were only two weeks of services. To keep clients coming back for the third and final week of CBI-Emp, the front desk staff who had the initial contact with clients needed to be in the loop about the new schedule. The program also moved its “graduation ceremony” to the end of the third week and pushed back the timing of offering and distributing a financial incentive.

Additionally, the program enhanced its service delivery options, including offering additional days and times for sessions, which allowed for more flexibility for fathers who worked. To improve retention, the program also started offering other services, like computer classes and job development meetings, during the third week for fathers who had completed the first two weeks.

Reflect: Analyze how solutions worked

After implementing the potential solutions to the identified problem, programs should consider scheduling a meeting to reflect on whether those solutions worked. During this meeting, programs can use data to examine how patterns of participation have changed. You may not be able to establish causality (for example, you may not be able to tell whether bigger incentives caused better participation), but you can see how trends changed over time. Then again, you may not see any trends! No change is also important information: It means different solutions are needed to reach your population. If that happens, circle back to the “learn” and “do” steps once again. With complex problems, you often must test multiple solutions to understand what works for your program and participants.
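
To make the before-and-after comparison concrete, here is a minimal sketch that splits clients by whether they started before or after the program changes took effect and compares completion rates. The file, columns, and change date are illustrative assumptions; as noted above, a difference shows a change in trend, not a causal effect.

```python
# Illustrative sketch: compare completion rates before and after a program change.
# The file, columns, and change date are hypothetical.
import pandas as pd

cohorts = pd.read_csv("cohorts.csv")  # client_id, start_date, completed (1 = completed)
cohorts["start_date"] = pd.to_datetime(cohorts["start_date"])

CHANGE_DATE = pd.Timestamp("2019-06-01")  # hypothetical date the new solutions began
cohorts["period"] = cohorts["start_date"].apply(
    lambda d: "after change" if d >= CHANGE_DATE else "before change"
)

# Share of clients completing the program, before versus after the change.
completion_by_period = cohorts.groupby("period")["completed"].mean()
print(completion_by_period)
```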

How did the B3 program analyze the effect of the solutions?

After adjusting the program messaging, the timing of program milestones, and the program delivery schedule, the program managers and direct line staff reviewed attendance and service delivery patterns. This review assessed whether there were any changes in participation following changes in messaging and program delivery.

In reviewing these data, the program staff agreed that messaging could still be improved in some areas, and more resources were needed. For example, a room was not always available for the additional services. In the next Learn-Do-Reflect cycle, the program staff could adjust the timing of the program services to avoid room and schedule conflicts.

 

Did you find this information helpful? See our related post on using data to analyze recruitment and enrollment processes, Filling All the Seats in the Room.

About InPractice

The InPractice blog series highlights lessons from MDRC’s work with programs, featuring posts on recruiting participants and keeping them engaged, supporting provider teams, using data for program improvement, and providing services remotely.
