At a recent event in Albuquerque, I listened as the university system presented data on its achievements in student retention and graduation rates over the last five years. The numbers were impressive, to say the least. Rates in these key areas were trending up in communities of color. But zooming out revealed that a significant racial wage gap persisted between these same communities and their predominantly white counterparts. Why the disconnect, and how can cities like Albuquerque begin to use their data to improve these broader community conditions?
As it turns out, the devil—and the story—is in the details when working with a continuous improvement lens. In order to achieve success for the community or system, we have to zoom in on programmatic performance measures that track multiple data points associated with individual people.
In the aggregate, these measures illuminate the steps within a complex journey, including the struggles and triumphs that led to the achievements we see in the data. These performance measures allow us to understand the complex systems that intersect and determine outcomes. It is only with this understanding that dramatic impact is possible.
So what happens next? How do we go from population-level data to more granular measures that let us peer into what works and what doesn’t? The answer is complicated, but it has to start with discipline.
Imagine that you run a social services agency. What would it look like if you knew that certain case managers on your team were most effective at getting outcomes for clients who are men of color, while others got better results for immigrants? Armed with this insight, you might change the intake and assignment process for clients, helping you exceed your targets and ultimately achieve exceptional results for the community.
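The disaggregation in that thought experiment needs nothing fancy. As a minimal sketch, here is how you might tally outcomes by case manager and client group in plain Python; the names, groups, and records are entirely hypothetical, invented for illustration.

```python
from collections import defaultdict

# Hypothetical case records: (case_manager, client_group, achieved_outcome).
cases = [
    ("Rivera", "men of color", True),
    ("Rivera", "immigrants", False),
    ("Rivera", "men of color", True),
    ("Chen", "immigrants", True),
    ("Chen", "immigrants", True),
    ("Chen", "men of color", False),
]

# Tally successes and totals per (case manager, client group) pair.
totals = defaultdict(lambda: [0, 0])  # [successes, total]
for manager, group, success in cases:
    totals[(manager, group)][0] += int(success)
    totals[(manager, group)][1] += 1

for (manager, group), (wins, n) in sorted(totals.items()):
    print(f"{manager:8s} {group:14s} {wins}/{n} ({wins / n:.0%})")
```

A table like this makes the pattern visible at a glance: in this made-up data, one case manager's success concentrates among men of color and another's among immigrants, which is exactly the kind of bright spot that would prompt a change to the intake and assignment process.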
Data discipline is about figuring out how to work smarter with the resources you have.
And if it were truly as simple as I’ve laid out, we would all be using it. The reality is that it is complex and messy, but ultimately mission-critical. Although data discipline and continuous improvement are foundational, funders have not always supported this area of social change work.
Philanthropy has traditionally focused on using data for compliance, making it difficult for community-based organizations to be honest about and learn from mistakes and failure at the risk of their funding. This trend is slowly changing, though, and as philanthropy continues to evolve toward utilizing data for collective impact, data discipline is the next capacity we must master to achieve results for people and communities.
The five tips below form a framework for building data discipline, which I think of as the process of collecting, reporting on, and using data for continuous improvement and decision-making in your organization.
1. Don’t create a new data system. Technology isn’t your silver bullet.
Your most valuable resources are people and time, not technology. The time you set aside with your team to collect, report on, and discuss your data helps build the foundation for data discipline.
Any technology system you decide to utilize will work as long as it allows you to track information for discussion. The Network for Economic Opportunity in New Orleans, for example, collects data on outputs and outcomes from its Opportunity Centers through an Excel document.
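A spreadsheet-style tracker really is that simple. The sketch below, with hypothetical measures, targets, and numbers, shows a flat file doing the whole job: it flags which measures are on track and which deserve discussion at the next data meeting.

```python
import csv
import io

# Hypothetical monthly tracker -- the kind of data a simple Excel sheet holds.
TRACKER = """measure,target,actual
Job placements,40,46
FAFSA completions,120,98
Financial coaching sessions,75,80
"""

# Flag each measure for the team's data conversation.
flags = []
for row in csv.DictReader(io.StringIO(TRACKER)):
    target, actual = int(row["target"]), int(row["actual"])
    status = "on track" if actual >= target else "needs discussion"
    flags.append((row["measure"], status))
    print(f"{row['measure']:30s} {actual:>4}/{target:<4} {status}")
```

The point is not the code but the habit: any tool that lets you line up target against actual and talk about the gap is enough to start.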
If you want to get more formal and are ready for the next step into a real database, there are plenty out there, as long as you don't expect one to solve all your data problems. Clear Impact provides a Scorecard dashboard view of data that connects directly to Results-Based Accountability frameworks, while Salesforce and Social Solutions offer other database solutions.
Just because you have purchased a database doesn't mean that people will enter data into it. Collecting and using data is a behavior, and no technology system can create it for you.
2. Create a rhythm for reflection and allow it to become a part of your discipline.
Establish a regular pattern for when your team will look critically at the data you are collecting, and do not deviate from it for at least four to seven meetings.
As you start looking critically at data on a regular basis, this repeated action will become second nature. You can start this process on a monthly or quarterly basis, depending on how often your data is updated.
StriveTogether, a national nonprofit network of more than 70 communities using a rigorous approach to accelerate progress and sustain success in education, coaches community leaders to apply a collaborative data-driven process to get better results. In its Postsecondary Enrollment Impact and Improvement Network, teams committed to reviewing their data weekly and monthly and revised strategies accordingly to boost FAFSA completion rates. This practice started as a rhythm and became a discipline of tracking, reporting, and utilizing the data to identify trends and make decisions. The regular tracking and review process allowed teams to test theories and identify what was and wasn't working quickly enough to replicate successes.
The data showed a jump in success when one economics teacher incorporated content about college financial aid into the classroom and provided class time for students to complete their FAFSA applications. The rest of the school then tested the practice, and it was ultimately identified as a best practice for the community, one aimed at achieving a higher level of college attendance for the graduating senior class.
Without the established rhythm of data reflection, the school district may not have identified this strategy. The teacher may have continued the practice in a vacuum, limiting the impact that this innovation could have for students in other classes.
3. Start where you are. Ask probing questions about the bright spots you see in the data.
There is no reason to build an entirely new data collection and discussion process from scratch. It is much better to start the journey from wherever you are, and you almost certainly already collect and report on some data. You might have an established process for grant reporting that can serve as a starting point. Partners at collective impact tables might collect or share data within or among organizations. Ask yourself what you're already collecting.
Under Mayor Bloomberg, New York City departments held "Stat Meetings," weekly discussions where community-based organizations (CBOs) that provide direct services met with city officials to discuss their progress toward the goals in their grant agreements.
During my time managing the Jobs-Plus Program, a workforce development and financial security program focused on residents of public housing in New York City (particularly young men of color), we built our rhythm for data discipline by meeting with each CBO providing direct services to the community on an eight-week cycle. This rhythm was historically used by the City, as the funder for various programs, to stay focused on compliance and ensure CBOs were “doing their job.”
By having a consistent place to discuss data throughout the year and by asking “why?” when monthly goals were or weren’t achieved, the conversations allowed us to look at all the data points across the network of CBOs that were providing services to discuss what was and wasn’t working, and why.
Our primary demographic was young men of color, but through these meetings we found that two out of the five CBOs were attracting more women than men through their outreach efforts. By asking probing questions, we identified that the outreach staff were largely young women, and those teams were therefore falling short of their goal of reaching young men. This realization prompted the leaders of the program to change the composition of their outreach teams, which in turn led to more young men of color joining the program.
By interrogating the bright spots for the program across the three high-performing CBOs, we were able to quickly identify what they were doing differently, and then make recommendations and programmatic adjustments accordingly.
4. Don’t hide your challenges. Remember that the goal is continuous improvement.
Examining the data you are already collecting, whether or not it shows you meeting your goals, provides an opportunity to ask the question "why?" Asking this question over and over allows groups to dig into the hard conversations that ultimately link programmatic performance data to systems change.
In New Orleans, Opportunity Centers use their monthly OpportunityStat meetings to discuss the programs where they are making progress and also those where they aren't. Digging into why the group was or wasn't achieving the goals it set out to achieve allowed this group of public, private, and nonprofit leaders to identify gaps in the systems that directly affect African American men in New Orleans.
Finding these gaps led the group to develop multi-pronged policies and practices working together to accelerate the reduction of the non-employment rate for African American men. These policies and practices include:
- HireNOLA, a local hiring policy.
- BuildNOLA, a public sector goal from the Mayor's Office to increase the number of minority-owned construction companies (known in Louisiana as disadvantaged business enterprises, or DBEs) receiving contracts and subcontracts from City of New Orleans construction bids.
- Creation of the Mobilization Fund, a pilot set up as a way to increase access to capital for DBEs when they are awarded construction bids.
5. Be patient.
Data discipline takes time, patience, and accountability, all of which are behavioral and cultural shifts in the way we do our work rather than technical skills.
If you don’t become disciplined with data, you’ll simply drown in the numbers. What does that mean for the communities and people we aim to get results for? In New Orleans, it means 35,000 African-American men who might not gain access to the workforce. In Albuquerque, it means 10,000 people of color kept out of living wage jobs. What will it mean for your community?
With the urgency we feel for this work, patience is critical. As the people doing the work, we need to be patient in developing the culture and behavior of data discipline before it truly becomes embedded. We must give our teams and partners time to see the benefits and rewards of doing the work a different way than we have before, and be patient with the feeling of loss and risk our colleagues will feel in the process. While the journey is hard, it’s also worth it.
For more insights on analyzing and presenting data for collective impact initiatives, check out our series on Data and Collective Impact.