1. Section Introduction

Data is a key part of evaluation, but for some, the word ‘data’ can be intimidating. This section helps dispel that feeling by explaining what data is and the different types of data that can be collected in evaluation. Next, we offer ways to develop indicators that will help determine what specific data will be needed to support our evaluation goals. What follows is a primer on data integrity: what it is and tips for building it into evaluation activities. Moving along the continuum of evaluation is a collection of resources outlining the steps and procedures involved in analyzing, or making sense of, the data once it’s gathered. We finish by talking about how to tell stories with our data through data visualization, so that we can share our evaluation findings in a way that is accessible and engaging.


2. What is Data?

What is ‘data’ and where do I get it?

Evaluation always involves gathering data. Data is factual information that is systematically organized and used to help someone make a decision. To conduct a successful evaluation, we need to identify strategies for gathering appropriate data and evidence. The data collected should align with the evaluation objectives and should seek to answer the evaluation questions.

We can begin by looking at the data already being collected to see whether existing data can adequately answer the evaluation questions.

Existing data (secondary data) are data we already collect, such as performance measures, or data gathered by external sources, such as administrative data. Administrative data may include client records and Census data.

If existing data are not available to answer our particular evaluation questions, we’ll need to collect new data (primary data). Most people associate data with numbers: dollars, demographics, percentages or averages. Statistical data, usually called quantitative data, is part of most evaluations, but is by no means the only kind of information that is useful to nonprofit organizations.


3. Qualitative or Quantitative Data

What types of data should we be collecting when we evaluate? Is there one type of data that is considered better than another?

Data gathered for program evaluation can be qualitative or quantitative. Depending on our goals, one type may be better suited to meet our needs; however, a mixed-methods approach that combines quantitative and qualitative data can enrich our evaluation and provide a more comprehensive set of results.

The type of data we collect will influence the plan and approach we take. Depending on the evaluation questions being asked, there are several different types of data collection instruments that can be developed and administered to collect new data.

Tool

Template

 


4. Linking Questions, Outcomes and Indicators

Evaluation questions are intended to broadly define what will be assessed. Evaluation questions can focus on:

  • Planning and implementation (e.g., how well was the program planned and how well was the plan put into practice?)
  • Attainment of program objectives (e.g., how well has the program met its stated objectives?)
  • Program impact (e.g., what difference has the program made to its intended targets of change or community as a whole?)

To operationalize evaluation questions, we need to link them to key program outputs and outcomes and then develop measures, or indicators, for each one. An indicator is a factor used to measure or demonstrate change; for example, the percentage of participants who report increased confidence after completing a workshop.

Checklist

Tool


5. Data Integrity

What is data integrity and why does it matter?

Data integrity refers to the accuracy and consistency of data. When embarking on an evaluation, it’s important to have processes in place so that the data is captured in a way that preserves its reliability. Being explicit about how data will be managed and checked during an evaluation increases trust in our evaluation results, both for our own benefit and for that of our stakeholders and community.

The Joint Committee on Standards for Educational Evaluation has published a set of standards for program evaluation. The Program Evaluation Standards include thirty statements in five categories: utility, feasibility, propriety, accuracy, and evaluation accountability. Knowing these standards and applying them during an evaluation can give the evaluation credibility with stakeholders and funders.

Tool


6. Making Sense of Your Data

In this section, we’ll learn about the most common quantitative analysis procedures used in small-scale program evaluations. We’ve included a template to help plan for data analysis. Following this, we take a deeper dive into different kinds of measurement and learn about a variety of data analysis procedures, from basic to advanced. There is much more that can be learned about data analysis, so a resource list on this topic is also included here.

Template

Tool

Knowledge Resource


7. Finding Stories

Data + Story = Evidence

Finding and sharing stories of how our work is affecting the people we serve is an important part of evaluation. It showcases the human and experiential element of our work. Stories are highly relatable and memorable. Those who read our evaluation reports are more likely to remember a story we share than the numbers in a pie chart or graph. It’s just the way we’re wired!

In order for stories to be effective in evaluation, however, we need to think ahead about why we want to collect them and what we plan to do with them. Do we want to share evidence of our impact? Validate a change we are advocating for? Get an honest appraisal of what works and what doesn’t in our program?

Once we’re clear on our reasons to collect stories, we can plan for how and where we will gather them. Focus groups and stakeholder interviews can be great places to collect stories during the evaluation process.

Beyond stories we can collect from our staff, volunteers, board members and clients, we can also become attuned to the stories that naturally arise from our data and analysis during evaluation.

Check out the resources that follow to learn about different ways to plan for and utilize storytelling in our evaluation.

Tool

Activity


8. References