STEP 5: Evaluation

Evaluation is the systematic collection and analysis of information about intervention activities, characteristics, and outcomes. Evaluation activities help groups describe what they plan to do, monitor what they are doing, and identify needed improvements.

The results of an evaluation can be used to assist in sustainability planning, including determining what efforts are going well and should be sustained, and showing sponsors that resources are being used wisely.

Five Functions of Evaluation

Program evaluations are often conducted in response to a grant or other funding requirement. As a result, reporting may be structured only to address the requirement rather than to provide a functional flow of information among partners and supporters.

A comprehensive and well-rounded evaluation process gathers information to accomplish five key functions:

  • Improvement. This is the most important function of an evaluation—improving the efficiency and effectiveness of your chosen strategies and how they are implemented.
  • Coordination. The evaluation process assesses the functioning of your group, allowing partners to know what the others are doing, how this work fits with their own actions and goals, and what opportunities exist for working together in the future.
  • Accountability. Are the identified outcomes being reached? A good evaluation allows your group to describe its contribution to important population-level change.
  • Celebration. This function is all too often ignored. The path to reducing drug use at the community level is not easy, so a stated aim of any evaluation process should be to collect information that allows your group to celebrate its accomplishments.
  • Sustainability. A thorough evaluation can help you provide important information to the community and to various funders, which promotes the sustainability of both your group and its strategies.

To accomplish these five functions, you need to provide information to the appropriate stakeholders so that they can make better choices (improvement), work more closely with their partners (coordination), demonstrate that commitments have been met (accountability), honor your team’s work (celebration), and show community leaders why they should remain invested in the coalition process (sustainability).

Engaging Stakeholders

Evaluation cannot be done in isolation. Almost everything done in community health and development work involves partnerships—alliances among different organizations, board members, those affected by the problem, and others who each bring unique perspectives. When stakeholders are not appropriately involved, evaluation findings are likely to be ignored, criticized, or resisted. People who are included in the process are more likely to feel real ownership of the evaluation plan and results, and they will want to help develop the plan, defend the findings, and make sure the evaluation really works.

Therefore, any serious effort to evaluate a program must consider the viewpoints of the partners who will be involved in planning and delivering activities, your target audience(s), and the primary users of the evaluation data.

Engaging stakeholders who represent and reflect the populations you hope to reach greatly increases the chance that evaluation efforts will be successful. Stakeholder involvement helps to ensure that the evaluation design, including the methods and instruments used, is consistent with the cultural norms of the people you serve. Stakeholders can also influence how or even whether evaluation results are used.

All partners in your substance misuse and abuse prevention or reduction efforts should be involved in developing and implementing your evaluation plan. To facilitate this process, you may consider forming a committee focused on evaluation. The committee would work in collaboration with an evaluator to collect the data, analyze results, and share findings with partners, the community, the media, and others. Having more people trained in data collection and analysis and able to spread the word about the group’s successes contributes to sustainability.

A strong evaluation system can provide monthly data about activities and accomplishments that can be used for planning and better coordination among partners. In addition, sharing evaluation data can give the group a needed boost during the long process of facilitating changes in community programs, policies, or practices.

Cultural Competence in Evaluation

Culture can influence many elements of the evaluation process, including data collection, implementation of the evaluation plan, and interpretation of results. Tools used to collect data (such as surveys and interviews) need to be sensitive to differences in culture—in terms of both the language used and the concepts being measured.

When selecting evaluation methods and designing evaluation instruments, you should keep in mind the cultural contexts of the communities in which the intervention will be conducted. Here are some guiding questions to consider:

  • Are data collection methods relevant and culturally sensitive to the population being evaluated?
  • Have you considered how different methods may or may not work in various cultures?
  • Have you explored how different groups prefer to share information (for example, orally, in writing, one on one, in groups, through the arts)?
  • Do the instruments consider potential language barriers that may inhibit some people from understanding the evaluation questions?
  • Do the instruments consider the cultural context of the respondents?

Evaluation and Sustainability

Evaluation plays a central role in sustaining your group’s work. It enables you to take key pieces of data and analyze and organize them so that you have accurate, usable information. This process facilitates the development of the best plan possible for the community and allows your group to accurately share its story and results with key stakeholders. It can also help you track and understand community trends that may have an impact on your group’s ability to sustain its work.

A good evaluation monitors progress and provides regular feedback so that your strategic plan can be adjusted and improved. Your group may implement a variety of activities aimed at changing community systems and environments. By tracking information related to these activities and their effectiveness, as well as stakeholder feedback, community changes, and substance misuse and abuse outcomes, you can build a regular feedback loop for monitoring your progress and results. With this information, you can quickly see which strategies and activities have a greater impact than others, determine areas of overlap, and find ways to improve your group’s functioning. Using information from the evaluation, your group can adjust its strategic plan and continually improve its ability not only to sustain its work, but also to achieve community-wide reductions in substance misuse and abuse and its consequences.

STEP 5: Evaluation comprises the following primary tasks:

TASK 1: Conduct Process Evaluation

Your evaluation plan should address questions related to both process (i.e., program operations, implementation, and service delivery) and outcomes (the ultimate impact of your intervention).

A process evaluation monitors and measures your activities and operations. It addresses such issues as consistency between your activities and goals, whether activities reached the appropriate target audience(s), the effectiveness of your management, use of program resources, and how your group functioned.

Process evaluation questions may include the following:

  • Were you able to involve the members and sectors of the community that you intended to involve at each step of the way? In what ways were they involved?
  • Did you conduct an assessment of the situation in the way you planned? Did it give you the information you needed?
  • How successful was your group in selecting and implementing appropriate strategies? Were these the “right” strategies, given the intervening variables you identified?
  • Were staff and/or volunteers the right people for the jobs, and were they oriented and trained before they started?
  • Was your outreach successful in engaging those from the groups you intended to engage? Were you able to recruit the number and type of participants needed?
  • Did you structure the program as planned? Did you use the methods you intended? Did you arrange the amount and intensity of services, other activities, or conditions as intended?
  • Did you conduct the evaluation as planned?
  • Did you complete or start each element in the time you planned for it? Did you complete key milestones or accomplishments as planned?

TASK 2: Conduct Outcome Evaluation

An outcome evaluation looks at the intervention’s effect on the environmental conditions, events, or behaviors it aimed to change (whether to increase, decrease, or sustain). Usually, an intervention seeks to influence one or more particular behaviors or conditions (such as risk or protective factors), assuming that this will then lead to a longer-term change, such as a decrease in the use of a particular drug among youth.

You may have followed your plan completely and still had no impact on the conditions you were targeting, or you may have made many adjustments along the way and still reached your desired outcomes. The process evaluation will tell you how closely your plan was followed, and the outcome evaluation will show whether your strategies achieved the changes or results you intended.

At a minimum, your community should strive to put measures in place that allow you to track the problem of interest over time.

  • Example: Comparing the percentage of high school students in the community who report past 30-day misuse of pain relievers prior to the implementation of any strategies (baseline) and again at the end of the project (and ideally at multiple points during the project) will help you identify whether the issue is getting better, getting worse, or remaining the same over time.
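
The comparison described above can be tracked with simple arithmetic once the survey data are in hand. The following Python sketch is illustrative only: the survey counts, variable names, and two-wave design are hypothetical assumptions, not a prescribed instrument or data source.

```python
# Minimal sketch, assuming hypothetical two-wave survey counts.
# All numbers and names below are illustrative.

def prevalence(reporting_misuse: int, respondents: int) -> float:
    """Percentage of respondents reporting past 30-day misuse."""
    return 100.0 * reporting_misuse / respondents

# Hypothetical survey results (counts invented for illustration).
baseline = prevalence(reporting_misuse=86, respondents=1200)   # before any strategies
followup = prevalence(reporting_misuse=61, respondents=1150)   # end of project

change = followup - baseline
direction = "improving" if change < 0 else "worsening" if change > 0 else "unchanged"

print(f"Baseline: {baseline:.1f}%   Follow-up: {followup:.1f}%   "
      f"Change: {change:+.1f} percentage points ({direction})")
```

Collecting the same measure at several points during the project, rather than only at baseline and the end, makes the resulting trend much easier to interpret.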

Other strategies to enhance the quality of your evaluation:

  • Measure changes in the intervening variables over time—this will help demonstrate whether any changes in the long-term outcomes are related to the intervening variables targeted by your strategies.
  • Measure changes in the short-term outcomes that are the expected antecedents of changes in your intervening variables—this will help you determine whether your strategies are having their desired effect.
  • Examine whether there is a dose-response relationship between your short-, intermediate-, and long-term measures and variations in the amount (dose) of prevention services received by different individuals.
  • Compare differences in short-, intermediate-, and long-term measures between individuals who were exposed to the intervention and those who were not (see the sketch after this list).
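
The last two items lend themselves to a simple grouped summary. The sketch below is a minimal example under stated assumptions: the participant records, session counts, dose cut-points, and outcome scale are all invented for illustration and do not represent a required data format or analysis method.

```python
# Minimal sketch, assuming hypothetical participant records.
# Field names, dose categories, and the 1-5 outcome scale are illustrative.
from collections import defaultdict
from statistics import mean

# Each record pairs prevention sessions attended (the "dose") with a
# short-term outcome score, e.g., perceived risk of harm on a 1-5 scale.
participants = [
    {"sessions": 0, "risk_score": 2.1},
    {"sessions": 0, "risk_score": 2.4},
    {"sessions": 3, "risk_score": 2.9},
    {"sessions": 4, "risk_score": 3.1},
    {"sessions": 8, "risk_score": 3.8},
    {"sessions": 9, "risk_score": 3.6},
]

def dose_group(sessions: int) -> str:
    """Bucket exposure into unexposed / low-dose / high-dose categories."""
    if sessions == 0:
        return "unexposed"
    return "low dose" if sessions <= 4 else "high dose"

scores = defaultdict(list)
for person in participants:
    scores[dose_group(person["sessions"])].append(person["risk_score"])

# Mean outcomes that rise steadily with dose are consistent with a
# dose-response relationship; the pattern alone does not prove causation.
for group in ("unexposed", "low dose", "high dose"):
    print(f"{group:>9}: mean risk score {mean(scores[group]):.2f} (n={len(scores[group])})")
```

Comparing the unexposed group with the exposed groups addresses the final item in the list; examining whether the exposed groups differ from one another by dose addresses the dose-response question.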

TASK 3: Recommend Improvements and Make Mid-Course Corrections

If the intervention produced the outcomes you intended, then it achieved its goals. However, it is still important to consider how you could make the intervention even better and more effective. For instance:

  • Can you expand or strengthen parts of the intervention that worked particularly well?
  • Are there evidence-based methods or best practices that could make your work even more effective?
  • Would targeting more or different behaviors or intervening variables lead to greater success?
  • How can you reach people who dropped out early or who did not benefit from your work?
  • How can you improve your outreach? Are there marginalized or other groups you are not reaching?
  • Can you add services—either directly aimed at intervention outcomes, or related services such as transportation—that would improve results for participants?
  • Can you improve the efficiency of your process, saving time and/or money without compromising your effectiveness or sacrificing important elements of your intervention?

Good interventions are dynamic; they keep changing and experimenting, always reaching for something better.

TASK 4: Report Evaluation Results

Sharing your evaluation results can stimulate support from funders, community leaders, and others in the community. The best way to ensure the use of your data is to communicate your findings in ways that meet the needs of your various stakeholders. Consider the following:

  • Presentation. Think about how your findings are reported, including layout, readability, and user-friendliness, and who will present the information.
  • Timing. If a report is needed for the legislative session but is not ready in time, the chances of the data being used drop dramatically.
  • Relevance. If the evaluation design is logically linked to the purpose and outcomes of the project, the findings are far more likely to be put to use.
  • Quality. The quality of your evaluation design and analysis will influence whether your findings are taken seriously.
  • Post-Evaluation Technical Assistance. Questions of interpretation will arise over time, and people will be more likely to use the results if they can get their questions answered after the findings have been reported.

Evaluations are always read within a particular political context or climate. Some evaluation results will be used because of political support, while others may not be widely promoted due to political pressure. Other factors, such as the size of your organization or program, may matter as well. Sometimes larger programs get more press; sometimes targeted programs do.

It is also important to consider competing information: Do results from similar programs confirm or conflict with your results? What other topics may be competing for attention?

It is helpful to develop a plan for disseminating your evaluation findings, taking these types of questions into consideration.