This module helps executive leadership and program directors work through an evaluation plan, identify appropriate overarching evaluation questions, and understand what types of external evaluations are appropriate at what stages.

Careful consideration of this module helps you spend your organization’s resources wisely—and get meaningful evaluation results—by engaging external evaluators for the right types of evaluation at the right times.

Step 1: Learn

User Guide

Introduction to the Performance Practice, acknowledgments, application, and development methodology.

Download User Guide

Step 2: Collect Data

Worksheet

Use this worksheet to complete the self-assessment.

Download Worksheet

Step 3: Review Results

Reporting Tool

The reporting app compiles your results; no manual aggregation required.

Reporting App for Windows
Reporting App for macOS (v10.13 High Sierra and above)

Evaluation Principles and Proof Points


Principle 7.1: Leaders complement internal monitoring with external evaluations conducted by highly skilled, independent experts.

7.1.1: To help drive improvements in our organization, we periodically arrange for external evaluations conducted by experts with credibility in the field.

Principle 7.2: Leaders commission external assessments to learn more about how well their programs are being run, what these programs are or are not accomplishing, who is or is not benefiting, and how the programs can be strengthened. Leaders do not use external assessments as a one-time, up-or-down verdict on the organization’s effectiveness.

7.2.1: My organization’s external evaluations are designed to assess the reliability and validity of our internal performance data; the quality of our implementation; and the overall effectiveness of our efforts.

Principle 7.3: Leaders recognize that there are many different types of external assessments, and no one type is right for every organization or for every stage of an organization’s development. Independent evaluators who understand how different methodologies fit different contexts can help leaders match the tool to the task.

7.3.1: My organization has adopted a formal external evaluation plan that spells out the different types of evaluations that will be relevant for us at different stages of our development. We update the plan periodically.

7.3.2: My organization’s plan includes formative (implementation) evaluation to help us determine:

  • the quality of our internal data and program implementation
  • whether we are delivering programs with fidelity to our model
  • how well we are doing at recruiting and enrolling the population for which our programs are designed
  • our program utilization, program completion, and participant engagement
  • which clients achieve the intended outcomes, which do not, and which exit the program prematurely.

7.3.3: My organization’s evaluation plan includes summative (impact) evaluation of programs that have been running as intended for several years, to help us determine whether we’re making a difference beyond what would have happened anyway.

Principle 7.4: Leaders draw a clear distinction between outputs (e.g., meals delivered, youth tutored) and outcomes (meaningful changes in knowledge, skills, behavior, or status). Those who are working to improve outcomes commission evaluations to assess whether they are having a positive net impact. In other words, they want to know to what extent, and for whom, they’re making a meaningful difference beyond what would have happened anyway.

7.4.1: My organization’s internal performance data clearly distinguish between outputs and outcomes—and have been validated by independent experts.

7.4.2: My organization’s external evaluators use output data to help us learn about program quality and fidelity.

7.4.3: My organization’s external evaluators use outcome data to help us determine whether we’re making a difference beyond what would have happened anyway. This requires using a reliable research design to compare data from our participants with data from similar people who did not receive our services.

Principle 7.5: Leaders who plan to expand significantly any programs aimed at improving outcomes have a special obligation to commission a rigorous evaluation that can assess net impact.

7.5.1: If my organization plans to grow significantly, we are conducting (or have conducted) both rigorous formative (implementation) and summative (impact) evaluations—with enough lead time to allow us to make critical adjustments and ensure that expanded programs will have the best chance of achieving net impact for those we serve.

7.5.2: My organization has put (or would put) growth plans on hold, and would look to redesign them before resuming growth, if evaluation findings show that we're having significant trouble with implementation or that our clients are not benefiting in the ways we had expected.

Principle 7.6: Even those leaders who commission the most rigorous of impact evaluations do not stop there. They commission additional assessments to gauge their impact in new settings (or for new populations) and achieve greater positive impact for the money they spend.

7.6.1: My organization conducts new external evaluations (formative or summative) whenever we make significant program changes, operate programs in new contexts, and/or enroll different target populations.

7.6.2: My organization periodically conducts new summative evaluations, because the societal context in which our organization and programs operate constantly changes.

Principle 7.7: Leaders share the methodology and results of their external assessments to help others learn and avoid mistakes.

7.7.1: My organization shares our evaluation plans throughout the organization and with interested stakeholders.

7.7.2: My organization shares our evaluation findings throughout the organization as the basis for strengthening our programs and with external stakeholders who can benefit from the knowledge.

Summary report: shows the distribution of ratings for each proof point, gives a picture of the level of consensus, and opens the door to productive conversations about ways to move forward.

Detail report: for a thorough understanding of individual perspectives, drill down to see each respondent's ratings and comments for each proof point.
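The Summary report aggregates individual ratings per proof point. As a minimal sketch of that kind of aggregation (not the reporting app's actual implementation), assuming ratings are collected on a numeric scale; the function name and the consensus measure shown here are hypothetical:

```python
from collections import Counter

def summarize_proof_point(ratings):
    """Aggregate self-assessment ratings for one proof point.

    Returns the distribution of ratings and a simple consensus
    measure: the share of respondents who chose the most common rating.
    """
    counts = Counter(ratings)
    modal_rating, modal_count = counts.most_common(1)[0]
    return {
        "distribution": dict(counts),   # how many respondents gave each rating
        "modal_rating": modal_rating,   # the most common rating
        "consensus": modal_count / len(ratings),  # agreement with the mode
    }

# Example: seven respondents rate one proof point on a 1-4 scale
result = summarize_proof_point([3, 3, 4, 3, 2, 3, 4])
```

A wide distribution (low consensus) flags a proof point worth discussing as a team before deciding on next steps.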


Downloads (PDF):

  • Performance Practice
  • Module 7: Evaluation Principles and Proof Points
  • Module 7: Evaluation Worksheet
  • Performance Practice User Guide
  • Performance Practice All Modules


The Leap of Reason Ambassadors Community is a private community of experts and leaders who believe that mission and performance are inextricably linked. Our resources are collaboratively developed and offered to the field to support organizations on their journeys to high performance.


© Copyright 2021 | Leap of Reason Ambassadors Community. All Rights Reserved.