Evaluators are urged to make their evaluations useful. Project staff are encouraged to use their evaluations. An obvious way to support these aims is for evaluators to develop recommendations based on evidence and for project staff to follow those recommendations (if they agree with them, of course). But not all reports include recommendations, and sometimes the recommendations amount to little more than "keep up the good work!" If implications for action are not immediately obvious from an evaluation report, here are three questions project staff can ask themselves to spark thinking and decision making about how to use evaluation findings. I've included real-world examples based on our experience at EvaluATE.
1) Are there any unexpected findings in the report? The EvaluATE team has been surprised to learn that we are attracting a large number of grant writers and other grant professionals to our webinars. We initially assumed that principal investigators (PIs) and evaluators would be our main audience. With growing attendance among grant writers, we became aware that they are often the ones who first introduce PIs to evaluation, guiding them on what should go in the evaluation section of a proposal and how to find an evaluator. The unexpected finding that grant writers are seeking out EvaluATE for guidance made us realize that we should develop more tailored content for this important audience as we work to advance evaluation in the ATE program.
Talk with your team and your evaluator to determine if any action is needed related to your unexpected results.
2) What’s the least favorable finding from your evaluation? Although it can be uncomfortable to focus on a project’s weak points, those weak points are where the greatest opportunities for growth and improvement lie. Consider the probable causes of the problem and potential solutions. Can you solve the problem with your current resources? If so, make an action plan. If not, decide whether the problem is important enough to address through a new initiative.
At EvaluATE, we serve both evaluators and evaluation consumers with a wide range of interests and experience. When asked what EvaluATE needs to improve, several respondents to our external evaluation survey said they want webinars more tailored to their specific needs and skill levels. Some noted that our content was too technical, while others remarked that it was too basic. To address this issue, we decided to develop an ATE evaluation competency framework. Webinars will be keyed to specific competencies, which will help our audience decide which sessions are appropriate for them. We couldn’t carry out this research and development work with our current resources, so we wrote the activity into a new proposal.
Don’t sweep an unfavorable result or criticism under the rug. Use it as a lever for positive change.
3) What’s the most favorable finding from your evaluation? Give yourself a pat on the back, and then figure out whether this finding points to an aspect of your project you should expand. If you need more information to make that decision, determine what additional evidence could be gathered in the next round of evaluation. Help others learn from your successes—the ATE Principal Investigators Conference is an ideal place to share aspects of your work that are especially strong, along with your lessons learned and practical advice about implementing ATE projects.
At EvaluATE, we have been astounded by the interest in and positive response to our webinars. But we don’t yet fully understand the extent to which webinar attendance translates into improved evaluation practice. So we decided to start collecting follow-up data from webinar participants to check on their use of our content. With that additional evidence in hand, we’ll be better positioned to make an informed decision about expanding or modifying our webinar series.
Don’t just feel good about your positive results—use them as leverage for increased impact.
If you’ve considered your evaluation results carefully but still can’t identify a call to action, it may be time to rethink your evaluation’s focus. You may need to make adjustments to ensure it produces useful, actionable information. Evaluation plans should be fluid and responsive—it is expected that they will evolve to address emerging needs.