If a prospective client says, “We need an evaluation, and we will send you the dataset for evaluation,” our advice is that this type of “drive-by evaluation” may not be in their best interest.

As calls for program accountability and data-driven decision making increase, so does demand for evaluation. Given this context, evaluation services are being offered in a variety of modes. Before choosing an evaluator, we recommend that clients pause to consider what they would like to learn about their efforts and how evaluation can add value to that learning. This perspective requires moving beyond data analysis and reporting of required performance measures to examining what is occurring inside the program.

By engaging our clients in conversations about what they would like to learn, we are able to begin a collaborative and discovery-oriented evaluation. Our goal is to partner with our clients to identify and understand strengths, challenges, and emerging opportunities related to program/project implementation and outcomes. This process helps clients understand not only which strategies worked but also why they worked, and it lays the foundation for sustainability and scaling.

These initial conversations can be a bit of a dance, as clients often focus on funder-required accountability and performance measures. This is when it is critically important to elucidate the differences between evaluation and auditing or inspecting. Ann-Murray Brown examines this question and provides guidance as to why evaluation is more than just keeping score in Evaluation, Inspection, Audit: Is There a Difference? As we often remind clients, “we are not the evaluation police.”

During our work with clients to clarify logic models, we encourage them to think of their logic model in terms of storytelling. We pose commonsense questions such as: When you implement a certain strategy, what changes do you expect to occur? Why do you think those changes will take place? What do you need to learn to support current and future strategy development?

Once our client has clearly outlined their “story,” we move quickly to connect data collection to client-identified questions and, as soon as possible, we engage stakeholders in interpreting and using their data. We incorporate Veena Pankaj and Ann Emery’s (2016) data placemat process to engage clients in data interpretation. By working with clients to fully understand their key project questions, focus on what they want to learn, and engage in meaningful data interpretation, we steer clear of the potholes associated with drive-by evaluations.

Pankaj, V., & Emery, A. (2016). Data placemats: A facilitative technique designed to enhance stakeholder understanding of data. In R. S. Fierro, A. Schwartz, & D. H. Smart (Eds.), Evaluation and Facilitation. New Directions for Evaluation, 149, 81–93.

About the Authors

John Cosgrove

Senior Partner, Cosgrove & Associates

John Cosgrove is a senior partner with Cosgrove & Associates. Mr. Cosgrove has extensive evaluation experience, and Cosgrove & Associates is currently leading the evaluations for a number of external grants, including efforts with the Department of Labor, National Science Foundation, Department of Education, and Department of Agriculture. In addition, Mr. Cosgrove’s emphasis on collaborative evaluation breathes life into evaluation processes and helps ensure faculty and staff have the information they need to explore promising practices and use lessons learned for continuous improvement. Specific areas of expertise include developmental and utilization-focused evaluation, institutional research and strategic planning, development of user-friendly decision-support data systems, return on investment analysis, career pathway development, and experiential and active learning instructional modes. Mr. Cosgrove is committed to social justice and efforts to enhance equity in student outcomes for all students.

Maggie Cosgrove

Senior Partner, Cosgrove & Associates

Maggie Cosgrove is a senior partner with Cosgrove & Associates. Ms. Cosgrove has deep experience in evaluation methods, policy analysis, grant management, and community college impact. Her experience includes evaluating Department of Labor, National Science Foundation, Department of Education, and Department of Agriculture grants. Ms. Cosgrove has a proven track record of providing excellent training and customer service to college partners. Specific areas of expertise include developmental and utilization-focused evaluation, developmental education redesign, development of career pathways, experiential and active learning instructional modalities, return on investment analysis, and employer and community stakeholder engagement. Ms. Cosgrove is committed to social justice and efforts to enhance equity in student outcomes for all students.

EvaluATE is supported by the National Science Foundation under grant number 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.