Project evaluators understand that evaluation aims to support learning and improvement. Through a series of planned interactions, event observations, and document reviews, the evaluator is charged with reporting to the project leadership team and, ultimately, the project’s funding agency, informing audiences of the project’s merit. This is not to suggest that reporting should aim only to identify the project’s positive impacts and outcomes; there is equal value in informing audiences of unintended and unattained outcomes.

Evaluation reporting should discuss all aspects of the project’s outcomes, whether anticipated, questionable, or unintended. When examining project outcomes, the evaluator analyzes the information obtained and guides project leadership through reflective thinking exercises to define the project’s significance and summarize why its outcomes matter.

Let’s be clear: outcomes are not to be regarded as something negative. In fact, in the projects I have evaluated over the years, outcomes have frequently served as an introspective platform informing future curriculum decisions and directions within the recipient institution. For example, the outcomes of one STEM project focused on renewable energy technicians gave the institution information that prompted the development of subsequent proposals and projects targeting engineering pathways.

Discussing and reporting project outcomes also captures lessons learned and affords the evaluator the opportunity to ask questions such as:

  • Did the project increase the presence of the target group in identified STEM programs?
  • What initiatives will be sustained after funding ends to maintain an increased presence of the target group in STEM programs?
  • Did project activities contribute to the retention/completion rates of the target group in identified STEM programs?
  • Which activities seemed to have the greatest/least impact on retention/completion rates?
  • In hindsight, are there activities not implemented as part of the project that could have contributed more significantly to retention/completion rates?
  • To what extent did the project supply regional industries with a more diverse STEM workforce?
  • What effect will this have on regional industries after project funding ends?
  • Were the partners identified in the proposal realistic contributors to the funded project? Did they enable a successful implementation and the attainment of anticipated outcomes?
  • What was learned about the characteristics of “good” and “bad” partners?
  • What characteristics should be sought out, or avoided, to maximize productivity in future work?

Factors influencing outcomes include, but are not limited to:

  • Institutional changes, e.g., leadership;
  • Partner constraints or changes; and
  • Project/budgetary limitations.

It is not unusual for a proposed project to be somewhat grandiose in identifying intended outcomes. Yet once implementation gets underway, intended activities may be compromised by external challenges. For example, when equipment is needed to support various aspects of a project, procurement and production channels may delay equipment acquisition, adversely affecting project leadership’s ability to launch planned components of the project.

As a tip, it is worthwhile for those seeking funding to pose these outcome questions at the front end of the project, when the proposal is being developed. Doing so will help them conceptualize the intellectual merit and impact of the proposed project.

Resources and Links:

Developing an Effective Evaluation Report: Setting the Course for Effective Program Evaluation. Atlanta, Georgia: Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health, Division of Nutrition, Physical Activity and Obesity, 2013.

About the Author

Dr. Diana Pollard McCauley Williams

Education Administrator, Independent

Diana received her bachelor’s degree in Elementary Education with a minor in Mathematics from Cheyney University; master’s degrees in Mathematics Education (Temple University) and Library Science (Villanova University); and her doctorate in Higher Education Administration (Temple University). Her professional experience includes teaching, administration, educational sales, consulting, and grant evaluation. For almost two decades, she has served as an evaluator of projects funded by the US Departments of Education and Labor, the National Science Foundation, and private organizations. Diana has long been on the front line of educational, political, and youth-focused projects, garnering recognition for her service from numerous entities.

EvaluATE is supported by the National Science Foundation under grant number 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.