An evaluation plan should include a clear description of what data will be collected, from what sources and how, by whom, and when, as well as how the data will be analyzed. Placing this information in a matrix helps ensure that there is a viable plan for collecting all the data necessary to answer each evaluation question and that all collected data will serve a specific, intended purpose. The table below may be copied into another document, such as a grant proposal, and edited/expanded as needed. An example is provided on the next page.
An example evaluation timeline across 3 years for an ATE project. This example is an excerpt from the Evaluation Basics for Non-evaluators webinar. Access slides, recording, handout, and additional resources from bit.ly/mar18-webinar.
Highlights the four main steps of an ATE evaluation and provides detailed activities for each step. This example is an excerpt from the Evaluation Basics for Non-evaluators webinar. Access slides, recording, handout, and additional resources from bit.ly/mar18-webinar.
Creating a clear communication plan at the beginning of an evaluation can help project personnel and evaluators avoid confusion, misunderstandings, or uncertainty. The communication plan should be an agreement between the project’s principal investigator and the evaluator, and followed by members of their respective teams. This checklist highlights the decisions that need to be made when developing a clear communication plan.
Give your proposal a competitive edge with a strong evaluation plan. The National Science Foundation has issued a new solicitation for its Advanced Technological Education (ATE) program. It includes major changes to the guidelines for ATE evaluation plans. Attend this webinar to learn the key elements of a winning evaluation plan and strategies for demonstrating to reviewers that evaluation is an integral part of your project, not an afterthought. In addition, we’ll provide you with specific guidance for how to budget for an evaluation, locate a qualified evaluator, and describe results from prior NSF support with supporting evaluative evidence. You will receive updated tools to help prepare strong evaluation plans.
ATE Proposal Evaluation Plan Template
Data Collection Planning Matrix
Evaluator Biographical Sketch Template for National Science Foundation (NSF) Proposals
Evaluation Planning Checklist for ATE Proposals
Evaluation Questions Checklist for Program Evaluation
Guide to Finding and Selecting an Evaluator
Logic Models: Getting them Right and Using them Well [webinar]
Logic Model Template for ATE Projects and Centers
NSF Prior Support Checklist
Small-Scale Evaluation Webinar
Director of Evaluation and Assessment, NC State Industry Expansion Solutions
Assistant Director of Research Development and Evaluation, NC State Industry Expansion Solutions
Designing a rigorous and informative evaluation depends on communication with program staff to understand planned activities, how those activities relate to the program sponsor’s objectives, and the evaluation questions that reflect those objectives (see white paper related to communication). At NC State Industry Expansion Solutions, we have worked on evaluation projects long enough to know that such communication is not always easy, because program staff and the program sponsor often look at the program from two different perspectives: program staff focus on work plan activities (WPAs), while the program sponsor may be more focused on the evaluation questions (EQs). So, to help facilitate communication at the beginning of an evaluation project and assist in the design and implementation, we developed a simple matrix technique to link the WPAs and the EQs (see below).
For each of the WPAs, we link one or more EQs and indicate what types of data collection events will take place during the evaluation. During project planning and management, the crosswalk of WPAs and EQs will be used to plan out qualitative and quantitative data collection events.
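The crosswalk can be kept as a simple lookup structure. The sketch below is illustrative only; the activity names, question IDs, and data collection events are hypothetical, not from an actual project.

```python
# Illustrative WPA-to-EQ crosswalk matrix. All names below are
# hypothetical examples, not from an actual evaluation.
crosswalk = {
    "WPA1: Develop curriculum modules": {
        "eqs": ["EQ1", "EQ2"],
        "data_collection": ["document review", "instructor interviews"],
    },
    "WPA2: Deliver summer workshop": {
        "eqs": ["EQ2", "EQ3"],
        "data_collection": ["participant survey", "observation"],
    },
}

# During planning, inverting the crosswalk confirms that every
# evaluation question is covered by at least one activity.
eq_coverage = {}
for wpa, links in crosswalk.items():
    for eq in links["eqs"]:
        eq_coverage.setdefault(eq, []).append(wpa)

print(sorted(eq_coverage))  # every EQ should appear here
```

Inverting the matrix in both directions is the quality check: an EQ with no linked WPA has no planned evidence, and a WPA with no linked EQ is collecting data with no intended purpose.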
The above framework may be more helpful with the formative assessment (process questions and activities). However, it can also enrich the knowledge gained by the participant outcomes analysis in the summative evaluation in the following ways:
Understanding how the program has been implemented will help determine fidelity to the program as planned, which will help determine the degree to which participant outcomes can be attributed to the program design.
Details on program implementation that are gathered during the formative assessment, when combined with evaluation of participant outcomes, can suggest hypotheses regarding factors that would lead to program success (positive participant outcomes) if the program is continued or replicated.
Details regarding the data collection process that are gathered during the formative assessment will help assess the quality and limitations of the participant outcome data, and the reliability of any conclusions based on that data.
So, for us this matrix approach is a quality-check on our evaluation design that also helps during implementation. Maybe you will find it helpful, too.
This template is for use in preparing the evaluation plan sections for proposals to the National Science Foundation’s Advanced Technological Education (ATE) program. It is based on the ATE Evaluation Planning Checklist, also developed by EvaluATE. It is aligned with the evaluation guidance included in the 2017 ATE Program Solicitation. All proposers should read the solicitation in full.
Updated July 2017!
This checklist is intended to assist grant writers, project leaders, and evaluators as they develop evaluation plans for proposals to the National Science Foundation’s Advanced Technological Education (ATE) program. It is organized around the components of an NSF proposal (see the NSF Grant Proposal Guide), with an emphasis on the evaluation elements that are needed in several locations throughout a grant proposal. This document is not intended to serve as a comprehensive checklist for preparing an ATE proposal. Rather, it includes guidance for aspects of a proposal that pertain to evaluation. All proposers should carefully read the 2017 ATE Program Solicitation.
This video was created for the 2017 Mentor-Connect Cohort but can be applicable to others interested in learning about ATE evaluation. Specifically, the video provides an overview of four questions: What is project evaluation? Why does NSF require evaluation? How do you plan for evaluation? And how can EvaluATE help?
Northeast Wisconsin Technical College’s (NWTC) Grants Office works closely with its Institutional Research Office to create ad hoc evaluation teams in order to meet the standards of evidence required in funders’ calls for proposals. Faculty members at two-year colleges often make up the project teams that are responsible for National Science Foundation (NSF) grant project implementation. However, they often need assistance navigating the terms and concepts that are traditionally found in scientific research and social science methodology.
Federal funding agencies are now requiring more evaluative rigor in their grant proposals than simply documenting deliverables. For example, the NSF’s Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) program saw dramatic changes in 2015: The program solicitation raised the allowable non-scholarship budget from 15% of the scholarship amount to 40% of the total project budget, in order to strengthen supports for students and to investigate the effectiveness of those supports.
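To make the scale of that change concrete, here is a back-of-the-envelope calculation using a hypothetical scholarship budget (the $600,000 figure is assumed for illustration, not from the solicitation):

```python
# Hypothetical figures to illustrate the 2015 S-STEM budget change.
scholarships = 600_000  # assumed scholarship budget, in dollars

# Old rule: non-scholarship funds capped at 15% of the scholarship amount.
old_cap = 0.15 * scholarships   # about $90,000

# New rule: non-scholarship funds are 40% of the *total* project budget,
# so scholarships make up the remaining 60% of the total.
total = scholarships / 0.60     # about $1,000,000 total project budget
new_cap = 0.40 * total          # about $400,000

print(round(old_cap), round(new_cap))
```

For the same scholarship amount, the non-scholarship allowance grows more than fourfold, which is why the solicitation change put so much more weight on evaluating student supports.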
Technical colleges, in particular, face a unique challenge as solicitations change: These colleges traditionally have faculty members from business, health, and trades industries. Continuous improvement is a familiar concept to these professionals; however, they tend to have varying levels of expertise evaluating education interventions.
The following are a few best practices we have developed for assisting project teams in grant proposal development and project implementation at NWTC.
- Where possible, work with an external evaluator at the planning stage. External evaluators are well-versed in current evaluation methods, trends, and techniques and can provide expertise that principal investigators and project teams might lack.
- As they develop their projects, teams should meet with their Institutional Research Office to better understand data gathering and research capacity. Some data needed for evaluation plans might be readily available, whereas other data might require advance planning to develop a system to track information. Conversations about what the data will be used for and what questions the team wants to answer will help ensure that the correct data can be gathered.
- After a grant is awarded, hold an early conversation with all internal and external evaluation parties to clarify data roles and responsibilities. Agreeing on reporting deadlines and identifying who will collect the data and conduct further analysis will help avoid delays.
- Create a “data dictionary” for more complicated projects and variables to ensure that everyone is on the same page about what terms mean. For example, “student persistence” can be defined term-to-term or year-to-year, and all parties need to understand which data will need to be tracked.
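A data dictionary entry does not need to be elaborate; a shared table with a definition, granularity, source, and owner for each variable is enough. Here is a minimal sketch (the variable names, definitions, and offices are hypothetical examples):

```python
# Minimal sketch of a project data dictionary.
# Variable names, definitions, and sources are hypothetical.
data_dictionary = {
    "student_persistence": {
        "definition": "Re-enrollment from one fall term to the following fall term",
        "granularity": "year-to-year",  # the team must pick one: term-to-term or year-to-year
        "source": "Institutional Research enrollment records",
        "collected_by": "IR Office",
        "reported": "annually",
    },
    "course_completion": {
        "definition": "Final grade of C or better in a target course",
        "granularity": "term-to-term",
        "source": "Registrar grade file",
        "collected_by": "IR Office",
        "reported": "each term",
    },
}

# Any team member can look up exactly how a term is defined.
print(data_dictionary["student_persistence"]["granularity"])
```

Keeping the dictionary in one shared place means the grants office, project team, and evaluator all report against the same definitions.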
With some planning and the right working relationships in place, two-year colleges can maintain their federal funding competitiveness even as agencies increase evaluation requirements.