Updated June 15, 2020
What should be included in an ATE proposal’s evaluation plan? What do reviewers want to see? While both of these are important questions, it is also important to make sure you have an evaluation plan in place from the beginning that will provide good information about how well your project is going and what impact you are having. Based on my experiences as a reviewer and participant in many panels, I recommend you consider these things when preparing the evaluation section of your ATE proposal. A good evaluation plan almost always raises the proposal ratings.
10 Helpful Hints
1. Identify an evaluator in advance and include that person’s name and qualifications in the proposal. If possible, the evaluator should have experience and expertise in evaluating programs at two-year colleges and/or workforce-related projects. A few sentences should be included in the proposal itself about the evaluator’s expertise. For ATE, a two-page vita should be included in the supplementary documents section and tailored, using the required NSF biographical sketch format, to demonstrate the evaluator’s qualifications to evaluate this particular project. (See NSF PAPPG 20-1 and EvaluATE’s biosketch template https://www.evalu-ate.org/resources/biosketch/ for guidance.) If your institutional policy does not allow you to identify an evaluator prior to funding, it is still important to note the skills and qualifications you will be seeking in an evaluator in the evaluation section of the proposal. The evaluator should not be a co-principal investigator and should have independence from the project. For smaller projects, the evaluator could be in the institutional research office or other department at the institution, but the evaluator for larger projects should be external to the institution. (Note: Many other NSF programs do not allow supplementary documents, have their own supplementary documents requirements, or follow the PAPPG exactly. Check the program solicitation to see if you can or should include a vita of the evaluator. In any case, you will want to discuss the evaluator’s expertise in the evaluation section if you cannot include their vita.)
2. Work with your evaluator to carefully match the evaluation plan with the project goals, objectives, outcomes, and activities. Effective evaluation starts with carefully defined project goals and expected outcomes. Some people include a summary evaluation table that presents the goals and objectives with corresponding evaluation measures for each objective.
a. Outcomes (deliverables) should be stated in measurable terms. For example, if the goal of the project is to produce more biotechnology graduates with the skills needed by industry, say that the project will “increase the number of students who graduate with an associate degree in biotechnology from the current 10 per year to 30 or more per year.” That statement is much better than the vague, unmeasurable alternative, “to increase the number of students who graduate with an associate degree in biotechnology.” Similarly, the proposal should state how it will determine the skills needed by industry and how it will evaluate whether the graduates indeed have these skills. Be sure to include data to provide a baseline for measurement.
b. Expected outcomes should identify specific observable results for each goal. For example, how will achieving your goal of developing a new biotechnology course be reflected in student learning? In later job productivity?
c. The development of measurable outcomes may take several iterations with your evaluator, but these iterations should result in a more focused and stronger project.
3. Design the evaluation to provide evidence about what is working and where adjustments and improvements are needed. How will the evaluation be used to guide the project? Information provided by the evaluation must be both useful and important. Relative to the goals and objectives of the project, how will the project’s staff know what is working and why, and what is not working and why? How will you use evaluation data to determine which project activities should be sustained? How will you use evaluation data for subsequent proposals?
4. Remember that, while accountability is important, evaluation of impact and effectiveness is vital. Of course, the project evaluation should determine whether the activities of the grant are being conducted in a timely manner. For example, if professional development workshops are being provided for 20 community college faculty in three consecutive years, then information should be collected about how many faculty attended and how many students were taught by these faculty. But the real question is whether faculty implemented what they learned in their classrooms and what impact this had on the preparation of technicians for the workplace. If faculty did not use the information in their classes, why not? Did they not learn the material? Was the material not appropriate for their classes? Did they not have the needed equipment? Were administrators not supportive? If faculty did implement the material in their classes, did their students learn skills valued by employers?
5. Evaluate both short- and long-term goals, develop indicators to measure progress, and create timelines. A proposal may state the long-term goal as: “to prepare technicians who are hired by companies in the region and are productive and highly valued.” Related short-term goals may be to attract more high school students to the community college technical programs, to have more students who are at the college in general education programs pursue technical degrees or take technical courses, or to place more students in relevant internships. An indicator that “more students will graduate from a technical program” may be that “more students enroll in the beginning courses and are retained from semester to semester.” The proposal should include some information about when different components of your project will be evaluated. For example, your evaluator may recommend administering a short survey to workshop participants immediately following the event and then collecting data 6 months to a year later about the use of the material in classes, effectiveness of the materials with students, and possible obstacles faced.
6. Develop the evaluation plan jointly with the evaluator(s). You know the project, but the evaluator provides evaluation expertise and an outside perspective. Evaluation should be a team effort. For example, you know what is important for you and various stakeholders in the project. Your evaluator should then develop ways to determine what is being accomplished and the impact the project is having. Your evaluator can help you translate your goals and objectives into measurable outcomes. The evaluator should have enough independence to let you know what is working, what is not working, and why, but in most cases the evaluator works for you and the evaluation must be valuable for you and the project.
7. Assign responsibilities for various components of the evaluation. Where and how will you get the data, and from whom? It is important to have common goals, to work together, and to be able to count on others to provide the data when needed and in a useful format. For example:
a. You as the principal investigator, project director, or some other member of the project team may collect data and provide it to the evaluator. You may record how many people attend workshops and whether participants like these workshops. You may distribute surveys that respondents complete and return to the evaluator, and/or recommend items to be included in the surveys.
b. If you have partners at other community colleges, then you (perhaps with your evaluator) may develop a template for reporting the data. You may decide whether to collect this data in hard copy or via a website or email. You may provide the data in raw form to the evaluator, or you may do the initial analysis and synthesis.
c. The institutional research office at your college (and perhaps also those at partner colleges) may collect data on students and provide it to you in summary form (e.g., how many students enroll in certain classes, how many pass the course, what are their average grades, how many enroll in subsequent courses). For privacy reasons, these offices may be the only ones who can collect this data.
d. Your evaluator may conduct focus groups, survey participants, or collect data directly.
8. Draw from established evaluation practices to create an evaluation plan. Your plan should include evaluation references and, if appropriate, information about your instruments. Attend workshops, webinars, and presentations on evaluation, and talk to others about evaluation instruments they use. Utilize the many resources of EvaluATE (https://evalu-ate.org/), the evaluation support center for the National Science Foundation’s Advanced Technological Education program. EvaluATE provides webinars, resource materials, newsletters, workshops, and opportunities for ATE community members to engage around issues related to evaluation in the pursuit of excellence in technical education. Other references often quoted in ATE materials include:
- NSF’s User-Friendly Handbook for Project Evaluation
- Online Evaluation Resource Library (OERL)
- Field-Tested Learning Assessment Guide (FLAG)
- Student Assessment of Their Learning Gains (SALG)
- The National Research Council’s 2003 book Evaluating and Improving Undergraduate Teaching in STEM
- ATE Proposal Evaluation Plan Template
- Reaching Students: What Research Says About Effective Instruction in Undergraduate Science and Engineering
- Evaluation Basics for Non-Evaluators
9. Develop indicators for project goals and objectives with your evaluation stakeholders in mind (e.g., project personnel, administrators at the college, faculty, industry, NSF, and others). While your project funder (such as NSF) is one stakeholder, there are many others. For example, if you want the courses developed by your project to continue after funding ends, your administrators may want to know how many students enrolled in these courses and passed. Will there be enough students in the program to support its continuation? Industry may want to know if the curricular materials reflect current practices and if students have the professional or soft skills needed as well as relevant experiences.
10. Use at least 1 (up to 2.5) of the 15 proposal pages to explain the evaluation. Write the evaluation plan in plain English. Most evaluation sections in proposals are too short. Having a detailed, well-written evaluation plan demonstrates not only that you want the money, but that you are serious about wanting to know what works and why.
10 Fatal Flaws
While the above recommendations will help you write better evaluation sections, many proposals have fatal or near-fatal flaws. Among these are:
The evaluation section …
1. Is missing or appears to be an afterthought. While this is increasingly rare, some proposals still fail to include an evaluation section. Or they may include only a few sentences about evaluation in a section that combines evaluation and dissemination near the end of the proposal, after most of the allotted 15 pages have been used.
2. States, “After we get the funding, we will develop an evaluation plan,” or says that the evaluation will be developed using NSF’s User-Friendly Handbook for Project Evaluation. This boilerplate evaluation information essentially tells reviewers that you haven’t thought much about evaluation and will just wait and see if you are funded before you develop a plan.
3. Only evaluates easy things (e.g., attitudes). For example, while knowing that people enjoyed your workshops is nice, it is more important to know whether they actually learned anything and ultimately used it in their own classrooms.
4. Has an unreasonable or unrealistic budget (e.g., a complex plan with a tiny budget or vice versa) and fails to explain how costs were estimated. Even though final budget negotiations are between the NSF program officer and principal investigator, reviewers do look at the evaluation budget. Most proposals include very little information about how the budget for evaluation was determined or what the evaluation dollars buy. The budget may include a large amount (e.g., 12 percent or $12,000 per year on a $100,000-per-year project) with little or no information about how that was determined or how it will be spent. Or the proposal may include a very complex evaluation plan but allot a small budget amount (e.g., 2 percent or $2,000 per year on a $100,000-per-year budget) to the evaluation. Remember that you should state in the budget justification who will do the evaluation and the anticipated time and rate for this person. The time and dollars allotted must be reasonable and justifiable (e.g., 10 days per year at $xxx per day, which includes 3 days working on campus with the project team, 4 days for data gathering, 2 days of analysis, and 1 day of report writing). The daily or hourly rate should reflect what the person normally earns and cannot be inflated for NSF proposals.
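The budget arithmetic above can be sketched in a few lines. The day breakdown (3 + 4 + 2 + 1) comes from the example in the text; the $800 daily rate is an illustrative assumption, since the actual rate should reflect what your evaluator normally earns:

```python
# Sketch of an evaluation budget justification, per year.
# The daily rate below is an assumed figure for illustration only.
daily_rate = 800  # assumption: use your evaluator's actual rate
project_budget_per_year = 100_000

days = {
    "on-campus work with project team": 3,
    "data gathering": 4,
    "analysis": 2,
    "report writing": 1,
}

total_days = sum(days.values())                 # 10 days per year
evaluation_cost = total_days * daily_rate       # $8,000 per year
percent_of_budget = evaluation_cost / project_budget_per_year * 100

print(f"{total_days} days -> ${evaluation_cost:,} "
      f"({percent_of_budget:.0f}% of a ${project_budget_per_year:,} budget)")
```

Laying the numbers out this way makes it easy for reviewers to see that the evaluation allocation falls in a reasonable range, rather than a round figure with no visible basis.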
5. Does not align with the priorities of the funding program. When preparing both the proposal and the evaluation plan, read the program announcement several times and make sure that what you are proposing (and then evaluating) aligns with the goals of the funding program. Every program will have its own priorities and clearly state them in the program summary. All NSF programs also have a section near the end of the program solicitation that describes the merit review criteria against which the proposal will be reviewed. For example, the description of the ATE program on the NSF website states, “With an emphasis on two-year colleges, the Advanced Technological Education (ATE) program focuses on the education of technicians for the high-technology fields that drive our nation’s economy. The program involves partnerships between academic institutions and employers to promote improvement in the education of science and engineering technicians at the undergraduate and secondary school levels” (http://www.nsf.gov/ate). The section in the ATE program solicitation on the merit review criteria includes such questions as: Does the project have potential for improving student learning in science or engineering technician education programs? Has an assessment of workforce needs for technicians been conducted? Does the project work with employers to address their current and future needs for technicians?
6. States PIs will do all the evaluation. While the principal investigators and project staff should be involved in the development of the evaluation plan and even collect some of the information and data, the evaluation should be conducted by people external to the project itself.
7. Is too short and lacking in details. The evaluation plan should include enough specifics for the reviewers to feel confident that it is carefully developed and will provide relevant information during the project about what aspects of the project are working well and where improvements are needed (often called formative evaluation) and at the end of the project about its impact and effectiveness (often called summative evaluation).
8. Was cut-and-pasted from another proposal with few changes. Evaluation sections are often generic and appear to be boilerplate statements that were prepared by the evaluator with little relevance to the particular proposal. While some components of an evaluation may be the same for different proposals, the evaluation plan must be germane to the proposal being submitted. Too often evaluation plans have little to do with the goals and objectives of the project or were written for other projects. (Note: Your proposal may end up on a panel where several proposals have the same evaluator. If the proposal sections are identical or close to identical, reviewers will notice. Also, ATE may use reviewers who have received funding. Reviewers are not happy to see their own evaluation plan show up in a proposal verbatim.)
9. Uses too much jargon for reviewers to easily read or understand; is too complex. Most ATE proposals are reviewed by people with STEM content knowledge, not experts in evaluation.
10. States that the evaluation will be done using “name your favorite evaluation method,” but fails to explain this method or why it is appropriate. While evaluations should build on what others have done, proposers should not assume that reviewers will be familiar with all the evaluation literature. If the proposal designates a particular instrument or evaluation method, a few sentences explaining the method and why it was chosen are needed. If surveys and other instruments are to be used, reviewers will want some assurance that these instruments are valid and reliable and that project personnel have discussed with the evaluator why and how these will be used.
It is hoped that these suggestions may prove useful as you prepare your proposals and that you will contribute to the revision of this article. If you have a helpful hint or a fatal flaw that you would like to share with the author, please send it to firstname.lastname@example.org.