Archive: proposal development

Webinar: Give Your Proposal A Competitive Edge with a Great Evaluation Plan

Posted on July 17, 2018 in Webinars

Presenter(s): Lori Wingate, Michael Lesiecki
Date(s): August 22, 2018
Time: 1:00-2:00 p.m. Eastern

A strong evaluation plan will give your proposal a competitive edge. In this webinar, we’ll explain the essential elements of an effective evaluation plan and show you how to incorporate them into a proposal for the National Science Foundation’s Advanced Technological Education program. We’ll also provide guidance on how to budget for an evaluation, locate a qualified evaluator, and use evaluative evidence to describe the results from prior NSF support (required if you’ve had previous NSF funding). Participants will receive an updated Evaluation Planning Checklist for ATE Proposals and other resources to help prepare strong evaluation plans.



Blog: Evaluation Feedback Is a Gift

Posted on July 3, 2018 in Blog

Chemistry Faculty, Anoka-Ramsey Community College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Christopher Lutz, chemistry faculty at Anoka-Ramsey Community College. When our project was first awarded, I was a first-time National Science Foundation (NSF) principal investigator. I understood that external evaluation was required for grants, but I saw it as an administrative hurdle. I viewed evaluation as proof for the NSF that we did the project and as a metric for outcomes. While both of these aspects are important, I learned that evaluation is also an opportunity to monitor and improve your process and project. Working with our excellent external evaluators, we built a stronger program through our grant project. You can too, if you are open to evaluation feedback.

Our evaluation team was composed of an excellent evaluator and a technical expert. I started working with both about halfway through the proposal development process (a few months before submission) to ensure they could contribute to the project. I recommend contacting evaluators during the initial stages of proposal development and checking in several times before submission. This gives your evaluators adequate time to develop a quality evaluation plan and gives you time to understand how to incorporate their advice. Our funded project yielded great successes, but we could have saved time and achieved more if we had involved our evaluators earlier in the process.

After receiving funding, we convened grant personnel and evaluators for a face-to-face meeting to avoid wasted effort at the project start. Meeting in person allowed us to quickly collaborate on a deep level. For example, our project evaluator made real-time adjustments to the evaluation plan as our academic team and technical evaluator planned our project videos and training tools. Include evaluator travel funds in your budget, or consider selecting an evaluator who is nearby. We did not designate travel funds for our Kansas-based evaluator, but his ties to Minnesota and his appreciation of face-to-face collaboration led him to use some of his evaluation salary to travel and meet with our team.

Here are three ways we used evaluation feedback to strengthen our project:

Example 1: The first-year evaluation report showed a perceived deficiency in the project’s provision of hands-on experience with MALDI-MS instrumentation. In response, we had students make small quantities of liquid solution instead of giving pre-mixed solutions, and let them analyze more lab samples. This change required minimal time but led students to regard the project’s hands-on nature as a strength in the second-year evaluation.

Example 2: Another area for improvement was students’ lack of confidence in analyzing data. In response to this feedback, project staff created Excel data analysis tools and a new training activity in which students practice with literature data before analyzing their own. The subsequent year’s evaluation report indicated increased student confidence.

Example 3: Input from our technical evaluator allowed us to create videos that have been used in academic institutions in at least three US states, the UK’s Open University system, and Iceland.

Provided here are some overall tips:

  1. Work with your evaluator(s) early in the proposal process to avoid wasted effort.
  2. Build in at least one face-to-face meeting with your evaluator(s).

  3. Review evaluation data and reports with the goal of improving your project in the next year.
  4. Consider external evaluators as critical friends who are there to help improve your project.

Following these tips will help move your project forward and help you have a greater impact for all.

Template: ATE Proposal Evaluation Plan

Posted on July 13, 2017 in Resources

This template is for use in preparing the evaluation plan sections of proposals to the National Science Foundation’s Advanced Technological Education (ATE) program. It is based on the ATE Evaluation Planning Checklist, also developed by EvaluATE, and is aligned with the evaluation guidance included in the 2017 ATE Program Solicitation. All proposers should read the solicitation in full.

File: Click Here
Type: Worksheet
Category: Resources
Author(s): Lori Wingate

Resource: Finding and Selecting an Evaluator for Advanced Technological Education (ATE) Proposals

Posted on July 13, 2017 in Resources

All ATE proposals are required to request “funds to support an evaluator independent of the project.” Ideally, this external evaluator should be identified in the project proposal. The information in this guide is for individuals who are able to select and work with an external evaluator at the proposal stage. However, some institutions prohibit selecting an evaluator on a noncompetitive basis in advance of an award being made. Advice for individuals in that situation is provided in an EvaluATE blog and newsletter article.

This guide includes advice on how to locate and select an external evaluator. It is not intended as a guide for developing an evaluation plan or contracting with an evaluator.

File: Click Here
Type: Doc
Category: Resources
Author(s): Lori Wingate

Checklist: Evaluation Planning for NSF-ATE Proposals

Posted on July 1, 2017 in Resources

Updated July 2017!

This checklist is intended to assist grant writers, project leaders, and evaluators as they develop evaluation plans for proposals to the National Science Foundation’s Advanced Technological Education (ATE) program. It is organized around the components of an NSF proposal (see the NSF Grant Proposal Guide), with an emphasis on the evaluation elements that are needed in several locations throughout a grant proposal. This document is not intended to serve as a comprehensive checklist for preparing an ATE proposal; rather, it provides guidance on the aspects of a proposal that pertain to evaluation. All proposers should carefully read the 2017 ATE Program Solicitation.

File: Click Here
Type: Checklist
Category: Proposal Development
Author(s): Lori Wingate

Blog: Three Tips for a Strong NSF Proposal Evaluation Plan

Posted on August 17, 2016 in Blog

Principal Research Scientist, Education Development Center, Inc.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Leslie Goodyear and I’m an evaluator who also served as a program officer for three years at the National Science Foundation in the Division of Research on Learning, which is in the Education and Human Resources Directorate. While I was there, I oversaw evaluation activities in the Division and reviewed many, many evaluation proposals and grant proposals with evaluation sections.

In May 2016, I had the pleasure of participating in the webinar “Meeting Requirements, Exceeding Expectations: Understanding the Role of Evaluation in Federal Grants.” Hosted by Lori Wingate at EvaluATE and Ann Beheler at the Centers Collaborative for Technical Assistance, this webinar covered topics such as evaluation fundamentals; evaluation requirements and expectations; and evaluation staffing, budgeting, and utilization.

On the webinar, I shared my perspective on the role of evaluation at NSF, strengths and weaknesses of evaluation plans in proposals, and how reviewers assess Results from Prior NSF Support sections of proposals, among other topics. In this blog, I’ll give a brief overview of some important takeaways from the webinar.

First, if you’re submitting a proposal to an education or outreach program, you’ll likely need to include some form of project evaluation. Be sure to read the program solicitation carefully to learn the specific requirements for that program. There are no agency-wide evaluation requirements; instead, they are specified in each solicitation. Lori had a great suggestion on the webinar: search the solicitation for “eval” to make sure you find all the evaluation-related details.

Second, you’ll want to make sure that your evaluation plan is tailored to your proposed activities and outcomes. NSF reviewers and program officers can smell a “cookie cutter” evaluation plan, so make sure that you’ve talked with your evaluator while developing your proposal and that they’ve had the chance to read the goals and objectives of your proposed work before drafting the plan. You want the plan to be incorporated into the proposal so that it appears seamless.

Third, indicators of a strong evaluation plan include carefully crafted, relevant overall evaluation questions, a thoughtful project logic model, a detailed data collection plan that is coordinated with project activities, and a plan for reporting and dissemination of findings. You’ll also want to include a bio for your evaluator so that the reviewers know who’s on your team and what makes them uniquely qualified to carry out the evaluation of your project.

Additions that can make your plan “pop” include:

  • A table that maps out the evaluation questions to the data collection plans. This can save space by conveying lots of information in a table instead of in narrative.
  • Combining the evaluation and project timelines so that the reviewers can see how the evaluation will be coordinated with the project and offer timely feedback.

Some programs allow for using the Supplemental Documents section for additional evaluation information. Remember that reviewers are not required to read these supplemental docs, so be sure that the important information is still in the 15-page proposal.

For the Results of Prior NSF Support section, be brief and outcome-focused. Use this space to describe what resulted from the prior work, not what you did. And be clear about how that work informs the proposed work, for example by explaining how those outcomes set up the questions you’re pursuing in this proposal.

Newsletter: Revisiting Intellectual Merit and Broader Impacts

Posted on January 1, 2016 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

If you have ever written a proposal to the National Science Foundation (NSF) or participated in a proposal review panel for NSF, you probably instantly recognize the terms Intellectual Merit and Broader Impacts as NSF’s merit review criteria. Proposals are rated and funding decisions are made based on how well they address these criteria. Therefore, proposers must describe the potential of their proposed work to advance knowledge and understanding (Intellectual Merit) and benefit society (Broader Impacts).

Like cramming for an exam and then forgetting 90 percent of what you memorized, it’s all too easy for principal investigators to lose sight of Intellectual Merit and Broader Impacts after proposal submission. But there are two important reasons to maintain focus on Intellectual Merit and Broader Impacts after an award is made and throughout project implementation.

First, the goals and activities expressed in a proposal are commitments about how a particular project will advance knowledge (Intellectual Merit) and bring tangible benefits to individuals, institutions, communities, and/or our nation (Broader Impacts). Simply put, PIs have an ethical obligation to follow through on these commitments to the best of their abilities.

Second, when funded PIs seek subsequent grants from NSF, they must describe the results of their prior NSF funding in terms of Intellectual Merit and Broader Impacts. In other words, proposers must explain how they used their NSF funding to actually advance knowledge and understanding and benefit society. PIs who have evidence of their accomplishments in these areas and can convey it succinctly will be well positioned to seek additional funding. To ensure evidence of both Intellectual Merit and Broader Impacts is being captured, PIs should revisit their project evaluation plans with their evaluators, crosschecking the proposal’s claims about potential Intellectual Merit and Broader Impacts against the evaluation questions and data collection plan.

Last October, I conducted a workshop on this topic at the ATE Principal Investigators Conference with my colleague Kirk Knestis, an evaluator from Hezel Associates. Dr. Celeste Carter, ATE program co-lead, spoke about how to frame results of prior NSF support in proposals. She noted a common misstep: proposers address results from prior support by simply reiterating what they said they were going to do in their funded proposals, rather than describing the actual outcomes of the grant. Project summaries (one-page descriptions, required in all NSF proposals, that address a proposed project’s Intellectual Merit and Broader Impacts) are necessarily written in a prospective, future-oriented manner because the work has not yet begun. In contrast, the Results of Prior NSF Support section concerns completed work; it is written in the past tense and should include evidence of accomplishments. Describing achievements and presenting evidence of their quality and impact shows reviewers that the proposer is a responsible steward of federal funds, can deliver on promises, and is building on prior success.

Take time now, well before it is time to submit a new proposal or a Project Outcomes Report, to make sure you haven’t lost sight of the Intellectual Merit and Broader Impacts aspects of your grant and how you promised to contribute to these national priorities.

Doc: HI-TEC 2015 Handout | Evaluation: Don’t Submit Your ATE Proposal Without It

Posted on November 30, 2015 in Conferences

A strong evaluation plan that is well integrated into your grant proposal will strengthen your submission and maybe even give you a competitive edge. In this session we’ll provide insights on ways to enhance your proposal and avoid common pitfalls with regard to evaluation. We’ll walk through EvaluATE’s Evaluation Planning Checklist for ATE Proposals, which provides detailed guidance on how to address evaluation throughout a proposal—from the project summary to the budget justification.

File: Click Here
Type: Doc

Author(s): Corey Smith, Emma Perk