Archive: proposals

Checklist: ATE Evaluation Plan

Posted on August 21, 2018 in Resources

Updated August 2018!

This checklist provides information on what should be included in evaluation plans for proposals to the National Science Foundation's (NSF) Advanced Technological Education (ATE) program. Grant seekers should carefully read the most recent ATE program solicitation for details about the program and proposal submission requirements.

ATE Evaluation Plan Checklist Field Test

EvaluATE invites individuals who are developing proposals for the National Science Foundation’s Advanced Technological Education (ATE) program to field test our updated ATE Evaluation Plan Checklist and provide feedback for improvement.

The field test version of the checklist is available below.

How to participate in the field test:
(1) Use the checklist while developing the evaluation plan for an ATE proposal.
(2) After you have completed your proposal, complete the brief feedback form.

After a few questions about the context of your work, this form will prompt you to answer four open-ended questions about your experience with the checklist:
• What was especially helpful about this checklist?
• What did you find confusing or especially difficult to apply?
• What would you add, change, or remove?
• If using this checklist affected the contents of your evaluation plan or your process for developing it, please describe how it influenced you.

Thank you for your assistance!

File: Click Here
Type: Checklist
Category: Proposal Development
Author(s): Lori Wingate

Blog: Utilizing Your Institutional Research Office Resources When Writing a Grant Application

Posted on March 20, 2018 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Deborah Douma
Dean, Grants and Federal Programs, Pensacola State College
Michael Johnston
Director of Institutional Research, Pensacola State College

There are a number of guiding questions that must be answered to develop a successful grant project evaluation plan. The answers to these questions also provide guidance to demonstrate need and develop ambitious, yet attainable, objectives. Data does not exist in a vacuum and can be evaluated and transformed into insight only if it is contextualized with associated activities. This is best accomplished in collaboration with the Institutional Research (IR) office. The Association for Institutional Research’s aspirational statement “highlights the need for IR to serve a broader range of decision makers.”

We emphasize the critical need to incorporate fundamental knowledge of experimental and quasi-experimental design at the beginning of any grant project. In essence, grant projects are experiments; they are simply not performed in a laboratory. Any experiment is designed to introduce new conditions and measure their effects: the independent variable is the grant project, and the dependent variable is the success of the target population (students, faculty). The ability to properly measure and replicate this scientific process must be established during project planning, and the IR office can be instrumental in the design of your evaluation.
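To make this framing concrete, here is a minimal sketch of a quasi-experimental comparison: retention for a hypothetical group of grant-project participants (the condition the project introduces) versus a comparison group, checked with a simple pooled two-proportion z-test. All counts and names below are invented for illustration; an actual design and analysis would be worked out with your IR office.

import math

def two_proportion_z(retained_a, n_a, retained_b, n_b):
    # Compare retention rates between group A (project participants)
    # and group B (a comparison group) using a pooled two-proportion z-test.
    p_a = retained_a / n_a
    p_b = retained_b / n_b
    p_pool = (retained_a + retained_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return p_a, p_b, (p_a - p_b) / se

# Hypothetical counts: 180 of 220 participants retained vs. 1,150 of 1,600 non-participants.
p_treat, p_comp, z = two_proportion_z(180, 220, 1150, 1600)
print(f"Participant retention: {p_treat:.1%}")   # 81.8%
print(f"Comparison retention:  {p_comp:.1%}")    # 71.9%
print(f"z statistic: {z:.2f}")                   # |z| > 1.96 suggests a difference at the 5% level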

Responding to a program solicitation (or RFP, RFA, etc.) provides the opportunity to establish the need for the project, measurable outcomes, and an appropriate plan for evaluation that can win over the hearts and minds of reviewers and lead to a successful grant award. Institutional researchers work with the grant office not only to measure outcomes but also to investigate and provide potential opportunities for improvement. IR staff act as data scientists and statisticians while working with grants and become intimately acquainted with the data, the collection process, the relationships between variables, and the science being investigated. While the terms statistician and data scientist are often used synonymously, data scientists do more than test hypotheses and develop forecasting models; they also identify how variables not being studied may affect outcomes. This allows IR staff to see beyond the questions that are being asked and not only contribute to the development of the results but also identify unexpected structures in the data. Finding alternative structures may lead to further investigation in other areas and more opportunities for additional grants.

If a project’s objective is to effect positive change in student retention, it is necessary to know the starting point before any grant-funded interventions are introduced. IR can provide descriptive statistics on the student body and target population before the intervention. This historical data is used not only for trend analysis but also for validation (correcting errors in the data). Validation can be as simple as looking for differences between comparison groups and confirming that potential differences are not due to error. IR can also assist with the predictive analytics necessary to establish appropriate benchmarks for measurable objectives. For example, predicting that an intervention will increase retention rates by 10-20% when a 1-2% increase would be more realistic could lead to a proposal being rejected or set the project up for failure. Your IR office can also help ensure that the appropriate quantitative statistical methods are used to analyze the data.
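As a simple sketch of what that baseline work can look like, the snippet below computes an average historical retention rate from hypothetical cohort counts and derives a modest benchmark from it. The cohort years, counts, and the 2-percentage-point target are assumptions for illustration, not figures from any real institution.

# Hypothetical fall-to-fall retention counts: cohort year -> (enrolled, retained).
historical = {
    2014: (1480, 964),
    2015: (1512, 1002),
    2016: (1540, 1017),
    2017: (1575, 1052),
}

rates = {year: retained / enrolled
         for year, (enrolled, retained) in historical.items()}
baseline = sum(rates.values()) / len(rates)   # average historical retention rate
trend = rates[2017] - rates[2014]             # simple change over the period

print(f"Baseline retention rate: {baseline:.1%}")
print(f"Change from 2014 to 2017: {trend:+.1%}")

# A realistic measurable objective builds modestly on the baseline,
# e.g., a 2-percentage-point gain rather than a 10-20 point leap.
target = baseline + 0.02
print(f"Proposed benchmark: {target:.1%}")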

Tip: Involve your IR office from the beginning, during project planning. This will contribute greatly to submitting a competitive application, the evaluation of which provides the guidance necessary for a successful project.

Webinar: Evaluation: All the Funded ATE Proposals Are Doing It

Posted on August 10, 2017 in Webinars

Presenter(s): Lori Wingate, Mike Lesiecki
Date(s): August 16, 2017
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/7ytTEGt_FoM

Give your proposal a competitive edge with a strong evaluation plan. The National Science Foundation has issued a new solicitation for its Advanced Technological Education (ATE) program. It includes major changes to the guidelines for ATE evaluation plans. Attend this webinar to learn the key elements of a winning evaluation plan and strategies for demonstrating to reviewers that evaluation is an integral part of your project, not an afterthought. In addition, we’ll provide specific guidance on how to budget for an evaluation, locate a qualified evaluator, and describe results from prior NSF support with supporting evaluative evidence. You will also receive tools to help you prepare strong evaluation plans.

Resources:
Slides
ATE Proposal Evaluation Plan Template
Data Collection Planning Matrix
Evaluator Biographical Sketch Template for National Science Foundation (NSF) Proposals
Evaluation Planning Checklist for ATE Proposals
Evaluation Questions Checklist for Program Evaluation
Guide to Finding and Selecting an Evaluator
Logic Models: Getting them Right and Using them Well [webinar]
Logic Model Template for ATE Projects and Centers
NSF Prior Support Checklist
Small-Scale Evaluation Webinar

Resource: Finding and Selecting an Evaluator for Advanced Technological Education (ATE) Proposals

Posted on July 13, 2017 in Resources

All ATE proposals are required to request “funds to support an evaluator independent of the project.” Ideally, this external evaluator should be identified in the project proposal. The information in this guide is for individuals who are able to select and work with an external evaluator at the proposal stage. However, some institutions prohibit selecting an evaluator on a noncompetitive basis in advance of an award being made. Advice for individuals in that situation is provided in an EvaluATE blog and newsletter article.

This guide includes advice on how to locate and select an external evaluator. It is not intended as a guide for developing an evaluation plan or contracting with an evaluator.

File: Click Here
Type: Doc
Category: Resources
Author(s): Lori Wingate

Newsletter: An Evaluative Approach to Proposal Development

Posted on July 1, 2015 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

A student came into my office to ask me a question. Soon after she launched into her query, I stopped her and said I wasn’t the right person to help because she was asking about a statistical method that I wasn’t up-to-date on. She said, “Oh, you’re a qualitative person?” And I answered, “Not really.” She left looking puzzled. The exchange left me pondering the vexing question, “What am I?” (Now imagine these words echoing off my office walls in a spooky voice for a couple of minutes.) After a few uncomfortable moments, I proudly concluded, “I am a critical thinker!”

Yes, evaluators are trained specialists with an arsenal of tools, strategies, and approaches for data collection, analysis, and reporting. But critical thinking—evaluative thinking—is really what drives good evaluation. In fact, the very definition of critical thinking as “the mental process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and evaluating information to reach an answer or conclusion”1 describes the evaluation process to a T. Applying your critical, evaluative thinking skills in developing your funding proposal will go a long way toward ensuring your submission is competitive.

Make sure all the pieces of your proposal fit together like a snug puzzle. Your proposal needs both a clear statement of the need for your project and a description of the intended outcomes—make sure these match up. If you struggle with the outcome measurement aspect of your evaluation plan, go back to the rationale for your project. If you can observe a need or problem in your context, you should be able to observe the improvements as well. Show linkages between the need you intend to address, your activities and products, and expected outcomes.

Be logical. Develop a logic model to portray how your project will translate its resources into outcomes that address a need in your context. Sometimes simply putting things in a graphic format can reveal shortcomings in a project’s logical foundation (like when important outcomes can’t be tracked back to activities). The narrative description of your project’s goals, objectives, deliverables, and activities should match the logic model.

Be skeptical. Project planning and logic model development typically happen from an optimistic point of view. (“If we build it, they will come.”) While crafting your work plan, step back from time to time and ask yourself and your colleagues, what obstacles might we face? What could really mess things up? Where are the opportunities for failure? And perhaps most importantly, is this really the best solution to the need we’re trying to address? Identify your plan’s weaknesses and build in safeguards against those threats. I’m all for an optimistic outlook, but proposal reviewers won’t be wearing rose-colored glasses when they critique your proposal and compare it with others written by smart people with great ideas, just like you. Be your own worst critic and your proposal will be stronger for it.

Evaluative thinking doesn’t replace specialized training in evaluation. But even the best evaluator and most rigorous evaluation plan cannot compensate for a disheveled, poorly crafted project plan. Give your proposal a competitive edge by applying your critical thinking skills and infusing an evaluative perspective throughout your project description.

1 dictionary.com

Newsletter: Tips for Writing the Results of Prior Support Section for NSF Proposals

Posted on July 1, 2014 in Newsletter

Where are the hidden opportunities to positively influence proposal reviewers? Surprisingly, this is often the Results from Prior Support section. Many proposers do not go beyond simply recounting what they did in prior grants. They miss the chance to “wow” the reader with impact examples, such as Nano-Link’s Nano-Infusion Project, which has resulted in the integration of nanoscale modules into multiple grade levels of K-14 across the nation. Teachers are empowered with tools to effectively teach nanoscale concepts, as evidenced by their survey feedback. New leaders are emerging, and enthusiasm for science can be seen in the videos available on the website. Because of NSF funding, additional synergistic projects allowed for scaling activities and growing a national presence.

Any PI having received NSF support in the past 5 years must include a summary of the results (up to 5 pages) and how those results support the current proposal. Because pages in this subsection count toward the 15-page total, many people worry that they are using too much space to describe what has been done. These pages, however, can give the proposal punch and energy through metrics, outcomes, and stories. This is the time to quote the evaluator’s comments and tie the results to the evaluation plan. The external view provides valuable decision-making information to the reviewers. This discussion of prior support helps reviewers evaluate the proposal, allows them to make comments, and provides evidence that the new activities will add value.

According to the NSF Grant Proposal Guide, updated in 2013, the subsection must include: award number, amount, and period of support; title of the project; a summary of results described under the separate headings of Intellectual Merit and Broader Impacts; publications acknowledging NSF support; evidence of research products and their availability; and the relation of the completed work to the proposed work.

The bottom line is that the beginning of the project description sets the stage for the entire proposal. Data and examples that demonstrate intellectual merit and broader impacts clearly define what has been done, leaving room for a clear description of the new directions that will require funding.

 

Newsletter: What makes a good evaluation section of a proposal?

Posted on July 1, 2013 in Newsletter

Principal Research Scientist, Education Development Center, Inc.

As a program officer, I read hundreds of proposals for different NSF programs and I saw many different approaches to writing a proposal evaluation section. From my vantage point, here are a few tips that may help to ensure that your evaluation section shines.

First, make sure to involve your evaluator in writing the proposal’s evaluation section. Program officers and reviewers can tell when an evaluation section was written without consulting an evaluator, and it makes them think you aren’t integrating evaluation into your project planning.

Don’t just call an evaluator a couple of weeks before the proposal is due! A strong evaluation section comes from a thoughtful, robust, tailored evaluation plan. This takes collaboration with an evaluator! Get them on board early and talk with them often as you develop your proposal. They can help you develop measurable objectives, add insight to proposal organization, and, of course, work with you to develop an appropriate evaluation plan.

Reviewers and program officers look to see that the evaluator understands the project. This can be done using a logic model or in a paragraph that justifies the evaluation design, based on the proposed project design. The evaluation section should also connect the project objectives and targeted outcomes to evaluation questions, data collection methods and analysis, and dissemination plans. This can be done in a matrix format, which helps the reader to see clearly which data will answer which evaluation question and how these are connected to the objectives of the project.

A strong evaluation plan shows that the evaluator and the project team are in sync and working together, applies a rigorous design and reasonable data collection methods, and answers important questions that will help demonstrate the value of the project and surface areas for improvement.