Archive: proposals

Blog: What Grant Writers Need to Know About Evaluation

Posted on September 4, 2019 in Blog

District Director of Grants and Educational Services, Coast Community College District

Creative Commons License This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Fellow grant writers: Do you ever stop and ask yourselves, “Why do we write grants?” Do you actually enjoy herding cats, pulling teeth, and the inevitable stress of a looming proposal deadline? I hope not. Then what is the driver? We shouldn’t write a grant just to get funded or to earn prestige for our colleges. Those benefits may be motivators, but we should write to get funding and support to positively impact our students, faculty, and the institutions involved. And we should be able to evaluate those results in useful and meaningful ways so that we can identify how to improve and demonstrate the project’s value.

Evaluation isn’t just about satisfying a promise or meeting a requirement to gather and report data. It’s about gathering meaningful data that can be utilized to determine the effectiveness of an activity and the impact of a project. When developing a grant proposal, one often starts with the goals, then thinks of the objectives, and then plans the activities, hoping that in the end, the evaluation data will prove that the goals were met and the project was a success. That requires a lot of “hope.”

I find it more promising to begin with the end in mind from an evaluation perspective: What is the positive change that we hope to achieve and how will it be evidenced? What does success mean? How can we tell if we have been successful? When will we know? And how can we get participants to provide the information we will need for the evaluation?

The role of a grant writer is too often like that of a quilt maker, delegating sections of the proposal’s development to different members of the institution, with the evaluation section often outsourced to a third-party evaluator. Each party submits their content, then the grant writer scrambles to patch it all together.

Instead of quilt making, the process should be more like the construction of a tapestry. Rather than chunks of material stitched together in independent sections, each thread is carefully woven in a thoughtful way to create a larger, more cohesive overall design. It is important that the entire proposal development team works together to fully understand each aspect of the proposal. In this way, they can collaboratively develop a coherent plan to obtain the desired outcomes. The project work plan, budget, and evaluation components should not be designed or executed independently—they are developed simultaneously and depend on one another. Thus, they should tie together in a thoughtful manner.

I encourage you to think like an evaluator as you develop your proposals. Prepare yourself and challenge your team to be able to justify the value of each goal, objective, and activity and be able to explain how that value will be measured. If at all possible, involve your external or internal evaluator early on in proposal development. The better the evaluator understands your overall concept and activities, the better they can tailor the evaluation plan to derive the desired results. A strong work plan and evaluation plan will help proposal reviewers connect the dots and see the potential of your proposal. These elements will also serve as road maps to success for your project implementation team.

 

For questions or further information please reach out to the author, Lara Smith.

Checklist: Evaluation Plan for ATE Proposals

Posted on July 19, 2019 in Resources

Updated July 2019!

This checklist provides information on what should be included in evaluation plans for proposals to the National Science Foundation’s (NSF) Advanced Technological Education (ATE) program. Grant seekers should carefully read the most recent ATE program solicitation for details about the program and proposal submission requirements.

File: Click Here
Type: Checklist
Category: Proposal Development
Author(s): Lori Wingate

Webinar: Evaluation: The Secret Sauce in Your ATE Proposal

Posted on July 3, 2019 in Webinars

Presenter(s): Emma Perk, Lyssa Wilson Becho, Michael Lesiecki
Date(s): August 21, 2019
Time: 1:00pm-2:30pm Eastern
Recording: https://youtu.be/XZCfd7m6eNA

Planning to submit a proposal to the National Science Foundation’s Advanced Technological Education (ATE) program? Then this is a webinar you don’t want to miss! We will cover the essential elements of an effective evaluation plan and show you how to integrate them into an ATE proposal. We will also provide guidance on how to budget for an evaluation, locate a qualified evaluator, and use evaluative evidence to describe the results from prior NSF funding. Participants will receive the Evaluation Planning Checklist for ATE Proposals and other resources to help integrate evaluation into their ATE proposals.

An extended 30-minute Question and Answer session will be included at the end of this webinar. So, come prepared with your questions!

 

Resources:
Slides
External Evaluator Visual
External Evaluator Timeline
ATE Evaluation Plan Checklist
ATE Evaluation Plan Template
Guide to Finding and Selecting an ATE Evaluator
ATE Evaluator Map
Evaluation Data Matrix
NSF Evaluator Biosketch Template
NSF ATE Program Solicitation
Question and Answer Panel Recording

Blog: An Evaluative Approach to Proposal Development*

Posted on June 27, 2019 in Blog

Director of Research, The Evaluation Center at Western Michigan University

Creative Commons License This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

A student came into my office to ask me a question. Soon after she launched into her query, I stopped her and said I wasn’t the right person to help because she was asking about a statistical method that I wasn’t up-to-date on. She said, “Oh, you’re a qualitative person?” And I answered, “Not really.” She left looking puzzled. The exchange left me pondering the vexing question, “What am I?” (Now imagine these words echoing off my office walls in a spooky voice for a couple of minutes.) After a few uncomfortable moments, I proudly concluded, “I am a critical thinker!”  

Yes, evaluators are trained specialists with an arsenal of tools, strategies, and approaches for data collection, analysis, and reporting. But critical thinking—evaluative thinking—is really what drives good evaluation. In fact, the very definition of critical thinking—“the mental process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and evaluating information to reach an answer or conclusion”2—describes the evaluation process to a T. Applying your critical, evaluative thinking skills in developing your funding proposal will go a long way toward ensuring your submission is competitive.

Make sure all the pieces of your proposal fit together like a snug puzzle. Your proposal needs both a clear statement of the need for your project and a description of the intended outcomes—make sure these match up. If you struggle with the outcome measurement aspect of your evaluation plan, go back to the rationale for your project. If you can observe a need or problem in your context, you should be able to observe the improvements as well.

Be logical. Develop a logic model to portray how your project will translate its resources into outcomes that address a need in your context. Sometimes simply putting things in a graphic format can reveal shortcomings in a project’s logical foundation (like when important outcomes can’t be tracked back to planned activities). The narrative description of your project’s goals, objectives, deliverables, and activities should match the logic model.

Be skeptical. Project planning and logic model development typically happen from an optimistic point of view. (“If we build it, they will come.”) When creating your work plan, step back from time to time and ask yourself and your colleagues, What obstacles might we face? What could really mess things up? Where are the opportunities for failure? And perhaps most important, ask, Is this really the best solution to the need we’re trying to address? Identify your plan’s weaknesses and build in safeguards against those threats. I’m all for an optimistic outlook, but proposal reviewers won’t be wearing rose-colored glasses when they critique your proposal and compare it with others written by smart people with great ideas, just like you. Be your own worst critic and your proposal will be stronger for it.

Evaluative thinking doesn’t replace specialized training in evaluation. But even the best evaluator and most rigorous evaluation plan cannot compensate for a disheveled, poorly crafted project plan. Give your proposal a competitive edge by applying your critical thinking skills and infusing an evaluative perspective throughout your project description.

* This blog is a reprint of an article from an EvaluATE newsletter published in summer 2015.

2 dictionary.com

Blog: Utilizing Your Institutional Research Office Resources When Writing a Grant Application

Posted on March 20, 2018 in Blog
Creative Commons License This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Deborah Douma
Dean, Grants and Federal Programs, Pensacola State College
Michael Johnston
Director of Institutional Research, Pensacola State College

There are a number of guiding questions that must be answered to develop a successful grant project evaluation plan. The answers to these questions also provide guidance to demonstrate need and develop ambitious, yet attainable, objectives. Data does not exist in a vacuum and can be evaluated and transformed into insight only if it is contextualized with associated activities. This is best accomplished in collaboration with the Institutional Research (IR) office. The Association for Institutional Research’s aspirational statement “highlights the need for IR to serve a broader range of decision makers.”

We emphasize the critical need to incorporate fundamental knowledge of experimental and quasi-experimental design at the beginning of any grant project. In essence, grant projects are experiments—just not necessarily performed in a laboratory. Any experiment is designed to introduce new conditions and measure their effects: the independent variable is the grant project, and the dependent variable is the success of the target population (students, faculty). The ability to properly measure and replicate this scientific process must be established during project planning, and the IR office can be instrumental in the design of your evaluation.

Responding to a program solicitation (or RFP, RFA, etc.) provides the opportunity to establish the need for the project, measurable outcomes, and an appropriate plan for evaluation that can win over the hearts and minds of reviewers and lead to a successful grant award. Institutional researchers work with the grant office not only to measure outcomes but also to investigate and provide potential opportunities for improvement. IR staff act as data scientists and statisticians while working with grants and become intimately acquainted with the data, the collection process, the relationships between variables, and the science being investigated. While the terms statistician and data scientist are often used synonymously, data scientists do more than test hypotheses and develop forecasting models; they also identify how variables not being studied may affect outcomes. This allows IR staff to see beyond the questions that are being asked and not only contribute to the development of the results but also identify unexpected structures in the data. Finding alternative structures may lead to further investigation in other areas and more opportunities for other grants.

If a project’s objective is to effect positive change in student retention, it is necessary to know the starting point before any grant-funded interventions are introduced. IR can provide descriptive statistics on the student body and target population before the intervention. This historical data is used not only for trend analysis but also for validation—correcting errors in the data. Validation can be as simple as looking for differences between comparison groups and confirming that potential differences are not due to error. IR can also assist with the predictive analytics necessary to establish appropriate benchmarks for measurable objectives. For example, predicting that an intervention will increase retention rates by 10-20% when a 1-2% increase would be more realistic could lead to a proposal being rejected or set the project up for failure. Your IR office can also help ensure that the appropriate quantitative statistical methods are used to analyze the data.
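The benchmarking step above can be sketched in a few lines. This is a minimal illustration, not an IR methodology: the retention rates are hypothetical, and it interprets the "1-2% increase" as percentage points over a multi-year baseline, which is one common convention.

```python
# Hypothetical example: turning historical retention data into a realistic
# benchmark for a grant proposal. All figures are invented for illustration.

historical_rates = [0.62, 0.64, 0.63, 0.65, 0.64]  # fall-to-fall retention, last 5 years

# Baseline: the mean of recent historical rates.
baseline = sum(historical_rates) / len(historical_rates)

# A modest, evidence-based target (about 2 percentage points over baseline)
# versus an overly ambitious one that could set the project up for failure.
realistic_target = baseline + 0.02
optimistic_target = baseline + 0.15

print(f"Baseline retention:        {baseline:.1%}")
print(f"Realistic benchmark:       {realistic_target:.1%}")
print(f"Overly ambitious benchmark: {optimistic_target:.1%}")
```

A real IR office would go further—for example, fitting a trend line to detect whether retention is already rising or falling before attributing any change to the intervention.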

Tip: Involve your IR office from the beginning, during project planning. Doing so will contribute greatly to a competitive application and to an evaluation plan that provides the guidance necessary for a successful project.

Webinar: Evaluation: All the Funded ATE Proposals Are Doing It

Posted on August 10, 2017 in Webinars

Presenter(s): Lori Wingate, Mike Lesiecki
Date(s): August 16, 2017
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/7ytTEGt_FoM

Give your proposal a competitive edge with a strong evaluation plan. The National Science Foundation has issued a new solicitation for its Advanced Technological Education (ATE) program. It includes major changes to the guidelines for ATE evaluation plans. Attend this webinar to learn the key elements of a winning evaluation plan and strategies for demonstrating to reviewers that evaluation is an integral part of your project, not an afterthought. In addition, we’ll provide you with specific guidance on how to budget for an evaluation, locate a qualified evaluator, and describe results from prior NSF support with supporting evaluative evidence. You will receive an updated evaluation planning checklist and other tools to help prepare strong evaluation plans.

Resources:
Slides
ATE Proposal Evaluation Plan Template
Data Collection Planning Matrix
Evaluator Biographical Sketch Template for National Science Foundation (NSF) Proposals
Evaluation Planning Checklist for ATE Proposals
Evaluation Questions Checklist for Program Evaluation
Guide to Finding and Selecting an Evaluator
Logic Models: Getting them Right and Using them Well [webinar]
Logic Model Template for ATE Projects and Centers
NSF Prior Support Checklist
Small-Scale Evaluation Webinar

Resource: Finding and Selecting an Evaluator for Advanced Technological Education (ATE) Proposals

Posted on July 13, 2017 in Resources

All ATE proposals are required to request “funds to support an evaluator independent of the project.” Ideally, this external evaluator should be identified in the project proposal. The information in this guide is for individuals who are able to select and work with an external evaluator at the proposal stage. However, some institutions prohibit selecting an evaluator on a noncompetitive basis in advance of an award being made. Advice for individuals in that situation is provided in an EvaluATE blog and newsletter article.

This guide includes advice on how to locate and select an external evaluator. It is not intended as a guide for developing an evaluation plan or contracting with an evaluator.

File: Click Here
Type: Doc
Category: Resources
Author(s): Lori Wingate

Newsletter: An Evaluative Approach to Proposal Development

Posted on July 1, 2015 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

A student came into my office to ask me a question. Soon after she launched into her query, I stopped her and said I wasn’t the right person to help because she was asking about a statistical method that I wasn’t up-to-date on. She said, “Oh, you’re a qualitative person?” And I answered, “Not really.” She left looking puzzled. The exchange left me pondering the vexing question, “What am I?” (Now imagine these words echoing off my office walls in a spooky voice for a couple of minutes.) After a few uncomfortable moments, I proudly concluded, “I am a critical thinker!”

Yes, evaluators are trained specialists with an arsenal of tools, strategies, and approaches for data collection, analysis, and reporting. But critical thinking—evaluative thinking—is really what drives good evaluation. In fact, the very definition of critical thinking as “the mental process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and evaluating information to reach an answer or conclusion”1 describes the evaluation process to a T. Applying your critical, evaluative thinking skills in developing your funding proposal will go a long way toward ensuring your submission is competitive.

Make sure all the pieces of your proposal fit together like a snug puzzle. Your proposal needs both a clear statement of the need for your project and a description of the intended outcomes—make sure these match up. If you struggle with the outcome measurement aspect of your evaluation plan, go back to the rationale for your project. If you can observe a need or problem in your context, you should be able to observe the improvements as well. Show linkages between the need you intend to address, your activities and products, and expected outcomes.
Be logical. Develop a logic model to portray how your project will translate its resources into outcomes that address a need in your context. Sometimes simply putting things in a graphic format can reveal shortcomings in a project’s logical foundation (like when important outcomes can’t be tracked back to activities). The narrative description of your project’s goals, objectives, deliverables, and activities should match the logic model.

Be skeptical. Project planning and logic model development typically happen from an optimistic point of view. (“If we build it, they will come.”) While crafting your work plan, step back from time to time and ask yourself and your colleagues, what obstacles might we face? What could really mess things up? Where are the opportunities for failure? And perhaps most importantly, is this really the best solution to the need we’re trying to address? Identify your plan’s weaknesses and build in safeguards against those threats. I’m all for an optimistic outlook, but proposal reviewers won’t be wearing rose-colored glasses when they critique your proposal and compare it with others written by smart people with great ideas, just like you. Be your own worst critic and your proposal will be stronger for it.

Evaluative thinking doesn’t replace specialized training in evaluation. But even the best evaluator and most rigorous evaluation plan cannot compensate for a disheveled, poorly crafted project plan. Give your proposal a competitive edge by applying your critical thinking skills and infusing an evaluative perspective throughout your project description.

1 dictionary.com