A student came into my office to ask me a question. Soon after she launched into her query, I stopped her and said I wasn’t the right person to help because she was asking about a statistical method that I wasn’t up-to-date on. She said, “Oh, you’re a qualitative person?” And I answered, “Not really.” She left looking puzzled. The exchange left me pondering the vexing question, “What am I?” (Now imagine these words echoing off my office walls in a spooky voice for a couple of minutes.) After a few uncomfortable moments, I proudly concluded, “I am a critical thinker!”
Yes, evaluators are trained specialists with an arsenal of tools, strategies, and approaches for data collection, analysis, and reporting. But critical thinking—evaluative thinking—is really what drives good evaluation. In fact, the very definition of critical thinking as “the mental process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and evaluating information to reach an answer or conclusion”1 describes the evaluation process to a T. Applying your critical, evaluative thinking skills in developing your funding proposal will go a long way toward ensuring your submission is competitive.
Make sure all the pieces of your proposal fit together like a snug puzzle. Your proposal needs both a clear statement of the need for your project and a description of the intended outcomes—make sure these match up. If you struggle with the outcome measurement aspect of your evaluation plan, go back to the rationale for your project. If you can observe a need or problem in your context, you should be able to observe the improvements as well. Show linkages between the need you intend to address, your activities and products, and expected outcomes.
Be logical. Develop a logic model to portray how your project will translate its resources into outcomes that address a need in your context. Sometimes simply putting things in a graphic format can reveal shortcomings in a project’s logical foundation (as when important outcomes can’t be traced back to activities). The narrative description of your project’s goals, objectives, deliverables, and activities should match the logic model.
Be skeptical. Project planning and logic model development typically happen from an optimistic point of view. (“If we build it, they will come.”) While crafting your work plan, step back from time to time and ask yourself and your colleagues, what obstacles might we face? What could really mess things up? Where are the opportunities for failure? And perhaps most importantly, is this really the best solution to the need we’re trying to address? Identify your plan’s weaknesses and build in safeguards against those threats. I’m all for an optimistic outlook, but proposal reviewers won’t be wearing rose-colored glasses when they critique your proposal and compare it with others written by smart people with great ideas, just like you. Be your own worst critic and your proposal will be stronger for it.
Evaluative thinking doesn’t replace specialized training in evaluation. But even the best evaluator and most rigorous evaluation plan cannot compensate for a disheveled, poorly crafted project plan. Give your proposal a competitive edge by applying your critical thinking skills and infusing an evaluative perspective throughout your project description.
Where are the hidden opportunities to positively influence proposal reviewers? Surprisingly, this is often the Results from Prior Support section. Many proposers do not go beyond simply recounting what they did in prior grants. They miss the chance to “wow” the reader with examples of impact, such as Nano-Link’s Nano-Infusion Project, which has resulted in the integration of nanoscale modules into multiple K-14 grade levels across the nation. Teachers are empowered with tools to effectively teach nanoscale concepts, as evidenced by their survey feedback. New leaders are emerging, and enthusiasm for science can be seen in the videos available on the website. Because of NSF funding, additional synergistic projects allowed for scaling activities and growing a national presence.
Any PI who has received NSF support in the past 5 years must include a summary of the results (up to 5 pages) and explain how those results support the current proposal. Because pages in this subsection count toward the 15-page total, many people worry that they are using too much space to describe what has been done. These pages, however, can add punch and energy to the proposal with metrics, outcomes, and stories. This is the time to quote the evaluator’s comments and tie the results to the evaluation plan. The external view provides valuable decision-making information to the reviewers. This discussion of prior support helps reviewers evaluate the proposal, allows them to make comments, and provides evidence that the new activities will add value. According to the NSF Grant Proposal Guide, updated in 2013, the subsection must include: award number, amount, and period of support; title of the project; summary of results described under the distinct separate headings of Intellectual Merit and Broader Impact; publications acknowledging NSF support; evidence of research products and their availability; and the relation of completed work to proposed work.
The bottom line is that the beginning of the project description sets the stage for the entire proposal. Data and examples that demonstrate intellectual merit and broader impact clearly define what has been done, leaving room for a clear description of the new directions that will require funding.
As a program officer, I read hundreds of proposals for different NSF programs and I saw many different approaches to writing a proposal evaluation section. From my vantage point, here are a few tips that may help to ensure that your evaluation section shines.
First, make sure to involve your evaluator in writing the proposal’s evaluation section. Program officers and reviewers can tell when an evaluation section was written without the consultation of an evaluator. This makes them think you aren’t integrating evaluation into your project planning.
Don’t just call an evaluator a couple of weeks before the proposal is due! A strong evaluation section comes from a thoughtful, robust, tailored evaluation plan. This takes collaboration with an evaluator! Get them on board early and talk with them often as you develop your proposal. They can help you develop measurable objectives, add insight to proposal organization, and, of course, work with you to develop an appropriate evaluation plan.
Reviewers and program officers look to see that the evaluator understands the project. This can be done using a logic model or in a paragraph that justifies the evaluation design, based on the proposed project design. The evaluation section should also connect the project objectives and targeted outcomes to evaluation questions, data collection methods and analysis, and dissemination plans. This can be done in a matrix format, which helps the reader to see clearly which data will answer which evaluation question and how these are connected to the objectives of the project.
A strong evaluation plan shows that the evaluator and the project team are in sync and working together, applies a rigorous design and reasonable data collection methods, and answers important questions that will help to demonstrate the value of the project and surface areas for improvement.