EvaluATE - Proposal Development

Vlog: Resources to Help with Evaluation Planning for ATE Proposals

Posted on September 6, 2017 by Lori Wingate

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Evaluation is an important element of an ATE proposal. EvaluATE has developed several resources to help you develop your evaluation plan and integrate it into your ATE proposal. This video highlights a few of them—these and more can be accessed from the links below the video.

Additional Resources:

Blog: Three Tips for a Strong NSF Proposal Evaluation Plan

Posted on August 17, 2016 by Leslie Goodyear

Principal Research Scientist, Education Development Center, Inc.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Leslie Goodyear and I’m an evaluator who also served as a program officer for three years at the National Science Foundation in the Division of Research on Learning, which is in the Education and Human Resources Directorate. While I was there, I oversaw evaluation activities in the Division and reviewed many, many evaluation proposals and grant proposals with evaluation sections.

In May 2016, I had the pleasure of participating in the webinar “Meeting Requirements, Exceeding Expectations: Understanding the Role of Evaluation in Federal Grants.” Hosted by Lori Wingate at EvaluATE and Ann Beheler at the Centers Collaborative for Technical Assistance, this webinar covered topics such as evaluation fundamentals; evaluation requirements and expectations; and evaluation staffing, budgeting, and utilization.

During the webinar, I shared my perspective on the role of evaluation at NSF, strengths and weaknesses of evaluation plans in proposals, and how reviewers assess the Results from Prior NSF Support sections of proposals, among other topics. In this blog, I’ll give a brief overview of some important takeaways from the webinar.

First, if you’re submitting a proposal to an education or outreach program, you’ll likely need to include some form of project evaluation in your proposal. Be sure to read the program solicitation carefully to learn the specific requirements for that program. There are no agency-wide evaluation requirements; instead, they are specified in each solicitation. Lori had a great suggestion on the webinar: search the solicitation for “eval” to make sure you find all the evaluation-related details.

Second, you’ll want to make sure that your evaluation plan is tailored to your proposed activities and outcomes. NSF reviewers and program officers can smell a “cookie cutter” evaluation plan, so make sure that you’ve talked with your evaluator while developing your proposal and that they’ve had the chance to read the goals and objectives of your proposed work before drafting the plan. You want the plan to be incorporated into the proposal so that it appears seamless.

Third, indicators of a strong evaluation plan include carefully crafted, relevant overall evaluation questions, a thoughtful project logic model, a detailed data collection plan that is coordinated with project activities, and a plan for reporting and dissemination of findings. You’ll also want to include a bio for your evaluator so that the reviewers know who’s on your team and what makes them uniquely qualified to carry out the evaluation of your project.

Additions that can make your plan “pop” include:

  • A table that maps out the evaluation questions to the data collection plans (see the hypothetical example after this list). This can save space by conveying lots of information in a table instead of in narrative.
  • Combining the evaluation and project timelines so that the reviewers can see how the evaluation will be coordinated with the project and offer timely feedback.
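
For instance, a condensed version of such a table (with hypothetical questions, sources, and timing) might look like this:

Evaluation Question | Data Source | Method | Timing
To what extent are students gaining the targeted technical skills? | Students | Pre/post skills assessments | Start and end of each semester
Is the project reaching groups underrepresented in STEM? | Enrollment records | Institutional data review | Annually
How satisfied are employer partners with program graduates? | Employer partners | Interviews | Years 2 and 3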

Some programs allow for using the Supplemental Documents section for additional evaluation information. Remember that reviewers are not required to read these supplemental docs, so be sure that the important information is still in the 15-page proposal.

For the Results from Prior NSF Support section, you want to be brief and outcome-focused. Use this space to describe what resulted from the prior work, not what you did. And be clear about how that work informs the proposed work, for example, by showing that those outcomes set up the questions you’re pursuing in this proposal.

Blog: Getting Ready to Reapply – Highlighting Results of Prior Support

Posted on December 2, 2015 by Amy A. Germuth

Founder and President, EvalWorks, LLC

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Hello. My name is Amy A. Germuth and I own EvalWorks, LLC, an education evaluation firm in Durham, NC, which has a strong focus on evaluating STEM projects. Having conducted evaluations of ATE and multiple other NSF STEM projects since the early 2000s, I have worked with PIs to help them better respond to NSF solicitations.

For every ATE solicitation, NSF has required that proposers describe their “Results of Prior Support.” Specifically, NSF requests that proposers provide the following information:

  1. The NSF award number, amount and period of support
  2. The title of the project
  3. A summary of the results of the completed work
  4. A list of publications resulting from the NSF award
  5. A brief description of available data, samples, physical collections, and other related research products not described elsewhere
  6. If the proposal is for renewal of a grant, a description of the relation of the completed work to the proposed work

This is an excellent opportunity for proposers who have been funded previously by NSF to highlight how their prior funds were used to support a positive change among the targeted group or individuals. For point 3, rather than simply stating the number of persons served, proposers should do the following:

  • State briefly the main goal(s) of the project.
  • Identify who was served, how many were served, and in what capacity.
  • Explain the impact on these persons that resulted from their participation in this project.
  • Describe the evidence used to support the above claims.

An example may read something like this:

“As part of this project, our goal was to increase the number of women who successfully earned an associate’s degree in welding. To this end, we began a targeted recruiting campaign focusing on women who were about to complete or had recently completed other related programs, such as pipefitting and construction, and developed a brochure for new students that included positive images of women in welding. We used funding to develop the Women in Welding program and to support its members’ team-building and outreach efforts. Institutional data reveal that since this project was started, the number of women in the welding program has almost tripled, from 12 (2006–2010), of whom only 8 graduated, to 34 (2011–2016), of whom 17 have already graduated and 5 have only one semester left. Even if the remaining 17 were not to graduate, the 17 who already have is more than double the number of female students who graduated from the program between 2006 and 2010.”

To summarize, if you have received prior support from NSF, use this opportunity to show how the funding supported project activities that made a difference and how they inform your current proposal (if applicable). Reviewers look to this section to ascertain the degree to which you have been a good steward of the funding you received and what impacts it had. Attention to this section will provide one more measure by which reviewers will judge your proposed project’s likelihood of success.

Blog: Intellectual Merit and Broader Impacts: Identifying Your Project’s Achievements and Supporting Evidence

Posted on October 21, 2015 by Lori Wingate

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

This year’s deadline for proposals to the National Science Foundation’s Advanced Technological Education program just passed, so a blog about what to include in the Results from Prior NSF Support section of an ATE proposal may seem untimely. But if you’re involved in a project that will seek additional funding from NSF in the next year or two, there’s no better time than NOW to assess the quality and quantity of the evidence of your current project’s intellectual merit and broader impacts (NSF’s review criteria for proposed and completed work).

Kirk Knestis (Hezel Associates) and I are presenting a workshop on this topic the day this blog is being published. Here’s an overview of some of what we’re covering.

Understand the fundamentals of intellectual merit and broader impacts: In a nutshell, intellectual merit is about advancing knowledge and understanding. Broader impacts are benefits to society. If your project is mainly research, it’s likely that most of your achievements are related to intellectual merit. If you’re mostly doing development and implementation, your achievements are probably more in the area of broader impacts. But you should have evidence of both aspects of your work.

Identify your project’s intellectual merit and broader impacts: To home in on your project’s intellectual merit and broader impacts, it helps to break these big ideas down into smaller chunks. To identify your project’s intellectual merit, ask yourself: What are we doing that is generating new knowledge or improved understanding? Are we using novel research methods or investigating a novel topic to better understand an aspect of STEM education? Is our project transformative, bringing about extraordinary or revolutionary change? In terms of broader impacts: What are we doing to serve groups that have been historically underrepresented in STEM, develop a diverse workforce, create partnerships between academia and industry, enhance education infrastructure, increase economic competitiveness, or improve STEM education in general?

Identify gaps in evidence: It’s not enough to profess your achievements—you need evidence. Evidence is not the method you used to collect data (tests, surveys, observations, etc.); it’s what those data show (a genetic test is not evidence that someone committed a crime; the result of that test is the evidence). If you don’t have good evidence of important achievements, revise your evaluation plan and start collecting data as soon as possible. Make sure you have evidence of more than just the completion of activities. For example, if your achievement is that you developed a new certification program, then to demonstrate broader impacts you need evidence that it is a high-quality program and that students are enrolling, graduating, and getting jobs (or at least evidence as far down the outcomes chain as is reasonable). Plotting your evidence on a logic model is a good way to figure out whether you have sufficient evidence regarding outcomes as well as activities and outputs.

If you find gaps that will impair your ability to make a compelling case about what you’ve achieved with your current grant, update your evaluation plan accordingly. When you write your next proposal, you will be required to present evidence of your achievements under the specific headings of “Intellectual Merit” and “Broader Impacts” – if you don’t, your proposal is at risk of being returned without review.

 

To learn more, check out these resources:

NSF Grant Proposal Guide (this link goes directly to the section on Results from Prior NSF Support): http://bit.ly/nsf-results

NSF Merit Review Website: http://www.nsf.gov/bfa/dias/policy/merit_review/

NSF Important Notice #130: Transformative Research (for details about what NSF considers transformative, one dimension of intellectual merit): http://www.nsf.gov/pubs/2007/in130/in130.jsp

NSF Examples of Broader Impacts: http://www.nsf.gov/pubs/2002/nsf022/bicexamples.pdf

Perspectives on Broader Impacts: http://www.nsf.gov/od/oia/publications/Broader_Impacts.pdf

National Alliance for Broader Impacts: http://broaderimpacts.net/about/

Materials from our ATE PI conference workshop on this topic, including presentation slides, worksheets, and EvaluATE’s Results from Prior NSF Support Checklist: http://www.evalu-ate.org/library/conference/pi-conference/

Blog: EvaluATE to the Rescue!

Posted on September 16, 2015 by Lori Wingate

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

8 Resources to help with ATE proposal evaluation plans

The EvaluATE team is busy preparing our proposal for a third round of funding from the National Science Foundation to continue our work as a support center serving the ATE community. So, it’s a good time to remind folks of the EvaluATE resources that may come in handy at proposal development time.

Evaluation Planning Checklist for ATE Proposals
This checklist identifies all the areas in your ATE proposal in which information related to the project’s evaluation should appear, with guidance on what you need to do to present a strong evaluation plan.

Evaluation: Don’t Submit Your ATE Proposal Without It
This webinar from August 2015 focused on developing evaluation plans for ATE proposals. We reviewed the contents of the evaluation planning checklist (see above) in detail, with illustrative examples. Also check out our 2014 webinar on the same topic, featuring the perspectives of an ATE PI, evaluator, and program officer in addition to the EvaluATE team.

10 Helpful Hints and 10 Fatal Flaws: Writing Better Evaluation Sections in Your Proposals
Elizabeth Teles, former ATE program co-lead and member of EvaluATE’s National Visiting Committee, offers her advice on ways to strengthen your proposal’s evaluation plan and avoid common serious mistakes.

ATE Logic Model Template
A logic model isn’t required for ATE proposals, but it is a useful and efficient way to communicate an overview of what you intend to do and achieve with your ATE funding. This template provides a format for you to identify your project’s activities, outputs (products), and outcomes.
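
For instance, one strand of a hypothetical logic model (borrowing the welding scenario from the earlier post above) might read: Activities: targeted recruitment of women into the welding program → Outputs: recruitment brochure; number of women enrolled → Outcomes: more women completing associate’s degrees in welding.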

Data Collection Planning Matrix
An evaluation plan needs to describe what data will be collected and how, from what sources, by whom, and when, as well as how the data will be analyzed. This worksheet prompts you to record this information in table format, which may then be copied into a proposal’s project description or supplementary document.

ATE Annual Survey Findings
The ATE survey, conducted annually since 2000, provides aggregate information about ATE-funded projects and centers. The survey data may be used to demonstrate a particular need within the ATE program or describe your project’s past performance in relation to the program overall.

Checklists for the Common Guidelines for Education Research and Development
If your proposal is for targeted research or includes a research component, you should show familiarity with the Common Guidelines for Education Research and Development, published jointly by the National Science Foundation and Institute of Education Sciences. EvaluATE’s checklists serve as a quick-start guide to those guidelines.

Project Resume Checklist
If you are applying for renewal funding, a project resume is an efficient means for communicating your past productivity and capacity for future work to reviewers. The checklist explains what to include in a project resume and how. See also our May 2015 webinar on this topic for more information.

And if you haven’t seen it yet, check out our Summer 2015 newsletter, which is devoted to evaluation-related issues for ATE proposals.

Blog: Evaluation Plan Development for Grant Writing

Posted on March 25, 2015

Dean, Institutional Effectiveness & Grants, Pensacola State College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

As dean of institutional effectiveness and grants, I have varied responsibilities, but at heart, I am a grant writer. I find it easy to write a needs statement based on available data; more challenging is developing an effective evaluation plan for a proposed project.

A lot of taxpayer-supported time and effort goes into project evaluation, an increasingly significant component of federal grant applications, as illustrated by the following examples:

  • My college partners on two existing U.S. Department of Labor Trade Adjustment Assistance Community College and Career Training (TAACCCT) grants – part of an almost $2 billion national investment in expanded training for the unemployed – which allow up to 10 percent of project budgets to be designated for mandatory external evaluation.
  • We have an $8.5 million U.S. Department of Health & Human Services Health Profession Opportunity Grant demonstration project. Part of that “demonstration” included mandatory participation in activities conducted by contracted external evaluators.

We recently submitted grant applications under the highly competitive U.S. Department of Education Student Support Services (SSS) Program. My college has a long-term SSS program that meets all of its objectives, so we’ll receive “extra” prior-experience points. We are assured re-funding, right? Maybe, as long as we address competitive preference priorities and score better than perfect – every point counts.

Although external evaluation is not required, a comparison of language excerpted from the last three SSS competitions makes clear that there is now a much greater emphasis on the details of the evaluation plan. The guidelines require a detailed description of what types of data will be collected and how the applicant will use the information collected in the evaluation of project activities. It is no longer sufficient to say simply that “project staff will collect quantitative and qualitative data and use this information for project improvement.”

Our successful evaluation plans start with a detailed logic model, which allows us to make realistic projections of what we hope will happen and plan data collection around the project’s key activities and outcomes. We use these guiding questions to help formulate the details:

  • What services will be provided?
  • What can be measured?
    • perceptions, participation, academic progress
  • What information sources will be available?
  • What types of data will be collected?
    • student records, surveys, interviews, activity-specific data
  • How will we review and analyze the data collected?
  • What will we do with the findings?
    • Specific actions

Unlike universities, most community and state colleges are not hotbeds of research and evaluation. So what can grant writers do to prepare themselves to meet the “evaluation plan” challenge?

  • Make friends with a statistician; they tend to hang out in the Mathematics or Institutional Research departments.
  • Take a graduate-level course in educational statistics. It helps to have at least a rudimentary knowledge of what you’re writing about.
  • Find good resources. I have several textbook-like evaluation manuals, but my go-to, dog-eared guide for developing an evaluation plan is the National Science Foundation’s “2010 User-Friendly Handbook for Project Evaluation” (Logic Model information in Chapter 3).
  • The open-access list of Institutional Research (IR) links on the website of the Association for Institutional Research (AIR; a membership organization) provides more than 2,200 links to external IR web pages on a variety of topics related to data and decisions for higher education.
  • Community College Research Center (CCRC) resources, such as publications on prior research, can guide evaluation plan development (http://ccrc.tc.columbia.edu/). The CCRC FAQs Web page provides national data useful for benchmarking your grant program’s projected outcomes.