We EvaluATE - Proposal Development

Blog: Evaluation Feedback Is a Gift

Posted on July 3, 2018 in Blog

Christopher Lutz
Chemistry Faculty, Anoka-Ramsey Community College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Christopher Lutz, chemistry faculty at Anoka-Ramsey Community College. When our project was initially awarded, I was a first-time National Science Foundation (NSF) principal investigator. I understood external evaluation was required for grants but saw it as an administrative hurdle in the grant process. I viewed evaluation as proof for the NSF that we did the project and as a metric for outcomes. While both of these aspects are important, I learned evaluation is also an opportunity to monitor and improve your process and grant. Working with our excellent external evaluators, we built a stronger program in our grant project. You can too, if you are open to evaluation feedback.

Our evaluation team was composed of an excellent evaluator and a technical expert. I started working with both about halfway through the proposal development process (a few months before submission) to ensure they could contribute to the project. I recommend contacting evaluators during the initial stages of proposal development and checking in several times before submission. This gives adequate time for your evaluators to develop a quality evaluation plan and gives you time to understand how to incorporate your evaluator’s advice. Our funded project yielded great successes, but we could have saved time and achieved more if we had involved our evaluators earlier in the process.

After receiving funding, we convened grant personnel and evaluators for a face-to-face meeting to avoid wasted effort at the project start. Meeting in person allowed us to quickly collaborate on a deep level. For example, our project evaluator made real-time adjustments to the evaluation plan as our academic team and technical evaluator worked to plan our project videos and training tools. Include evaluator travel funds in your budget and possibly select an evaluator who is close by. We did not designate travel funds for our Kansas-based evaluator, but his ties to Minnesota and understanding of the value of face-to-face collaboration led him to use some of his evaluation salary to travel and meet with our team.

Here are three ways we used evaluation feedback to strengthen our project:

Example 1: The first-year evaluation report showed a perceived deficiency in the project’s provision of hands-on experience with MALDI-MS instrumentation. In response, we had students prepare small quantities of solution themselves instead of giving them pre-mixed solutions, and we let them analyze more lab samples. This change required minimal time but led students to regard the project’s hands-on nature as a strength in the second-year evaluation.

Example 2: Another area for improvement was students’ lack of confidence in analyzing data. In response to this feedback, project staff created Excel data analysis tools and a new training activity for students to practice with literature data prior to analyzing their own. The subsequent year’s evaluation report indicated increased student confidence.

Example 3: Input from our technical evaluator allowed us to create videos that have been used in academic institutions in at least three US states, the UK’s Open University system, and Iceland.

Provided here are some overall tips:

  1. Work with your evaluator(s) early in the proposal process to avoid wasted effort.
  2. Build in at least one face-to-face meeting with your evaluator(s).
  3. Review evaluation data and reports with the goal of improving your project in the next year.
  4. Consider external evaluators as critical friends who are there to help improve your project.

This will help move your project forward and help you have a greater impact for all.

Blog: Modifying Grant Evaluation Project Objectives

Posted on June 11, 2018 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Evelyn Brown
Director, Extension Research and Development
NC State Industry Expansion Solutions
Leressa Suber
Evaluation Coordinator
NC State Industry Expansion Solutions

In the grant evaluations we perform, our clients develop specific project objectives to drive attainment of overall grant goals. We work with principal investigators (PIs) to monitor work plan activities and project outcomes to ensure objectives are attainable, measurable, and sustainable.

However, what happens when the project team encounters obstacles to starting the activities related to project objectives? What shifts need to be made to meet grant goals?

When the team determines that the project objective cannot be achieved as initially planned, it’s important for the PI and evaluator to determine how to proceed. In the examples below, we’ve highlighted three scenarios in which it may be necessary to shift, change, or eliminate a project objective. Then, if changes are made, the team can determine, based on the extent of the modifications, whether and when the PI should notify the project funder.

Example: Shift in Project Objective

  Grant Goal: Help underclassmen understand what engineers do by observing the day-to-day activities of a local engineer.
  Problem: The advisory board members (engineers) in the field were unavailable.
  Objective: Current: Shadow advisory board member. Change: Shadow young engineering alumni.
  Result: The goal is still attainable.
  PI Notify Funder? No, but provide explanation/justification in the end-of-year report.

Example: Change a Project Objective

  Grant Goal: To create a method by which students at the community college will earn a credential to indicate they are prepared for employment in a specific technical field.
  Problem: The state process to establish a new certificate is time consuming and can’t occur within the grant period.
  Objective: Current: Complete degree in specific technical field. Change: Complete certificate in specific technical field.
  Result: The goal is still attainable.
  PI Notify Funder? Yes, specifically contact the funding program officer.

Example: Eliminate the Project Objective

  Grant Goal: The project participant’s salary will increase as a result of completing a specific program.
  Problem: Following program exit, salary data is unavailable.
  Objective: Current: Compare participant’s salary at start of program to salary three months after program completion. Change: Eliminate the objective (unable to maintain contact with program completers to obtain salary information).
  Result: The goal cannot realistically be measured.
  PI Notify Funder? Yes, specifically contact the funding program officer.

In our experience working with clients, we’ve found that the best way to minimize the need to modify project objectives is to ensure they are well written during the grant proposal phase.

Tips: How to write attainable project objectives.

1. Thoroughly think through objectives during the grant development phase.

The National Science Foundation (NSF) provides guidance to assist PIs with constructing realistic project goals and objectives; we’ve linked to the NSF’s proposal development guide below. Here are a few key considerations:

  • Are the project objectives clear?
  • Are the resources necessary to accomplish the objectives clearly identified?
  • Are there barriers to accessing the resources needed?

2. Seek evaluator assistance early in the grant proposal process.

Link to additional resources: NSF – A Guide for Proposal Writing

Blog: Utilizing Your Institutional Research Office Resources When Writing a Grant Application

Posted on March 20, 2018 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Deborah Douma
Dean, Grants and Federal Programs, Pensacola State College
Michael Johnston
Director of Institutional Research, Pensacola State College

There are a number of guiding questions that must be answered to develop a successful grant project evaluation plan. The answers to these questions also provide guidance to demonstrate need and develop ambitious, yet attainable, objectives. Data does not exist in a vacuum and can be evaluated and transformed into insight only if it is contextualized with associated activities. This is best accomplished in collaboration with the Institutional Research (IR) office. The Association for Institutional Research’s aspirational statement “highlights the need for IR to serve a broader range of decision makers.”

We emphasize the critical need to incorporate fundamental knowledge of experimental and quasi-experimental design at the beginning of any grant project. In essence, grant projects are experiments—just not necessarily performed in a laboratory. An experiment introduces new conditions: the independent variable is the grant project, and the dependent variable is the success of the target population (students, faculty). The ability to properly measure and replicate this scientific process must be established during project planning, and the IR office can be instrumental in the design of your evaluation.

Responding to a program solicitation (or RFP, RFA, etc.) provides the opportunity to establish the need for the project, measurable outcomes, and an appropriate plan for evaluation that can win over the hearts and minds of reviewers, and lead to a successful grant award. Institutional researchers work with the grant office not only to measure outcomes but also to investigate and provide potential opportunities for improvement. IR staff act as data scientists and statisticians while working with grants and become intimately acquainted with the data, the collection process, the relationships between variables, and the science being investigated. While the terms statistician and data scientist are often used synonymously, data scientists do more than just answer hypothesis tests and develop forecasting models; they also identify how variables not being studied may affect outcomes. This allows IR staff to see beyond the questions being asked and not only contribute to the development of the results but also identify unexpected structures in the data. Finding such structures may lead to further investigation in other areas and more opportunities for other grants.

If a project’s objective is to effect positive change in student retention, it is necessary to know the starting point before any grant-funded interventions are introduced. IR can provide descriptive statistics on the student body and target population before the intervention. This historical data is used not only for trend analysis but also for validation (correcting errors in the data). Validation can be as simple as looking for differences between comparison groups and confirming potential differences are not due to error. IR can also assist with the predictive analytics necessary to establish appropriate benchmarks for measurable objectives. For example, predicting that an intervention will increase retention rates by 10-20% when a 1-2% increase would be more realistic could lead to a proposal being rejected or set the project up for failure. Your IR office can also help ensure that the appropriate quantitative statistical methods are used to analyze the data.
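
To make the baseline and benchmark idea concrete, here is a minimal, hypothetical sketch (in Python) of the kind of calculation an IR office might run before a retention objective is written into a proposal. The cohort years and rates are invented for illustration, not institutional data.

    # Hypothetical fall-to-fall retention rates for recent cohorts of the target population
    historical_retention = {
        2019: 0.62,
        2020: 0.64,
        2021: 0.63,
        2022: 0.65,
    }

    # Baseline: simple average of the historical rates
    baseline = sum(historical_retention.values()) / len(historical_retention)

    # A realistic benchmark adds a modest, defensible gain (e.g., 1-2 percentage points)
    # rather than an optimistic 10-20 point jump
    proposed_gain = 0.02
    benchmark = baseline + proposed_gain

    print(f"Baseline retention (four-cohort average): {baseline:.1%}")
    print(f"Proposed end-of-grant benchmark: {benchmark:.1%}")

Even a summary this simple ties a proposed objective to institutional history and gives reviewers a defensible starting point.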

Tip: Involve your IR office from the beginning, during project planning. This will contribute greatly to submitting a competitive application, the evaluation of which provides the guidance necessary for a successful project.

Vlog: Resources to Help with Evaluation Planning for ATE Proposals

Posted on September 6, 2017 in Blog

Executive Director, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Evaluation is an important element of an ATE proposal. EvaluATE has developed several resources to help you develop your evaluation plans and integrate them into your ATE proposals. This video highlights a few of them; these and more can be accessed from the links below the video.

Additional Resources:

Blog: Three Tips for a Strong NSF Proposal Evaluation Plan

Posted on August 17, 2016 in Blog

Leslie Goodyear
Principal Research Scientist, Education Development Center, Inc.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Leslie Goodyear and I’m an evaluator who also served as a program officer for three years at the National Science Foundation in the Division of Research on Learning, which is in the Education and Human Resources Directorate. While I was there, I oversaw evaluation activities in the Division and reviewed many, many evaluation proposals and grant proposals with evaluation sections.

In May 2016, I had the pleasure of participating in the webinar “Meeting Requirements, Exceeding Expectations: Understanding the Role of Evaluation in Federal Grants.” Hosted by Lori Wingate at EvaluATE and Ann Beheler at the Centers Collaborative for Technical Assistance, this webinar covered topics such as evaluation fundamentals; evaluation requirements and expectations; and evaluation staffing, budgeting, and utilization.

On the webinar, I shared my perspective on the role of evaluation at NSF, strengths and weaknesses of evaluation plans in proposals, and how reviewers assess Results from Prior NSF Support sections of proposals, among other topics. In this blog, I’ll give a brief overview of some important takeaways from the webinar.

First, if you’re making a proposal to education or outreach programs, you’ll likely need to include some form of project evaluation in your proposal. Be sure to read the program solicitation carefully to know what the specific requirements are for that program. There are no agency-wide evaluation requirements—instead they are specified in each solicitation. Lori had a great suggestion on the webinar:  Search the solicitation for “eval” to make sure you find all the evaluation-related details.

Second, you’ll want to make sure that your evaluation plan is tailored to your proposed activities and outcomes. NSF reviewers and program officers can smell a “cookie cutter” evaluation plan, so make sure that you’ve talked with your evaluator while developing your proposal and that they’ve had the chance to read the goals and objectives of your proposed work before drafting the plan. You want the plan to be incorporated into the proposal so that it appears seamless.

Third, indicators of a strong evaluation plan include carefully crafted, relevant overall evaluation questions, a thoughtful project logic model, a detailed data collection plan that is coordinated with project activities, and a plan for reporting and dissemination of findings. You’ll also want to include a bio for your evaluator so that the reviewers know who’s on your team and what makes them uniquely qualified to carry out the evaluation of your project.

Additions that can make your plan “pop” include:

  • A table that maps out the evaluation questions to the data collection plans. This can save space by conveying lots of information in a table instead of in narrative (a brief illustrative sketch follows this list).
  • Combining the evaluation and project timelines so that the reviewers can see how the evaluation will be coordinated with the project and offer timely feedback.
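
For illustration only, one row of such a mapping might look like the following; the question, data sources, and timing here are invented, not drawn from any particular proposal:

  Evaluation question: To what extent do participants gain the targeted technical skills?
  Data collection: Pre/post skills assessment and instructor interviews
  Timing: End of each semester, years 1 through 3
  Responsible: External evaluator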

Some programs allow for using the Supplemental Documents section for additional evaluation information. Remember that reviewers are not required to read these supplemental docs, so be sure that the important information is still in the 15-page proposal.

For the Results of Prior NSF Support section, you want to be brief and outcome-focused. Use this space to describe what resulted from the prior work, not what you did. And be sure to make clear how that work informs the proposed work, for example, by showing that its outcomes set up the questions you’re pursuing in this proposal.

Blog: Getting Ready to Reapply – Highlighting Results of Prior Support

Posted on December 2, 2015 in Blog

Amy A. Germuth
Founder and President, EvalWorks, LLC

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Hello. My name is Amy A. Germuth and I own EvalWorks, LLC, an education evaluation firm in Durham, NC, which has a strong focus on evaluating STEM projects. Having conducted evaluations of ATE and multiple other NSF STEM projects since the early 2000s, I have worked with PIs to help them better respond to NSF solicitations.

For every ATE solicitation, NSF has required that proposers identify the “Results of Prior Support.” NSF requests that proposers provide the following information:

  1. The NSF award number, amount and period of support
  2. The title of the project
  3. A summary of the results of the completed work
  4. A list of publications resulting from the NSF award
  5. A brief description of available data, samples, physical collections, and other related research products not described elsewhere
  6. If the proposal is for renewal of a grant, a description of the relation of the completed work to the proposed work

This is an excellent opportunity for proposers who have been funded previously by NSF to highlight how their prior funds were used to support a positive change among the targeted group or individuals. For point 3, rather than simply stating the number of persons served, proposers should do the following:

  • State briefly the main goal(s) of the project.
  • Identify who was served, how many were served, and in what capacity.
  • Explain the impact on these persons that resulted from their participation in this project.
  • Provide what evidence was used to make the above inference.

An example may read something like this:

“As part of this project, our goal was to increase the number of women who successfully earned an associate’s degree in welding. To this end, we began a targeted recruiting campaign focusing on women who were about to complete or had recently completed other related programs, such as pipefitting and construction, and developed a brochure for new students that included positive images of women in welding. We used funding to develop the Women in Welding program and to support team-building and outreach efforts by its members. Institutional data reveal that since this project was started, the number of women in the welding program has almost tripled, from 12 (2006 – 2010), of whom only 8 graduated, to 34 (2011 – 2016), of whom 17 have already graduated and 5 have only one semester left. Even if the remaining 17 were not to graduate, those 17 graduates are more than double the number of female students who graduated from the program between 2006 and 2010.”

To summarize, if you have received prior support from NSF, use this opportunity to show how the funding supported project activities that made a difference and how they inform your current proposal (if applicable). Reviewers look to this section as a way to ascertain the degree to which you have been a good steward of the funding that you received and what impacts it had. Attention to this section will provide one more measure by which reviewers will judge the ability of your proposed project to be successful.

Blog: Intellectual Merit and Broader Impacts: Identifying Your Project’s Achievements and Supporting Evidence

Posted on October 21, 2015 in Blog

Executive Director, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Updated April 2019 for Throwback Thursday series.

The deadline for proposals to the National Science Foundation’s Advanced Technological Education program for this year just passed, so a blog about what to include in the Results from Prior NSF Support section in an ATE proposal may seem untimely. But if you’re involved in a project that will seek additional funding from NSF in the next year or two, there’s no better time than NOW to assess the quality and quantity of the evidence of your current project’s intellectual merit and broader impacts (NSF’s review criteria for proposed and completed work).

Understand the fundamentals of intellectual merit and broader impacts: In a nutshell, intellectual merit is about advancing knowledge and understanding. Broader impacts are benefits to society. If your project is mainly research, it’s likely that most of your achievements are related to intellectual merit. If you’re mostly doing development and implementation, your achievements are probably more in the area of broader impacts. But you should have evidence of both aspects of your work.

Identify your project’s intellectual merit and broader impacts: To home in on your project’s intellectual merit and broader impacts, it helps to break down these big ideas into smaller chunks. To identify your project’s intellectual merit, ask yourself, what are we doing that is generating new knowledge or improved understanding? Are we using novel research methods or investigating a novel topic to better understand an aspect of STEM education? Is our project transformative, bringing about extraordinary or revolutionary change? In terms of broader impacts, what are we doing to serve groups that have been historically underrepresented in STEM; developing a diverse workforce; creating partnerships between academia and industry; enhancing education infrastructure; increasing economic competitiveness; or improving STEM education in general?

Identify gaps in evidence: It’s not enough to profess your achievements—you need evidence. Evidence is not the method you used to collect data (tests, surveys, observations, etc.); it’s the evidence indicated by those data (a genetic test is not evidence that someone committed a crime; the result of that test is the evidence). If you don’t have good evidence of important achievements, revise your evaluation plan and start collecting data as soon as possible. Make sure that you have evidence of more than just the completion of activities. For example, if your achievement is that you developed a new certification program, to demonstrate broader impacts, you need evidence that it is a high-quality program and that students are enrolling, graduating, and getting jobs (or at least go as far down the outcomes chain as reasonable). Plotting your evidence on a logic model is a good way to figure out if you have sufficient evidence regarding outcomes as well as activities and outputs.

If you find gaps that will impair your ability to make a compelling case about what you’ve achieved with your current grant, update your evaluation plan accordingly. When you write your next proposal, you will be required to present evidence of your achievements under the specific headings of “Intellectual Merit” and “Broader Impacts” – if you don’t, your proposal is at risk of being returned without review.

 

To learn more, check out these resources:

NSF Grant Proposal Guide (this link goes directly to the section on Results from Prior NSF Support): http://bit.ly/nsf-results

NSF Merit Review Website: http://www.nsf.gov/bfa/dias/policy/merit_review/

NSF Important Notice #130: Transformative Research (for details about what NSF considers transformative, one dimension of intellectual merit): http://www.nsf.gov/pubs/2007/in130/in130.jsp

NSF Examples of Broader Impacts: http://www.nsf.gov/pubs/2002/nsf022/bicexamples.pdf

Perspectives on Broader Impacts: http://www.nsf.gov/od/oia/publications/Broader_Impacts.pdf

National Alliance for Broader Impacts: http://broaderimpacts.net/about/

Materials from our ATE PI conference workshop on this topic, including presentation slides, worksheets, and EvaluATE’s Results from Prior NSF Support Checklist: https://www.evalu-ate.org/library/conference/pi-conference/

Blog: EvaluATE to the Rescue!

Posted on September 16, 2015 in Blog

Executive Director, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

8 Resources to help with ATE proposal evaluation plans

The EvaluATE team is busy preparing our proposal for a third round of funding from the National Science Foundation to continue our work as a support center serving the ATE community. So, it’s a good time to remind folks of the EvaluATE resources that may come in handy at proposal development time.

Evaluation Planning Checklist for ATE Proposals
This checklist identifies all the areas in your ATE proposal in which information related to the project’s evaluation should appear, with guidance on what you need to do to present a strong evaluation plan.

Evaluation: Don’t Submit Your ATE Proposal Without It
This webinar from August 2015 focused on developing evaluation plans for ATE proposals. We reviewed the contents of the evaluation planning checklist (see above) in detail, with illustrative examples. Also check out our 2014 webinar on the same topic, featuring the perspectives of an ATE PI, evaluator, and program officer in addition to the EvaluATE team.

10 Helpful Hints and 10 Fatal Flaws: Writing Better Evaluation Sections in Your Proposals
Elizabeth Teles, former ATE program co-lead and member of EvaluATE’s National Visiting Committee, offers her advice on ways to strengthen your proposal’s evaluation plan and avoid common serious mistakes.

ATE Logic Model Template
A logic model isn’t required for ATE proposals, but it is a useful and efficient way to communicate an overview of what you intend to do and achieve with your ATE funding. This template provides a format for you to identify your project’s activities, outputs (products), and outcomes.

Data Collection Planning Matrix
An evaluation plan needs to describe what data will be collected and how, from what sources, by whom, and when, as well as how the data will be analyzed. This worksheet prompts you to record this information in table format, which may then be copied into a proposal’s project description or supplementary document.

ATE Annual Survey Findings
The ATE survey, conducted annually since 2000, provides aggregate information about ATE-funded projects and centers. The survey data may be used to demonstrate a particular need within the ATE program or describe your project’s past performance in relation to the program overall.

Checklists for the Common Guidelines for Education Research and Development
If your proposal is for targeted research or includes a research component, you should show familiarity with the Common Guidelines for Education Research and Development, published jointly by the National Science Foundation and Institute of Education Sciences. EvaluATE’s checklists serve as a quick-start guide to those guidelines.

Project Resume Checklist
If you are applying for renewal funding, a project resume is an efficient means for communicating your past productivity and capacity for future work to reviewers. The checklist explains what to include in a project resume and how. See also our May 2015 webinar on this topic for more information.

And if you haven’t seen it yet, check out the latest issue of our summer ’15 newsletter, which is devoted to evaluation-related issues for ATE proposals.

Blog: Evaluation Plan Development for Grant Writing

Posted on March 25, 2015 in Blog

Deborah Douma
Dean of Grants and Federal Programs, Pensacola State College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

As dean of institutional effectiveness and grants, I have varied responsibilities, but at heart, I am a grant writer. I find it easy to write a needs statement based on available data; more challenging is the process of developing an effective evaluation plan for a proposed project.

A lot of time and effort – taxpayer supported – go into project evaluation, an increasingly significant component of federal grant applications, as illustrated by the following examples:

  • My college partners on two existing U.S. Department of Labor Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grants – almost $2 billion nationally to expand training for the unemployed – which allow up to 10 percent of project budgets to be designated for mandatory external evaluation.
  • We have an $8.5 million U.S. Department of Health & Human Services Health Profession Opportunity Grant demonstration project. Part of that “demonstration” included mandatory participation in activities conducted by contracted external evaluators.

We recently submitted grant applications under the highly competitive U.S. Department of Education Student Support Services (SSS) Program. My college has a long-term SSS program that meets all of its objectives, so we’ll receive “extra” prior experience points. We are assured of renewed funding, right? Maybe, as long as we address competitive preference priorities and score better than perfect – every point counts.

Although external evaluation is not required, a comparison of language excerpted from the last three SSS competitions makes clear that there is a much greater emphasis on the details of an evaluation plan. The guidelines require a detailed description of what types of data will be collected and how the applicant will use the information collected in the evaluation of project activities. It is no longer sufficient to just say “project staff will collect quantitative and qualitative data and use this information for project improvement.”

Our successful evaluation plans start with a detailed logic model, which allows us to make realistic projections of what we hope will happen and plan data collection around the project’s key activities and outcomes. We use these guiding questions to help formulate the details:

  • What services will be provided?
  • What can be measured?
    • perceptions, participation, academic progress
  • What information sources will be available?
  • What types of data will be collected?
    • student records, surveys, interviews, activity-specific data
  • How will we review and analyze the data collected?
  • What will we do with the findings?
    • Specific actions

Unlike universities, most community and state colleges are not hotbeds of research and evaluation. So what can grant writers do to prepare themselves to meet the “evaluation plan” challenge?

  • Make friends with a statistician; they tend to hang out in the Mathematics or Institutional Research departments.
  • Take a graduate-level course in educational statistics. If you’re writing about something, it helps to have at least rudimentary knowledge of the subject.
  • Find good resources. I have several textbook-like evaluation manuals, but my go-to, dog-eared guide for developing an evaluation plan is the National Science Foundation’s “2010 User-Friendly Handbook for Project Evaluation” (Logic Model information in Chapter 3).
  • An open-access list of Institutional Research (IR) links on the Association for Institutional Research (AIR; a membership organization) website provides more than 2,200 links to external IR web pages on a variety of topics related to data and decisions for higher education.
  • Community College Research Center (CCRC) resources, such as publications on prior research, can guide evaluation plan development (http://ccrc.tc.columbia.edu/). The CCRC FAQs Web page provides national data useful for benchmarking your grant program’s projected outcomes.