Archive: Proposal Development

Blog: Evaluating Educational Programs for the Future STEM Workforce: STELAR Center Resources

Posted on November 8, 2018 by Sarah MacGillivray in Blog

Project Associate, STELAR Center, Education Development Center, Inc.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Hello EvaluATE community! My name is Sarah MacGillivray, and I am a member of the STEM Learning and Research (STELAR) Center team, which supports the National Science Foundation’s (NSF) Innovative Technology Experiences for Students and Teachers (ITEST) program. Through ITEST, NSF funds the research and development of innovative models for engaging K-12 students in authentic STEM experiences. The goals of the program include building students’ interest in and capacity to participate in STEM educational opportunities and developing the skills they will need for STEM careers. While we target slightly different audiences than the Advanced Technological Education (ATE) program does, our programs share the goal of educating the future STEM workforce, and to support this goal, I invite you to access the many evaluation resources available on our website.

The STELAR website houses an extensive set of resources collected from and used by the ITEST community. These resources include a database of nearly 150 research and evaluation instruments. Each entry features a description of the tool, a list of relevant disciplines and topics, target participants, and links to ITEST projects that have used the instrument in their work. Whenever possible, PDFs and/or URLs to the original resource are included, though some tools require a fee or membership to a third-party site for access. The instruments can be accessed at http://stelar.edc.org/resources/instruments, and the database can be searched or filtered by keywords common to ATE and ITEST projects, such as “participant recruitment and retention,” “partnerships and collaboration,” “STEM career opportunities and workforce development,” “STEM content and standards,” and “teacher professional development and pedagogy.”

In addition to the extensive instrument library, our website features more than 400 publications, curricular materials, and videos. Each library can be browsed individually, or, if you would like to view everything we have on a topic, you can search all resources on the main resources page: http://stelar.edc.org/resources. We are continually adding to our resources and have recently improved our collection methods so that projects can upload to the website directly. We expect this will result in even more frequent additions, and we encourage you to visit often or join our mailing list for updates.

STELAR also hosts a free, self-paced online course in which novice NSF proposal writers develop a full NSF proposal. While focused on ITEST, the course’s content applies to any NSF proposal. Two sessions focus on research and evaluation, breaking down the process of developing impactful evaluations. Participants learn what key elements to include in research designs, how to develop logic models, how to choose an evaluation design, and how to align the research design and evaluation sections. The content draws from expertise within the STELAR team and from NSF’s Common Guidelines for Education Research and Development. Since the course is self-paced, you can learn more and register at any time: https://mailchi.mp/edc.org/invitation-itest-proposal-course-2

We hope that these resources are useful in your work and invite you to share suggestions and feedback with us at stelar@edc.org. As a member of the NSF Resource Centers network, we welcome opportunities to explore cross-program collaboration, working together to connect and promote our shared goals.

Checklist: ATE Evaluation Plan

Posted on August 21, 2018 in Resources

Updated August 2018!

This checklist provides information on what should be included in evaluation plans for proposals to the National Science Foundation’s (NSF) Advanced Technological Education (ATE) program. Grant seekers should carefully read the most recent ATE program solicitation for details about the program and proposal submission requirements.

ATE Evaluation Plan Checklist Field Test

EvaluATE invites individuals who are developing proposals for the National Science Foundation’s Advanced Technological Education (ATE) program to field test our updated ATE Evaluation Plan Checklist and provide feedback for improvement.

The field test version of the checklist is available below.

How to participate in the field test:
(1) Use the checklist while developing the evaluation plan for an ATE proposal.
(2) After you have finalized your proposal, complete the brief feedback form.

After a few questions about the context of your work, this form will prompt you to answer four open-ended questions about your experience with the checklist:
• What was especially helpful about this checklist?
• What did you find confusing or especially difficult to apply?
• What would you add, change, or remove?
• If using this checklist affected the contents of your evaluation plan or your process for developing it, please describe how it influenced you.

Thank you for your assistance!

File: Click Here
Type: Checklist
Category: Proposal Development
Author(s): Lori Wingate

Webinar: Give Your Proposal A Competitive Edge with a Great Evaluation Plan

Posted on July 17, 2018 in Webinars

Presenter(s): Lori Wingate, Michael Lesiecki
Date(s): August 22, 2018
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/Y5FJooZ913w

A strong evaluation plan will give your proposal a competitive edge. In this webinar, we’ll explain the essential elements of an effective evaluation plan and show you how to incorporate them into a proposal for the National Science Foundation’s Advanced Technological Education program. We’ll also provide guidance on how to budget for an evaluation, locate a qualified evaluator, and use evaluative evidence to describe the results from prior NSF support (required if you’ve had previous NSF funding). Participants will receive an updated Evaluation Planning Checklist for ATE Proposals and other resources to help prepare strong evaluation plans.

Resources:
Slides
Webinar Questions Answered Post Event
ATE Evaluation Plan Checklist
ATE Evaluation Plan Template
Guide to Finding and Selecting an ATE Evaluator
ATE Evaluator Map
Evaluation Data Matrix
NSF Evaluator Biosketch Template
NSF ATE Program Solicitation

Blog: Evaluation Feedback Is a Gift

Posted on July 3, 2018 by Christopher Lutz in Blog

Chemistry Faculty, Anoka-Ramsey Community College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Christopher Lutz, chemistry faculty at Anoka-Ramsey Community College. When our project was initially awarded, I was a first-time National Science Foundation (NSF) principal investigator. I understood that external evaluation was required for grants but saw it as an administrative hurdle. I viewed evaluation as proof for NSF that we did the project and as a metric for outcomes. While both of these aspects are important, I learned that evaluation is also an opportunity to monitor and improve your process and grant. Working with our excellent external evaluators, we built a stronger program through our grant project. You can too, if you are open to evaluation feedback.

Our evaluation team was composed of an excellent evaluator and a technical expert. I started working with both about halfway through the proposal development process (a few months before submission) to ensure they could contribute to the project. I recommend contacting evaluators during the initial stages of proposal development and checking in several times before submission. This gives your evaluators adequate time to develop a quality evaluation plan and gives you time to work out how to incorporate their advice. Our funded project yielded great successes, but we could have saved time and achieved more if we had involved our evaluators earlier in the process.

After receiving funding, we convened grant personnel and evaluators for a face-to-face meeting to avoid wasted effort at the project start. Meeting in person allowed us to collaborate quickly and at a deep level. For example, our project evaluator made real-time adjustments to the evaluation plan as our academic team and technical evaluator worked to plan our project videos and training tools. Include evaluator travel funds in your budget, and consider selecting an evaluator who is nearby. We did not designate travel funds for our Kansas-based evaluator, but his ties to Minnesota and his appreciation of face-to-face collaboration led him to use some of his evaluation salary to travel and meet with our team.

Here are three ways we used evaluation feedback to strengthen our project:

Example 1: The first-year evaluation report showed that students perceived a lack of hands-on experience with MALDI-MS instrumentation. In response, we had students prepare small quantities of solution themselves instead of giving them pre-mixed solutions, and we let them analyze more lab samples. This change required minimal time but led students to cite the project’s hands-on nature as a strength in the second-year evaluation.

Example 2: Another area for improvement was students’ lack of confidence in analyzing data. In response to this feedback, project staff created Excel data analysis tools and a new training activity in which students practice with literature data before analyzing their own. The subsequent year’s evaluation report indicated increased student confidence.

Example 3: Input from our technical evaluator allowed us to create videos that have been used in academic institutions in at least three US states, the UK’s Open University system, and Iceland.

Here are some overall tips:

  1. Work with your evaluator(s) early in the proposal process to avoid wasted effort.
  2. Build in at least one face-to-face meeting with your evaluator(s).

  3. Review evaluation data and reports with the goal of improving your project in the next year.
  4. Consider external evaluators as critical friends who are there to help improve your project.

Following these tips will help move your project forward and increase your impact.

Template: ATE Proposal Evaluation Plan

Posted on July 13, 2017 in Resources

This template is for use in preparing the evaluation plan sections of proposals to the National Science Foundation’s Advanced Technological Education (ATE) program. It is based on the ATE Evaluation Planning Checklist, also developed by EvaluATE, and is aligned with the evaluation guidance in the 2017 ATE Program Solicitation. All proposers should read the solicitation in full.

File: Click Here
Type: Worksheet
Category: Resources
Author(s): Lori Wingate

Resource: Finding and Selecting an Evaluator for Advanced Technological Education (ATE) Proposals

Posted on July 13, 2017 in Resources

All ATE proposals are required to request “funds to support an evaluator independent of the project.” Ideally, this external evaluator should be identified in the project proposal. The information in this guide is for individuals who are able to select and work with an external evaluator at the proposal stage. However, some institutions prohibit selecting an evaluator on a noncompetitive basis in advance of an award being made. Advice for individuals in that situation is provided in an EvaluATE blog and newsletter article.

This guide includes advice on how to locate and select an external evaluator. It is not intended as a guide for developing an evaluation plan or contracting with an evaluator.

File: Click Here
Type: Doc
Category: Resources
Author(s): Lori Wingate

Blog: Three Tips for a Strong NSF Proposal Evaluation Plan

Posted on August 17, 2016 by Leslie Goodyear in Blog

Principal Research Scientist, Education Development Center, Inc.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Leslie Goodyear and I’m an evaluator who also served as a program officer for three years at the National Science Foundation in the Division of Research on Learning, which is in the Education and Human Resources Directorate. While I was there, I oversaw evaluation activities in the Division and reviewed many, many evaluation proposals and grant proposals with evaluation sections.

In May 2016, I had the pleasure of participating in the webinar “Meeting Requirements, Exceeding Expectations: Understanding the Role of Evaluation in Federal Grants.” Hosted by Lori Wingate at EvaluATE and Ann Beheler at the Centers Collaborative for Technical Assistance, this webinar covered topics such as evaluation fundamentals; evaluation requirements and expectations; and evaluation staffing, budgeting, and utilization.

On the webinar, I shared my perspective on the role of evaluation at NSF, strengths and weaknesses of evaluation plans in proposals, and how reviewers assess Results from Prior NSF Support sections of proposals, among other topics. In this blog, I’ll give a brief overview of some important takeaways from the webinar.

First, if you’re submitting a proposal to an education or outreach program, you’ll likely need to include some form of project evaluation in your proposal. Be sure to read the program solicitation carefully to know the specific requirements for that program. There are no agency-wide evaluation requirements; instead, they are specified in each solicitation. Lori had a great suggestion on the webinar: search the solicitation for “eval” to make sure you find all the evaluation-related details.

Second, you’ll want to make sure that your evaluation plan is tailored to your proposed activities and outcomes. NSF reviewers and program officers can smell a “cookie cutter” evaluation plan, so make sure that you’ve talked with your evaluator while developing your proposal and that they’ve had the chance to read the goals and objectives of your proposed work before drafting the plan. You want the plan to be woven into the proposal so that the whole document reads seamlessly.

Third, indicators of a strong evaluation plan include carefully crafted, relevant overall evaluation questions, a thoughtful project logic model, a detailed data collection plan that is coordinated with project activities, and a plan for reporting and dissemination of findings. You’ll also want to include a bio for your evaluator so that the reviewers know who’s on your team and what makes them uniquely qualified to carry out the evaluation of your project.

Additions that can make your plan “pop” include:

  • A table that maps the evaluation questions to the data collection plans, as illustrated below. This can save space by conveying lots of information in a table instead of in narrative.
  • Combining the evaluation and project timelines so that the reviewers can see how the evaluation will be coordinated with the project and offer timely feedback.
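
For illustration, a hypothetical excerpt of such a question-to-data table might look like the following (the questions, data sources, and timing are invented examples, not content from the webinar):

Evaluation question | Data source(s) | Timing
To what extent did students’ interest in technician careers increase? | Pre/post student surveys | Start and end of each academic year
Was the professional development delivered as planned? | Workshop observations; facilitator logs | During each workshop
How useful were project materials to employers? | Employer focus groups | Annually, each spring

A format like this lets reviewers confirm at a glance that every evaluation question has a corresponding data source and schedule.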

Some programs allow for using the Supplemental Documents section for additional evaluation information. Remember that reviewers are not required to read these supplemental docs, so be sure that the important information is still in the 15-page proposal.

For the Results of Prior NSF Support section, you want to be brief and outcome-focused. Use this space to describe what resulted from the prior work, not what you did. And be sure to make clear how that work informs the proposed work, for example by noting that its outcomes set up the questions you’re pursuing in this proposal.

Newsletter: Revisiting Intellectual Merit and Broader Impacts

Posted on January 1, 2016 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

If you have ever written a proposal to the National Science Foundation (NSF) or participated in a proposal review panel for NSF, you probably instantly recognize the terms Intellectual Merit and Broader Impacts as NSF’s merit review criteria. Proposals are rated and funding decisions are made based on how well they address these criteria. Therefore, proposers must describe the potential of their proposed work to advance knowledge and understanding (Intellectual Merit) and benefit society (Broader Impacts).

Like cramming for an exam and then forgetting 90 percent of what you memorized, it’s all too easy for principal investigators to lose sight of Intellectual Merit and Broader Impacts after proposal submission. But there are two important reasons to maintain focus on Intellectual Merit and Broader Impacts after an award is made and throughout project implementation.

First, the goals and activities expressed in a proposal are commitments about how a particular project will advance knowledge (Intellectual Merit) and bring tangible benefits to individuals, institutions, communities, and/or our nation (Broader Impacts). Simply put, PIs have an ethical obligation to follow through on these commitments to the best of their abilities.

Second, when funded PIs seek subsequent grants from NSF, they must describe the results of their prior NSF funding in terms of Intellectual Merit and Broader Impacts. In other words, proposers must explain how they used their NSF funding to actually advance knowledge and understanding and benefit society. PIs who have evidence of their accomplishments in these areas and can convey it succinctly will be well positioned to seek additional funding. To ensure that evidence of both Intellectual Merit and Broader Impacts is being captured, PIs should revisit project evaluation plans with their evaluators, crosschecking the proposal’s claims about potential Intellectual Merit and Broader Impacts against the evaluation questions and data collection plan.

Last October, I conducted a workshop on this topic at the ATE Principal Investigators Conference with colleague Kirk Knestis, an evaluator from Hezel Associates. Dr. Celeste Carter, ATE program co-lead, spoke about how to frame results of prior NSF support in proposals. She noted a common misstep she has seen: proposers speak to results from prior support by simply reiterating what they said they were going to do in their funded proposals, rather than describing the actual outcomes of the grant. Project summaries (the one-page descriptions, required in all NSF proposals, that address a proposed project’s Intellectual Merit and Broader Impacts) are necessarily written in a prospective, future-oriented manner because the work hasn’t been initiated yet. In contrast, the Results of Prior NSF Support sections are about completed work and therefore are written in past tense and should include evidence of accomplishments. Describing achievements and presenting evidence of their quality and impact shows reviewers that the proposer is a responsible steward of federal funds, can deliver on promises, and is building on prior success.

Take time now, well before it is time to submit a new proposal or a Project Outcomes Report, to make sure you haven’t lost sight of the Intellectual Merit and Broader Impacts aspects of your grant and how you promised to contribute to these national priorities.