Archive: evaluation planning

Blog: Building Research-Practice Collaborations for Effective STEM + Computing Education Evaluation Design

Posted on November 29, 2018 in Blog

Director of Measurement, Evaluation, and Learning, Kapor Center

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

 

At the Kapor Center, our signature three-summer educational program (SMASH Academy) aims to prepare underrepresented high school students of color to pursue careers in science, technology, engineering, and mathematics (STEM) and computing through access to courses, support networks, and opportunities for social and personal development.

In the nonprofit sector, evaluations can be driven by funder requirements, which often focus on outcomes. However, by focusing solely on outcomes, teams can lose sight of the goal of STEM evaluation: to inform programming (through the creation of process evaluation tools such as observation protocols and course evaluations) to ensure youth of color are prepared for the future STEM economy.

To keep that goal in focus, the Kapor Center ensures that the evaluation method driving its work is utilization-focused evaluation. Utilization-focused evaluation begins with the premise that the success metric of an evaluation is the extent to which it is used by key stakeholders (Patton, 2008). This framework requires joint decision making between the evaluator and stakeholders to determine the purpose of the evaluation, the kind of data to be collected, the type of evaluation design to be created, and the uses of the evaluation. Using this framework shifts evaluation from a linear, top-down approach to a feedback loop involving practitioners.

Figure 1. Evaluation Cycle of SMASH Academy

The evaluation cycle at the Kapor Center, a collaboration between our research team and SMASH’s program team, is outlined below:

  1. Inquiry: This stage begins with conversations with the stakeholders (e.g., programs and leadership teams) about common understandings of short-, medium-, and long-term outcomes as well as the key strategies that drive outcomes. Delineating outcomes has been integral to working transparently toward program priorities.
  2. Instrument Development: Once groups are in agreement about the goal of the evaluation and our path to it, we develop instruments. Instrument mapping, linking each tool and question to specific outcomes, has been a good practice to open the communication channels among teams.
  3. Instrument Administration: When working with seasonal staff at the helm of evaluation administration, documentation of processes has been crucial for fidelity. Not surprisingly, with varying levels of experience among program staff, the creation of systems to standardize data collection has been key, including scoring rubrics to be used during observations and guides for survey administration.
  4. Data Analysis and Reporting: When synthesizing data, analyses and reporting need to not only tell a broad impact story but also provide concrete targets and priorities for the program. In this regard, analyses have encompassed pre-post outcome differences and reports on program experiences.
  5. Reflection and Integration: At the end of the program cycle, the program team reflects on the data together to inform their path forward. In such a meeting, the team engages in answering three questions: 1) What did you observe about the data? 2) What can you infer about the data, and what evidence supports your inference? and 3) What are the next steps to develop and prioritize program modifications?

Developing stronger research-practice ties has been integral to the Kapor Center’s understanding of what works, for whom, and in what contexts to ensure more youth of color pursue and persist in STEM fields. Beyond the SMASH program, the practice of collective cooperation between researchers and practitioners provides an opportunity to influence strategies across the field.

 

References

Patton, M. Q. (2008). Utilization-focused evaluation. Newbury Park, CA: Sage.

 

Checklist: ATE Evaluation Plan

Posted on August 21, 2018 in Resources

Updated August 2018!

This checklist provides information on what should be included in evaluation plans for proposals to the National Science Foundation’s (NSF) Advanced Technological Education (ATE) program. Grant seekers should carefully read the most recent ATE program solicitation for details about the program and proposal submission requirements.

ATE Evaluation Plan Checklist Field Test

EvaluATE invites individuals who are developing proposals for the National Science Foundation’s Advanced Technological Education (ATE) program to field test our updated ATE Evaluation Plan Checklist and provide feedback for improvement.

The field test version of the checklist is available below.

How to participate in the field test:
(1) Use the checklist while developing the evaluation plan for an ATE proposal.
(2) After you have completed your proposal, complete the brief feedback form.

After a few questions about the context of your work, this form will prompt you to answer four open-ended questions about your experience with the checklist:
• What was especially helpful about this checklist?
• What did you find confusing or especially difficult to apply?
• What would you add, change, or remove?
• If using this checklist affected the contents of your evaluation plan or your process for developing it, please describe how it influenced you.

Thank you for your assistance!

File: Click Here
Type: Checklist
Category: Proposal Development
Author(s): Lori Wingate

Worksheet: Evaluation Data Matrix Template

Posted on August 16, 2018 in Resources

An evaluation plan should include a clear description of what data will be collected, from what sources and how, by whom, and when, as well as how the data will be analyzed. Placing this information in a matrix helps ensure that there is a viable plan for collecting all the data necessary to answer each evaluation question and that all collected data will serve a specific, intended purpose. The table below may be copied into another document, such as a grant proposal, and edited/expanded as needed. An example is provided on the next page.
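To make the matrix idea concrete, here is a minimal sketch in Python. The evaluation question, sources, and schedule below are entirely hypothetical and not drawn from the template itself; the sketch represents each matrix row as a record and flags incomplete rows, mirroring the matrix's purpose: every question has a viable collection plan, and every planned collection serves a question.

    # Hypothetical evaluation data matrix: one record per evaluation
    # question, with fields for what data will be collected, from what
    # sources and how, by whom, when, and how it will be analyzed.
    matrix = [
        {
            "evaluation_question": "To what extent did participants gain technical skills?",
            "data": "Pre/post skills assessment scores",
            "source": "Program participants",
            "method": "Online assessment",
            "collected_by": "Program coordinator",
            "schedule": "Weeks 1 and 10",
            "analysis": "Comparison of pre/post scores",
        },
        # ... one record per remaining evaluation question ...
    ]

    REQUIRED_FIELDS = [
        "evaluation_question", "data", "source", "method",
        "collected_by", "schedule", "analysis",
    ]

    # Flag any row whose plan is incomplete, so no evaluation question
    # is left without a fully specified data collection plan.
    for row in matrix:
        missing = [f for f in REQUIRED_FIELDS if not row.get(f)]
        if missing:
            print(f"Incomplete plan for '{row.get('evaluation_question')}': "
                  f"missing {', '.join(missing)}")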

File: Click Here
Type: Doc
Category: Data…, Evaluation Design
Author(s): Lori Wingate

Evaluation Process

Posted on March 14, 2018 in Resources

This resource highlights the four main steps of an ATE evaluation and provides detailed activities for each step. It is an excerpt from the Evaluation Basics for Non-evaluators webinar. Slides, recording, handout, and additional resources are available at bit.ly/mar18-webinar.

File: Click Here
Type: Doc
Category: Getting Started
Author(s): Emma Perk, Lori Wingate

Checklist: Communication Plan for ATE Principal Investigators and Evaluators

Posted on October 17, 2017 in Resources

Creating a clear communication plan at the beginning of an evaluation can help project personnel and evaluators avoid confusion, misunderstandings, and uncertainty. The communication plan should be an agreement between the project’s principal investigator and the evaluator, and should be followed by members of their respective teams. This checklist highlights the decisions that need to be made when developing a clear communication plan.

File: Click Here
Type: Checklist
Category: Checklist, Evaluation Design
Author(s): Lori Wingate, Lyssa Becho

Webinar: Evaluation: All the Funded ATE Proposals Are Doing It

Posted on August 10, 2017 in Webinars

Presenter(s): Lori Wingate, Mike Lesiecki
Date(s): August 16, 2017
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/7ytTEGt_FoM

Give your proposal a competitive edge with a strong evaluation plan. The National Science Foundation has issued a new solicitation for its Advanced Technological Education (ATE) program, which includes major changes to the guidelines for ATE evaluation plans. Attend this webinar to learn the key elements of a winning evaluation plan and strategies for demonstrating to reviewers that evaluation is an integral part of your project, not an afterthought. In addition, we’ll provide specific guidance on how to budget for an evaluation, locate a qualified evaluator, and describe results from prior NSF support with supporting evaluative evidence. You will receive updated tools to help prepare strong evaluation plans.

Resources:
Slides
ATE Proposal Evaluation Plan Template
Data Collection Planning Matrix
Evaluator Biographical Sketch Template for National Science Foundation (NSF) Proposals
Evaluation Planning Checklist for ATE Proposals
Evaluation Questions Checklist for Program Evaluation
Guide to Finding and Selecting an Evaluator
Logic Models: Getting them Right and Using them Well [webinar]
Logic Model Template for ATE Projects and Centers
NSF Prior Support Checklist
Small-Scale Evaluation Webinar

Blog: Integrating Perspectives for a Quality Evaluation Design

Posted on August 2, 2017 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
John Dorris

Director of Evaluation and Assessment, NC State Industry Expansion Solutions

Dominick Stephenson

Assistant Director of Research Development and Evaluation, NC State Industry Expansion Solutions

Designing a rigorous and informative evaluation depends on communication with program staff to understand planned activities and how those activities relate to the program sponsor’s objectives and the evaluation questions that reflect those objectives (see white paper related to communication). At NC State Industry Expansion Solutions, we have worked long enough on evaluation projects to know that such communication is not always easy because program staff and the program sponsor often look at the program from two different perspectives: The program staff focus on work plan activities (WPAs), while the program sponsor may be more focused on the evaluation questions (EQs). So, to help facilitate communication at the beginning of the evaluation project and assist in the design and implementation, we developed a simple matrix technique to link the WPAs and the EQs (see below).

[Figure: matrix linking WPAs to EQs and associated data collection events]

For each of the WPAs, we link one or more EQs and indicate what types of data collection events will take place during the evaluation. During project planning and management, the crosswalk of WPAs and EQs will be used to plan out qualitative and quantitative data collection events.

[Figure: example crosswalk of WPAs and EQs used to plan data collection events]

The above framework may be most helpful for the formative assessment (process questions and activities). However, it can also enrich the knowledge gained from the participant outcomes analysis in the summative evaluation in the following ways:

• Understanding how the program has been implemented will help determine fidelity to the program as planned, which in turn helps determine the degree to which participant outcomes can be attributed to the program design.
• Details on program implementation that are gathered during the formative assessment, when combined with evaluation of participant outcomes, can suggest hypotheses about the factors that would lead to program success (positive participant outcomes) if the program is continued or replicated.
• Details about the data collection process that are gathered during the formative assessment will help assess the quality and limitations of the participant outcome data, and the reliability of any conclusions based on those data.

So, for us, this matrix approach is a quality check on our evaluation design that also helps during implementation. Maybe you will find it helpful, too.
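For readers who want to see the crosswalk idea operationalized, below is a minimal sketch in Python. The WPA and EQ labels are hypothetical placeholders, not taken from the authors' actual matrices; the sketch simply links each WPA to its EQs and planned data collection events, then flags any EQ that no activity addresses, which is the kind of quality check the matrix supports.

    # Hypothetical WPA-to-EQ crosswalk: each work plan activity (WPA)
    # is linked to the evaluation questions (EQs) it informs and the
    # data collection events that will cover it.
    crosswalk = {
        "WPA 1: Develop curriculum modules": {
            "eqs": ["EQ1", "EQ2"],
            "events": ["Document review", "Instructor interviews"],
        },
        "WPA 2: Deliver summer institute": {
            "eqs": ["EQ2", "EQ3"],
            "events": ["Participant survey", "Session observations"],
        },
    }

    all_eqs = {"EQ1", "EQ2", "EQ3", "EQ4"}

    # Print each linkage, then warn about any EQ that no WPA addresses.
    covered = set()
    for wpa, link in crosswalk.items():
        print(f"{wpa} -> {', '.join(link['eqs'])} via {', '.join(link['events'])}")
        covered.update(link["eqs"])

    for eq in sorted(all_eqs - covered):
        print(f"Warning: {eq} is not linked to any work plan activity")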

Template: ATE Proposal Evaluation Plan

Posted on July 13, 2017 in Resources

This template is for use in preparing the evaluation plan sections of proposals to the National Science Foundation’s Advanced Technological Education (ATE) program. It is based on the ATE Evaluation Planning Checklist, also developed by EvaluATE, and is aligned with the evaluation guidance included in the 2017 ATE Program Solicitation. All proposers should read the solicitation in full.

File: Click Here
Type: Worksheet
Category: Resources
Author(s): Lori Wingate

Video: Introduction to Evaluation for Mentor-Connect Cohort 2017

Posted on April 25, 2017 in Videos

This video was created for the 2017 Mentor-Connect cohort but may also be useful to others interested in learning about ATE evaluation. Specifically, the video provides an overview of four questions: What is project evaluation? Why does NSF require evaluation? How do you plan for evaluation? How can EvaluATE help?

File: Click Here
Type: Video
Category: Video
Author(s): Lori Wingate