Lori Wingate

Director of Research, The Evaluation Center at Western Michigan University

Lori has a Ph.D. in evaluation and more than 20 years of experience in the field of program evaluation. She directs EvaluATE and leads a variety of evaluation projects at WMU focused on STEM education, health, and higher education initiatives. Dr. Wingate has led numerous webinars and workshops on evaluation in a variety of contexts, including CDC University and the American Evaluation Association Summer Evaluation Institute. She is an associate member of the graduate faculty at WMU.


Report: Final ATE Evaluation Report (2006)

Posted on May 14, 2019 in Report Archive

This report describes the basis from which the ATE program was created and conducted, as well as the evaluation work that has accompanied the program for the past seven years. It traces the program’s work and reach to community colleges and others since the program’s inception. It analyzes ATE solicitations to show linkages between program guidelines and program productivity, then describes the evaluation’s design and data collection methods to show why and how evaluative data were collected. The evaluation findings that follow both describe and judge the program in various respects.

Findings from the evaluation show that the program is healthy and well run. Nearly a fifth of the nation’s two-year colleges have been funded at least once by this program, and those funds have resulted in substantial productivity in funded and collaborating institutions and organizations. Major strengths of this program are evident in its materials development, professional development, and program improvement products. Large numbers of students and teachers have participated in this program—taking courses and graduating or otherwise being certified. Business and industry have collaborated with colleges in developing and conducting these programs with perceived substantial benefits from that involvement.

Multiple strands of evaluative information describe and confirm that the program produces important outcomes of good quality. Though consistently positive, these findings are highly dependent on testimony/feedback as a primary quality assurance mechanism. We believe additional project/center-based direct evidence of program effectiveness and quality would strengthen claims of quality and provide important information for program improvement. Suggestions are made that we believe will improve the ATE program; these suggestions are viewed as small changes designed for incremental improvement.

File: Click Here
Type: Report
Category: ATE Research & Evaluation
Author(s): Arlen Gullickson, Chris Coryn, Frances Lawrenz, Lori Wingate

Webinar: Outcomes Evaluation: Step-by-Step

Posted on March 12, 2019 in Webinars

Presenter(s): Lori Wingate, Mike Lesiecki
Date(s): March 21, 2019
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/Sva5JIj5CE4

Bonus webinar! Join EvaluATE for one of our most popular webinars. Register today to save your seat and get ready to learn a lot. This is not an event you want to miss.

Outcome evaluation involves identifying and measuring the changes that occur as a result of project implementation. These changes may occur at the individual, organizational, or community level and include changes in knowledge, skills, attitudes, behavior, and community/societal conditions. All too often, however, evaluations focus on project activities rather than the meaningful changes a project helped bring about. Webinar participants will learn how to identify appropriate outcomes to assess in an evaluation and how to use those outcomes as a foundation for planning data collection, analysis, and interpretation.

This webinar is being presented in partnership with

 

Resources:
Slides
Handout

Report: 2018 ATE Annual Survey

Posted on February 1, 2019 in Annual Survey

This report summarizes data gathered in the 2018 survey of ATE program grantees. Conducted by EvaluATE — the evaluation support center for the ATE program, located at The Evaluation Center at Western Michigan University — this was the 19th annual ATE survey. Included here are findings about ATE projects and the activities, accomplishments, and impacts of the projects during the 2017 calendar year (2017 fiscal year for budget-related questions).

File: Click Here
Type: Report
Category: ATE Annual Survey
Author(s): Lori Wingate, Lyssa Becho

Webinar: Basic Principles of Survey Question Development

Posted on January 30, 2019 in Webinars

Presenter(s): Lori Wingate, Lyssa Wilson Becho, Mike Lesiecki
Date(s): February 20, 2019
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/64nXDeRm-9c

Surveys are a valuable source of evaluation data. Obtaining quality data relies heavily on well-crafted survey items that align with the overall purpose of the evaluation. In this webinar, participants will learn fundamental principles of survey question construction to enhance the validity and utility of survey data. We will discuss the importance of considering data analysis during survey construction and ways to test your survey questions. Participants will receive an overview of survey do’s and don’ts to help apply fundamental principles of survey question development in their own work.

Resources:
Slides
Handout

Webinar: Three Common Evaluation Fails and How to Prevent Them

Posted on December 4, 2018 in Webinars

Presenter(s): Kirk Knestis, Lori Wingate, Mike Lesiecki
Date(s): January 30, 2019
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/u1u2DssdLHc

In this webinar, experienced STEM education evaluator Kirk Knestis will share strategies for communicating effectively with evaluation clients to avoid three common “evaluation fails”: (1) project implementation delays; (2) evaluation scope creep (clients wanting something more than or different from what was originally planned); and (3) substantial changes in the project over the course of the evaluation. These issues are typical causes of an evaluation being derailed and failing to produce useful and valid results. Webinar participants will learn how clear documentation—specifically, an evaluation contract (legal commitment to the work), scope of work (detailed description of evaluation services and deliverables), and study protocol (technical details concerning data collection and analysis)—can make potentially difficult conversations go better for all involved, averting potential evaluation crises and failures. Getting these documents right and using them in project communications helps ensure a smoothly operating evaluation, a happy client, and a profitable project for the evaluator.

For a sneak peek at some of what Kirk will address in this webinar, see his blog post: http://www.evalu-ate.org/blog/knestis-apr18/.

Resources:
Study Protocol Template
Evaluation Scope Template
Slides

Checklist: ATE Evaluation Plan

Posted on August 21, 2018

Updated August 2018!

This checklist provides information on what should be included in evaluation plans for proposals to the National Science Foundation’s (NSF) Advanced Technological Education (ATE) program. Grant seekers should carefully read the most recent ATE Program Solicitation for details about the program and proposal submission requirements.

ATE Evaluation Plan Checklist Field Test

EvaluATE invites individuals who are developing proposals for the National Science Foundation’s Advanced Technological Education (ATE) program to field test our updated ATE Evaluation Plan Checklist and provide feedback for improvement.

The field test version of the checklist is available below.

How to participate in the field test:
(1) Use the checklist while developing the evaluation plan for an ATE proposal.
(2) After you have completed your proposal, complete the brief feedback form.

After a few questions about the context of your work, this form will prompt you to answer four open-ended questions about your experience with the checklist:
• What was especially helpful about this checklist?
• What did you find confusing or especially difficult to apply?
• What would you add, change, or remove?
• If using this checklist affected the contents of your evaluation plan or your process for developing it, please describe how it influenced you.

Thank you for your assistance!

File: Click Here
Type: Checklist
Category: Proposal Development
Author(s): Lori Wingate

Blog: The Life-Changing Magic of a Tidy Evaluation Plan

Posted on August 16, 2018 in Blog


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

“Effective tidying involves only three essential actions. All you need to do is take the time to examine every item you own, decide whether or not you want to keep it, then choose where to put what you keep. Designate a place for each thing.”

―Marie Kondo, The Life-Changing Magic of Tidying Up

I’ve noticed a common problem with some proposal evaluation plans: It’s not so much that they don’t include key information; it’s that they lack order. They’re messy. When you have only about two pages of a 15-page National Science Foundation proposal to describe an evaluation, you need to be exceptionally clear and efficient. In this blog, I offer tips on how to “tidy up” your proposal’s evaluation plan to ensure it communicates key information clearly and coherently.

First of all, what does a messy evaluation plan look like? It meanders. It frames the evaluation’s focus in different ways in different places in the proposal, or even within the evaluation section itself, leaving the reviewer confused about the evaluation’s purpose. It discusses data and data collection without indicating what those data will be used to address. It employs different terms to mean the same thing in different places. It makes it hard for reviewers to discern key information from the evaluation plan and understand how that information fits together.

Three Steps to Tidy up a Messy Evaluation Plan

It’s actually pretty easy to convert a messy evaluation plan into a tidy one:

  • State the evaluation’s focus succinctly. List three to seven evaluation questions that the evaluation will address. These questions should encompass all of your planned data collection and analysis—no more, no less. Refer back to these questions as needed, rather than restating the evaluation’s focus in different ways in different places or introducing new topics later in the plan.
  • Link the data you plan to collect to the evaluation questions. An efficient way to do this is to present the information in a table. I like to include evaluation questions, indicators, data collection methods and sources, analysis, and interpretation in a single table to clearly show the linkages and convey that my team has carefully thought about how we will answer the evaluation questions. Bonus: Presenting information in a table saves space and makes it easy for reviewers to locate key information. (See EvaluATE’s Evaluation Data Matrix Template.)
  • Use straightforward language—consistently. Don’t assume that reviewers will share your definition of evaluation-related terms. Choose your terms carefully and do not vary how you use them throughout the proposal. For example, if you are using the terms measures, metrics, and indicators, ask yourself if you are really referring to different things. If not, stick with one term and use it consistently. If similar words are actually intended to mean different things, include brief definitions to avoid any confusion about your meaning.

Can a Tidy Evaluation Plan Really Change Your Life?

If it moves a very good proposal toward excellent, then yes! In the competitive world of grant funding, every incremental improvement counts and heightens your chances for funding, which can mean life-changing opportunities for the project leaders, evaluators, and—most importantly—individuals who will be served by the project.

Worksheet: Evaluation Data Matrix Template

Posted on August 16, 2018

An evaluation plan should include a clear description of what data will be collected, from what sources and how, by whom, and when, as well as how the data will be analyzed. Placing this information in a matrix helps ensure that there is a viable plan for collecting all the data necessary to answer each evaluation question and that all collected data will serve a specific, intended purpose. The table below may be copied into another document, such as a grant proposal, and edited/expanded as needed. An example is provided on the next page.
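For readers who assemble proposals programmatically, the matrix described above can be sketched as plain data and rendered as a table. This is only an illustration: the column names and the example row below are hypothetical, not the contents of EvaluATE’s actual template.

```python
def render_matrix(rows, columns):
    """Render a list of row dicts as a Markdown table, one line per row."""
    header = "| " + " | ".join(columns) + " |"
    divider = "| " + " | ".join("---" for _ in columns) + " |"
    body = ["| " + " | ".join(row.get(c, "") for c in columns) + " |"
            for row in rows]
    return "\n".join([header, divider] + body)

# Illustrative column names; an actual matrix may also include sources,
# timing, and the person responsible for each data collection activity.
columns = ["Evaluation Question", "Indicators", "Data Collection", "Analysis"]
rows = [
    {
        "Evaluation Question": "Did participants' skills improve?",
        "Indicators": "Pre/post assessment scores",
        "Data Collection": "Online assessment of all participants",
        "Analysis": "Comparison of pre/post means",
    },
]

print(render_matrix(rows, columns))
```

Keeping the matrix as structured data like this makes it easy to check that every planned data source maps back to an evaluation question, which is the point of the matrix.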

File: Click Here
Type: Doc
Category: Data…, Evaluation Design
Author(s): Lori Wingate

Webinar: Give Your Proposal A Competitive Edge with a Great Evaluation Plan

Posted on July 17, 2018 in Webinars

Presenter(s): Lori Wingate, Michael Lesiecki
Date(s): August 22, 2018
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/Y5FJooZ913w

A strong evaluation plan will give your proposal a competitive edge. In this webinar, we’ll explain the essential elements of an effective evaluation plan and show you how to incorporate them into a proposal for the National Science Foundation’s Advanced Technological Education program. We’ll also provide guidance on how to budget for an evaluation, locate a qualified evaluator, and use evaluative evidence to describe the results from prior NSF support (required if you’ve had previous NSF funding). Participants will receive an updated Evaluation Planning Checklist for ATE Proposals and other resources to help prepare strong evaluation plans.

Resources:
Slides
Webinar Questions Answered Post Event
ATE Evaluation Plan Checklist
ATE Evaluation Plan Template
Guide to Finding and Selecting an ATE Evaluator
ATE Evaluator Map
Evaluation Data Matrix
NSF Evaluator Biosketch Template
NSF ATE Program Solicitation