Lori Wingate

Director of Research, The Evaluation Center at Western Michigan University

Lori has a Ph.D. in evaluation and more than 20 years of experience in the field of program evaluation. She directs EvaluATE and leads a variety of evaluation projects at WMU focused on STEM education, health, and higher education initiatives. Dr. Wingate has led numerous webinars and workshops on evaluation in a variety of contexts, including CDC University and the American Evaluation Association Summer Evaluation Institute. She is an associate member of the graduate faculty at WMU.


Blog: How Can You Make Sure Your Evaluation Meets the Needs of Multiple Stakeholders?*

Posted on October 31, 2019 in Blog

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

We talk a lot about stakeholders in evaluation. These are the folks who are involved in, affected by, or simply interested in the evaluation of your project. But what these stakeholders want or need to know from the evaluation, how much time they have available for it, and their level of interest likely vary considerably. The table below is a generic guide to the types of ATE evaluation stakeholders, what they might need, and how to meet those needs.

ATE Evaluation Stakeholders

Project leaders (PI, co-PIs)
What they might need: Information that will help you improve the project as it unfolds; results you can include in your annual reports to NSF to demonstrate accountability and impact.
Tips for meeting those needs: Communicate your needs clearly to your evaluator, including when you need the information in order to make use of it.

Advisory committees or National Visiting Committees
What they might need: Results from the evaluation that show whether the project is on track for meeting its goals and whether changes in direction or operations are warranted; summary information about the project’s strengths and weaknesses.
Tips for meeting those needs: Many advisory committee members donate their time, so they probably aren’t interested in reading lengthy reports. Provide a brief memo and/or short presentation with key findings at meetings, and invite questions about the evaluation. Be forthcoming about strengths and weaknesses.

Participants who provide data for the evaluation
What they might need: Access to reports in which their information was used; summaries of what actions were taken based on the information they provided.
Tips for meeting those needs: The most important thing for this group is to demonstrate use of the information they provided. You can share reports, but a personal message from project leaders along the lines of “we heard you and here is what we’re doing in response” is most valuable.

NSF program officers
What they might need: Evidence that the project is on track to meet its goals; evidence of impact (not just what was done, but what difference the work is making); evidence that the project is using evaluation results to make improvements.
Tips for meeting those needs: Focus on Intellectual Merit (the intrinsic quality of the work and its potential to advance knowledge) and Broader Impacts (the tangible benefits for individuals and progress toward desired societal outcomes). If you’re not sure what your program officer needs from your evaluation, ask for clarification.

College administrators (department chairs, deans, executives, etc.)
What they might need: Results that demonstrate impact on students, faculty, institutional culture, infrastructure, and reputation.
Tips for meeting those needs: Make full reports available upon request, but most busy administrators probably don’t have the time to read technical reports or don’t need the fine-grained data points. Prepare memos or share presentations that focus on the information they’re most interested in.

Partners and collaborators
What they might need: Information that helps them assess the return on the investment of their time or other resources.

In case you didn’t read between the lines, the underlying message here is to provide stakeholders with the information that is most relevant to their particular “stake” in your project. A good way to fall short of their needs is to send everyone the same long, detailed technical report with every data point collected. It’s good to have a full report available for those who request it, but many stakeholders simply won’t have the time or level of interest needed to consume that quantity of evaluative information about your project.

Most importantly, don’t take our word about what your stakeholders might need: Ask them!

Not sure what stakeholders to involve in your evaluation or how? Check out our worksheet Identifying Stakeholders and Their Roles in an Evaluation at bit.ly/id-stake.

 

*This blog is a reprint of an article from an EvaluATE newsletter published in October 2015.

Checklist: Evaluation Plan for ATE Proposals

Posted on July 19, 2019

Updated July 2019!

This checklist provides information on what should be included in evaluation plans for proposals to the National Science Foundation’s (NSF) Advanced Technological Education (ATE) program. Grant seekers should carefully read the most recent ATE program solicitation for details about the program and proposal submission requirements.

File: Click Here
Type: Checklist
Category: Proposal Development
Author(s): Lori Wingate

Blog: An Evaluative Approach to Proposal Development*

Posted on June 27, 2019 in Blog

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

A student came into my office to ask me a question. Soon after she launched into her query, I stopped her and said I wasn’t the right person to help because she was asking about a statistical method that I wasn’t up-to-date on. She said, “Oh, you’re a qualitative person?” And I answered, “Not really.” She left looking puzzled. The exchange left me pondering the vexing question, “What am I?” (Now imagine these words echoing off my office walls in a spooky voice for a couple of minutes.) After a few uncomfortable moments, I proudly concluded, “I am a critical thinker!”  

Yes, evaluators are trained specialists with an arsenal of tools, strategies, and approaches for data collection, analysis, and reporting. But critical thinking—evaluative thinking—is really what drives good evaluation. In fact, the very definition of critical thinking—“the mental process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and evaluating information to reach an answer or conclusion”2—describes the evaluation process to a T. Applying your critical, evaluative thinking skills in developing your funding proposal will go a long way toward ensuring your submission is competitive.

Make sure all the pieces of your proposal fit together like a snug puzzle. Your proposal needs both a clear statement of the need for your project and a description of the intended outcomes—make sure these match up. If you struggle with the outcome measurement aspect of your evaluation plan, go back to the rationale for your project. If you can observe a need or problem in your context, you should be able to observe the improvements as well.

Be logical. Develop a logic model to portray how your project will translate its resources into outcomes that address a need in your context. Sometimes simply putting things in a graphic format can reveal shortcomings in a project’s logical foundation (like when important outcomes can’t be tracked back to planned activities). The narrative description of your project’s goals, objectives, deliverables, and activities should match the logic model.

Be skeptical. Project planning and logic model development typically happen from an optimistic point of view. (“If we build it, they will come.”) When creating your work plan, step back from time to time and ask yourself and your colleagues, What obstacles might we face? What could really mess things up? Where are the opportunities for failure? And perhaps most important, ask, Is this really the best solution to the need we’re trying to address? Identify your plan’s weaknesses and build in safeguards against those threats. I’m all for an optimistic outlook, but proposal reviewers won’t be wearing rose-colored glasses when they critique your proposal and compare it with others written by smart people with great ideas, just like you. Be your own worst critic and your proposal will be stronger for it.

Evaluative thinking doesn’t replace specialized training in evaluation. But even the best evaluator and most rigorous evaluation plan cannot compensate for a disheveled, poorly crafted project plan. Give your proposal a competitive edge by applying your critical thinking skills and infusing an evaluative perspective throughout your project description.

* This blog is a reprint of an article from an EvaluATE newsletter published in summer 2015.

2 dictionary.com

Report: Final ATE Evaluation Report (2006)

Posted on May 14, 2019 in Report Archive

This report describes the basis from which the ATE program was created and conducted and the evaluation work that has shadowed this program for the past seven years. It traces the program’s work and reach to community colleges and others since the beginning of the ATE program. It analyzes ATE solicitations to show linkages between the program guidelines and program productivity and then describes this evaluation’s design and data collection methods to show why and how evaluative data were collected. The following evaluation findings both describe and judge the program in various respects.

Findings from the evaluation show that the program is healthy and well run. Nearly a fifth of the nation’s two-year colleges have been funded at least once by this program, and those funds have resulted in substantial productivity in funded and collaborating institutions and organizations. Major strengths of this program are evident in its materials development, professional development, and program improvement products. Large numbers of students and teachers have participated in this program—taking courses and graduating or otherwise being certified. Business and industry have collaborated with colleges in developing and conducting these programs with perceived substantial benefits from that involvement.

Multiple strands of evaluative information describe and confirm that the program produces important outcomes of good quality. Though consistently positive, these findings are highly dependent on testimony/feedback as a primary quality assurance mechanism. We believe additional project/center-based direct evidence of program effectiveness and quality would strengthen claims of quality and provide important information for program improvement. Suggestions are made that we believe will improve the ATE program; these suggestions are viewed as small changes designed for incremental improvement.

File: Click Here
Type: Report
Category: ATE Research & Evaluation
Author(s): Arlen Gullickson, Chris Coryn, Frances Lawrenz, Lori Wingate

Webinar: Outcomes Evaluation: Step-by-Step

Posted on March 12, 2019 in Webinars

Presenter(s): Lori Wingate, Mike Lesiecki
Date(s): March 21, 2019
Time: 1:00-2:00 p.m. EASTERN
Recording: https://youtu.be/Sva5JIj5CE4

Bonus webinar! Join EvaluATE for one of our most popular webinars. Register today to save your seat and get ready to learn a lot. This is not an event you want to miss.

Outcome evaluation involves identifying and measuring the changes that occur as a result of project implementation. These changes may occur at the individual, organizational, or community levels and include changes in knowledge, skills, attitudes, behavior, and community/societal conditions. All too often, however, evaluations focus on project activities rather than the meaningful changes a project helped bring about. Webinar participants will learn how to identify appropriate outcomes to assess in an evaluation and how to use those outcomes as a foundation for planning data collection, analysis, and interpretation.

This webinar is being presented in partnership with

 

Resources:
Slides
Handout

Report: 2018 ATE Annual Survey

Posted on February 1, 2019 in Annual Survey

This report summarizes data gathered in the 2018 survey of ATE program grantees. Conducted by EvaluATE — the evaluation support center for the ATE program, located at The Evaluation Center at Western Michigan University — this was the 19th annual ATE survey. Included here are findings about ATE projects and the activities, accomplishments, and impacts of the projects during the 2017 calendar year (2017 fiscal year for budget-related questions).

File: Click Here
Type: Report
Category: ATE Annual Survey
Author(s): Lori Wingate, Lyssa Becho

Webinar: Basic Principles of Survey Question Development

Posted on January 30, 2019 in Webinars

Presenter(s): Lori Wingate, Lyssa Wilson Becho, Mike Lesiecki
Date(s): February 20, 2019
Time: 1:00-2:00 p.m. EASTERN
Recording: https://youtu.be/64nXDeRm-9c

Surveys are a valuable source of evaluation data. Obtaining quality data relies heavily on well-crafted survey items that align with the overall purpose of the evaluation. In this webinar, participants will learn fundamental principles of survey question construction to enhance the validity and utility of survey data. We will discuss the importance of considering data analysis during survey construction and ways to test your survey questions. Participants will receive an overview of survey do’s and don’ts to help apply fundamental principles of survey question development in their own work.

Resources:
Slides
Handout

Webinar: Three Common Evaluation Fails and How to Prevent Them

Posted on December 4, 2018 in Webinars

Presenter(s): Kirk Knestis, Lori Wingate, Mike Lesiecki
Date(s): January 30, 2019
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/u1u2DssdLHc

In this webinar, experienced STEM education evaluator Kirk Knestis will share strategies for effectively communicating with evaluation clients to avoid three common “evaluation fails”: (1) project implementation delays; (2) evaluation scope creep (clients wanting something more or different from what was originally planned); and (3) substantial changes in the project over the course of the evaluation. These issues are typical causes for an evaluation to be derailed and fail to produce useful and valid results. Webinar participants will learn how clear documentation—specifically, an evaluation contract (legal commitment to the work), scope of work (detailed description of evaluation services and deliverables), and study protocol (technical details concerning data collection and analysis)—can make potentially difficult conversations go better for all involved, averting potential evaluation crises and failures. Getting these documents right and using them in project communications helps ensure a smoothly operating evaluation, a happy client, and a profitable project for the evaluator.

For a sneak peek of some of what Kirk will address in this webinar, see his blogpost, https://www.evalu-ate.org/blog/knestis-apr18/.

Resources:
Study Protocol Template
Evaluation Scope Template
Slides

Blog: The Life-Changing Magic of a Tidy Evaluation Plan

Posted on August 16, 2018 in Blog

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

“Effective tidying involves only three essential actions. All you need to do is take the time to examine every item you own, decide whether or not you want to keep it, then choose where to put what you keep. Designate a place for each thing.”

―Marie Kondo, The Life-Changing Magic of Tidying Up

I’ve noticed a common problem with some proposal evaluation plans: It’s not so much that they don’t include key information; it’s that they lack order. They’re messy. When you have only about two pages of a 15-page National Science Foundation proposal to describe an evaluation, you need to be exceptionally clear and efficient. In this blog, I offer tips on how to “tidy up” your proposal’s evaluation plan to ensure it communicates key information clearly and coherently.

First of all, what does a messy evaluation plan look like? It meanders. It frames the evaluation’s focus in different ways in different places in the proposal, or even within the evaluation section itself, leaving the reviewer confused about the evaluation’s purpose. It discusses data and data collection without indicating what those data will be used to address. It employs different terms to mean the same thing in different places. It makes it hard for reviewers to discern key information from the evaluation plan and understand how that information fits together.

Three Steps to Tidy up a Messy Evaluation Plan

It’s actually pretty easy to convert a messy evaluation plan into a tidy one:

  • State the evaluation’s focus succinctly. List three to seven evaluation questions that the evaluation will address. These questions should encompass all of your planned data collection and analysis—no more, no less. Refer to these questions as needed later in the plan; do not restate them differently, introduce new topics, or express the evaluation’s focus in different ways in different places.
  • Link the data you plan to collect to the evaluation questions. An efficient way to do this is to present the information in a table. I like to include evaluation questions, indicators, data collection methods and sources, analysis, and interpretation in a single table to clearly show the linkages and convey that my team has carefully thought about how we will answer the evaluation questions. Bonus: Presenting information in a table saves space and makes it easy for reviewers to locate key information. (See EvaluATE’s Evaluation Data Matrix Template.)
  • Use straightforward language—consistently. Don’t assume that reviewers will share your definition of evaluation-related terms. Choose your terms carefully and do not vary how you use them throughout the proposal. For example, if you are using the terms measures, metrics, and indicators, ask yourself if you are really referring to different things. If not, stick with one term and use it consistently. If similar words are actually intended to mean different things, include brief definitions to avoid any confusion about your meaning.

Can a Tidy Evaluation Plan Really Change Your Life?

If it moves a very good proposal toward excellent, then yes! In the competitive world of grant funding, every incremental improvement counts and heightens your chances for funding, which can mean life-changing opportunities for the project leaders, evaluators, and—most importantly—individuals who will be served by the project.

Worksheet: Evaluation Data Matrix Template

Posted on August 16, 2018

An evaluation plan should include a clear description of what data will be collected, from what sources and how, by whom, and when, as well as how the data will be analyzed. Placing this information in a matrix helps ensure that there is a viable plan for collecting all the data necessary to answer each evaluation question and that all collected data will serve a specific, intended purpose. The table below may be copied into another document, such as a grant proposal, and edited/expanded as needed. An example is provided on the next page.

File: Click Here
Type: Doc
Category: Data…, Evaluation Design
Author(s): Lori Wingate