Lori Wingate

Director of Research, The Evaluation Center at Western Michigan University

Lori has a Ph.D. in evaluation and more than 20 years of experience in the field of program evaluation. She directs EvaluATE and leads a variety of evaluation projects at WMU focused on STEM education, health, and higher education initiatives. Dr. Wingate has led numerous webinars and workshops on evaluation in a variety of contexts, including CDC University and the American Evaluation Association Summer Evaluation Institute. She is an associate member of the graduate faculty at WMU.


Template: ATE Proposal Evaluation Plan

Posted on July 13, 2017

This template is for use in preparing the evaluation plan sections of proposals to the National Science Foundation’s Advanced Technological Education (ATE) program. It is based on the ATE Evaluation Planning Checklist, also developed by EvaluATE, and is aligned with the evaluation guidance included in the 2017 ATE Program Solicitation. All proposers should read the solicitation in full.

File: Click Here
Type: Worksheet
Category: Resources
Author(s): Lori Wingate

Resource: Finding and Selecting an Evaluator for Advanced Technological Education (ATE) Proposals

Posted on July 13, 2017

All ATE proposals are required to request “funds to support an evaluator independent of the project.” Ideally, this external evaluator should be identified in the project proposal. The information in this guide is for individuals who are able to select and work with an external evaluator at the proposal stage. However, some institutions prohibit selecting an evaluator on a noncompetitive basis in advance of an award being made. Advice for individuals in that situation is provided in an EvaluATE blog and newsletter article.

This guide includes advice on how to locate and select an external evaluator. It is not intended as a guide for developing an evaluation plan or contracting with an evaluator.

File: Click Here
Type: Doc
Category: Resources
Author(s): Lori Wingate

Template: Evaluator Biographical Sketch

Posted on July 13, 2017

This template was created by EvaluATE. It is based on the National Science Foundation’s guidelines for preparing biographical sketches for senior project personnel. The information about what evaluators should include in the Products and Synergistic Activities sections reflects EvaluATE’s suggestions, not NSF requirements. The biosketch must not exceed two pages.

File: Click Here
Type: Worksheet
Category: Resources
Author(s): Lori Wingate

Webinar: Evaluation: All the Funded ATE Proposals Are Doing It

Posted on July 10, 2017 in Webinars

Presenter(s): Lori Wingate
Date(s): August 16, 2017
Time: 1:00-2:00 p.m. Eastern

Give your proposal a competitive edge with a strong evaluation plan. The National Science Foundation has issued a new solicitation for its Advanced Technological Education (ATE) program, which includes major changes to the guidelines for ATE evaluation plans. Attend this webinar to learn the key elements of a winning evaluation plan and strategies for demonstrating to reviewers that evaluation is an integral part of your project, not an afterthought. In addition, we’ll provide specific guidance on how to budget for an evaluation, locate a qualified evaluator, and describe results from prior NSF support with supporting evaluative evidence. You will receive an updated Evaluation Planning Checklist for ATE Proposals and other tools to help prepare strong evaluation plans.

Register

Pilot Testing: Checklist for Program Evaluation Report Content

Posted on June 15, 2017

EvaluATE invites individuals who are currently writing evaluation reports to pilot test the Checklist for Program Evaluation Report Content and provide feedback for improvement by August 15, 2017. 

A form for providing feedback is available at bit.ly/rep-check-pilot.

After a few questions about the context of your work, the form will prompt you to answer four open-ended questions about your experience with the checklist:

  1. What was especially helpful about this checklist?
  2. What frustrated or confused you?
  3. What would you add, change, or remove?
  4. If using this checklist affected how you wrote your report or what you included in it, please describe how it influenced you.

About the Checklist: This checklist identifies and describes the elements of an evaluation report. It should not be treated as a rigid set of requirements. Rather, it should be used as a flexible guide for determining an evaluation report’s content. Each checklist element is a prompt for decision making. Decisions about what content is appropriate for a particular evaluation context should be made with consideration of the report audience’s information needs and the resources available for report development.

Blog: What Goes Where? Reporting Evaluation Results to NSF

Posted on April 26, 2017 in Blog

Lori Wingate, Director of Research, The Evaluation Center at Western Michigan University

Creative Commons License This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

In this blog, I provide advice for Advanced Technological Education (ATE) principal investigators (PIs) on how to include information from their project evaluations in their annual reports to the National Science Foundation (NSF).

Annual reports for NSF grants are due within the 90 days preceding the award’s anniversary date. That means if your project’s initial award date was September 1, your annual reports will be due between June and August each year until the final year of the grant (at which point an outcomes report is due within 90 days after the award ends).
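To make the timing concrete, here is a minimal sketch in Python of that reporting window, using the hypothetical September 1 award date from above. It illustrates the date arithmetic only; it is not an official NSF calculation.

  # Minimal sketch: annual report windows for a hypothetical award made
  # September 1, 2017 (illustration only, not an official NSF tool).
  from datetime import date, timedelta

  award_date = date(2017, 9, 1)
  for year in range(2018, 2021):  # anniversaries during the grant
      anniversary = award_date.replace(year=year)
      window_opens = anniversary - timedelta(days=90)
      print(f"Annual report window: {window_opens} to {anniversary}")

Each window opens on June 3, consistent with reports coming due in the June-to-August stretch each year.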

When you prepare your first annual report for NSF at Research.gov, you may be surprised to see there is no specific request for results from your project’s evaluation or a prompt to upload your evaluation report. That’s because Research.gov is the online reporting system used by all NSF grantees, whether they are researching fish populations in Wisconsin lakes or developing technician education programs.  So what do you do with the evaluation report your external evaluator prepared or all the great information in it?

1. Report evidence from your evaluation in the relevant sections of your annual report.

The Research.gov system for annual reports includes seven sections: Cover, Accomplishments, Products, Participants, Impact, Changes/Problems, and Special Requirements. Findings and conclusions from your evaluation should be reported in the Accomplishments and Impact sections, as described in the table below. Sometimes evaluation findings will point to a need for changes in project implementation or even its goals. In this case, pertinent evidence should be reported in the Changes/Problems section of the annual report. Highlight the most important evaluation findings and conclusions in these report sections. Refer to the full evaluation report for additional details (see Point 2 below).

What to report from your evaluation, by NSF annual report section:
Accomplishments
  • Number of participants in various activities
  • Data related to participant engagement and satisfaction
  • Data related to the development and dissemination of products (Note: The Products section of the annual report is simply for listing products, not reporting evaluative information about them.)
Impacts
  • Evidence of the nature and magnitude of changes brought about by project activities, such as changes in individual knowledge, skills, attitudes, or behaviors or larger institutional, community, or workforce conditions
  • Evidence of increased participation by members of groups historically underrepresented in STEM
  • Evidence of the project’s contributions to the development of infrastructure that supports STEM education and research, including physical resources, such as labs and instruments; institutional policies; and enhanced access to scientific information
Changes/Problems
  • Evidence of shortcomings or opportunities that point to a need for substantial changes in the project

Do you have a logic model that delineates your project’s activities, outputs, and outcomes? Is your evaluation report organized around the elements in your logic model? If so, a straightforward rule of thumb is to follow that logic model structure and report evidence related to your project activities and outputs in the Accomplishments section and evidence related to your project outcomes in the Impacts section of your NSF annual report.

2. Upload your evaluation report.

Include your project’s most recent evaluation report as a supporting file in the Accomplishments or Impact section of Research.gov. If the report is longer than about 25 pages, make sure it includes a 1-3 page executive summary that highlights key results. Your NSF program officer is very interested in your evaluation results, but probably doesn’t have time to carefully read lengthy reports from all the projects he or she oversees.

Video: Introduction to Evaluation for Mentor-Connect Cohort 2017

Posted on April 25, 2017 in Videos

This video was created for the 2017 Mentor-Connect cohort but may be useful to others interested in learning about ATE evaluation. Specifically, the video addresses four questions: What is project evaluation? Why does NSF require evaluation? How do you plan for evaluation? And how can EvaluATE help?

File: Click Here
Type: Video
Category: Videos
Author(s): Lori Wingate

Blog: Scavenging Evaluation Data

Posted on January 17, 2017 in Blog

Lori Wingate, Director of Research, The Evaluation Center at Western Michigan University

Creative Commons License This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

But little Mouse, you are not alone,
In proving foresight may be vain:
The best laid schemes of mice and men
Go often askew,
And leave us nothing but grief and pain,
For promised joy!

From To a Mouse, by Robert Burns (1785), modern English version

Research and evaluation textbooks are filled with elegant designs for studies that will illuminate our understanding of social phenomena and programs. But as any evaluator will tell you, the real world is fraught with all manner of hazards and imperfect conditions that wreak havoc on design, bringing grief and pain, rather than the promised joy of a well-executed evaluation.

Probably the biggest hindrance to executing planned designs is that evaluation is just not the most important thing to most people. (GASP!) They are reluctant to give two minutes for a short survey, let alone an hour for a focus group. Your email imploring them to participate in your data collection effort is one of hundreds of requests for their time and attention that they are bombarded with daily.

So, do all the things the textbooks tell you to do. Take the time to develop a sound evaluation design and do your best to follow it. Establish expectations early with project participants and other stakeholders about the importance of their cooperation. Use known best practices to enhance participation and response rates.

In addition: Be a data scavenger. Here are two ways to get data for an evaluation that do not require hunting down project participants and convincing them to give you information.

1. Document what the project is doing.

I have seen a lot of evaluation reports in which evaluators painstakingly recount a project’s activities as a tedious story rather than a straightforward account. This task typically requires the evaluator to ask many questions of project staff, pore through documents, and track down materials. It is much more efficient for project staff to keep a record of their own activities. For example, see EvaluATE’s resume. It is a no-nonsense record of our funding, activities, dissemination, scholarship, personnel, and contributors. In and of itself, our resume does most of the work of the accountability aspect of our evaluation (i.e., Did we do what we promised?). In addition, the resume can be used to address questions like these:

  • Is the project advancing knowledge, as evidenced by peer-reviewed publications and presentations?
  • Is the project’s productivity adequate in relation to its resources (funding and personnel)?
  • To what extent is the project leveraging the expertise of the ATE community?

2. Track participation.

If your project holds large events, use a sign-in sheet to get attendance numbers. If you hold webinars, you almost certainly have records with information about registrants and attendees. If you hold smaller events, pass around a sign-in sheet asking for basic information like name, institution, email address, and job title (or major if it’s a student group). If the project has developed a course, get enrollment information from the registrar. Most importantly: Don’t put these records in a drawer. Compile them in a spreadsheet and analyze the heck out of them. Here are example data points that we glean from EvaluATE’s participation records (a sketch of how such tallies might be computed follows the list):

  • Number of attendees
  • Number of attendees from various types of organizations (such as two- and four-year colleges, nonprofits, government agencies, and international organizations)
  • Number and percentage of attendees who return for subsequent events
  • Geographic distribution of attendees
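To illustrate the compile-and-analyze step, here is a minimal sketch in Python (using pandas) of how tallies like those above might be computed. The file name and column names (email, org_type, state, event) are hypothetical placeholders, not EvaluATE’s actual records.

  import pandas as pd

  # Hypothetical compiled sign-in/registration records:
  # one row per person per event attended.
  records = pd.read_csv("participation.csv")

  # Number of attendees (unique individuals across all events)
  total = records["email"].nunique()

  # Attendees by organization type (two-year college, nonprofit, etc.)
  by_org_type = records.groupby("org_type")["email"].nunique()

  # Number and percentage of attendees who return for subsequent events
  events_per_person = records.groupby("email")["event"].nunique()
  returning = int((events_per_person > 1).sum())
  pct_returning = 100 * returning / total

  # Geographic distribution of attendees
  by_state = records.groupby("state")["email"].nunique()

  print(f"{total} attendees; {returning} ({pct_returning:.1f}%) returned for another event")

Keeping one row per person per event makes each of these tallies a simple grouped count, which is one more reason to compile sign-in sheets into a single spreadsheet rather than leaving them in a drawer.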

Project documentation and participation data will be most helpful for process evaluation and accountability. You will still need cooperation from participants for outcome evaluation—and you should engage them early to garner their interest and support for evaluation efforts. Still, you may be surprised by how much valuable information you can get from these two sources—documentation of activities and participation records—with minimal effort.

Get creative about other data you can scavenge, such as institutional data that colleges already collect; website data, such as Google Analytics; and citation analytics for published articles.

Checklist: Program Evaluation Report Content

Posted on December 13, 2016

This checklist identifies and describes the elements of an evaluation report. It should not be treated as a rigid set of requirements. Rather, it should be used as a flexible guide for determining an evaluation report’s content. Each checklist element is a prompt for decision making. Decisions about what content is appropriate for a particular evaluation context should be made with consideration of the report audience’s information needs and the resources available for report development.

File: Click Here
Type: Checklist
Category: Resources
Author(s): Kelly Robertson, Lori Wingate

Webinar: Outcome Evaluation: Step-by-Step

Posted on December 12, 2016 in Webinars

Presenter(s): Lori Wingate, Miranda Lee
Date(s): March 22, 2017
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/KGadLB--WZM

Outcome evaluation involves identifying and measuring the changes that occur as a result of project implementation. These changes may occur at the individual, organizational, or community levels and include changes in knowledge, skills, attitudes, behavior, and community/societal conditions. All too often, however, evaluations focus on project activities, rather than meaningful outcomes. Webinar participants will learn how to identify appropriate outcomes to assess in an evaluation and how to use those intended outcomes as a foundation for planning or enhancing data collection, analysis, and interpretation.

Resources:
Slides
Handout