Kelly Robertson

Senior Research Associate, The Evaluation Center at Western Michigan University

Kelly has a Ph.D. in evaluation and more than eight years of experience in the field. She works as a Senior Research Associate at The Evaluation Center at Western Michigan University. Dr. Robertson has worked on evaluations at the local, regional, national, and international levels, spanning a wide variety of sectors (e.g., STEM education, adult education, career and technical education, and evaluation capacity development). Her research interests primarily focus on evaluation as it relates to equity, cultural competence, and making evaluation more user-friendly.


Blog: Evaluation Plan Cheat Sheets: Using Evaluation Plan Summaries to Assist with Project Management

Posted on October 10, 2018 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

We are Kelly Robertson and Lyssa Wilson Becho, and we work on EvaluATE as well as several other projects at The Evaluation Center at Western Michigan University. We wanted to share a trick that has helped us keep track of our evaluation activities and better communicate the details of an evaluation plan to our clients. To do this, we take the most important information from an evaluation plan and create a summary that can serve as a quick-reference guide for the evaluation management process. We call these “evaluation plan cheat sheets.”

The content of each cheat sheet is determined by the information needs of the evaluation team and clients. Cheat sheets can serve the needs of the evaluation team (for example, providing quick reminders of delivery dates) or of the client (for example, giving a reminder of when data collection activities occur). The items we like to include on our cheat sheets are shown in Figures 1-3 and described below:

  • A summary of deliverables noting which evaluation questions each deliverable will answer. In the table at the top of Figure 1, we indicate which report will answer which evaluation question. Letting our clients know which questions are addressed in each deliverable helps to set their expectations for reporting. This is particularly useful for evaluations that require multiple types of deliverables.
  • A timeline of key data collection activities and report draft due dates. On the bottom of Figure 1, we visualize a timeline with simple icons and labels. This allows the user to easily scan the entirety of the evaluation plan. We recommend including important dates for deliverables and data collection. This helps both the evaluation team and the client stay on schedule.
  • A data collection matrix. This is especially useful for evaluations with a lot of data collection sources. The example shown in Figure 2 identifies who implements the instrument, when the instrument will be implemented, the purpose of the instrument, and the data source. It is helpful to identify who is responsible for data collection activities in the cheat sheet, so nothing gets missed. If the client is responsible for collecting much of the data in the evaluation plan, we include a visual breakdown of when data should be collected (shown at the bottom of Figure 2).
  • A progress table for evaluation deliverables. Despite the availability of project management software with fancy Gantt charts, sometimes we like to go back to basics. We reference a simple table, like the one in Figure 3, during our evaluation team meetings to provide an overview of the evaluation’s status and avoid getting bogged down in the details.

Importantly, we include client and evaluator contact information in the cheat sheet for quick reference (see Figure 1). We also find it useful to include a page footer with a “modified on” date that automatically updates when the document is saved. That way, if we need to update the plan, we can be sure we are working on the most recent version.

 

Figure 1. Cheat Sheet Example Page 1.

Figure 2. Cheat Sheet Example Page 2.

Figure 3. Cheat Sheet Example Page 3.

 

Vlog: Checklist for Program Evaluation Report Content

Posted on December 6, 2017 in Blog

Senior Research Associate, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

This video provides an overview of EvaluATE’s Checklist for Program Evaluation Report Content and three reasons why this checklist is useful to evaluators and clients.

Blog: Declutter Your Reports: The Checklist for Straightforward Evaluation Reports

Posted on February 1, 2017 in Blog

Senior Research Associate, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Evaluation reports have a reputation for being long, overly complicated, and impractical. The recent buzz about fresh starts and tidying up for the new year got me thinking about the similarities between these infamous evaluation reports and the disastrously cluttered homes featured on reality makeover shows. The towering piles of stuff overflowing from these homes remind me of the technical language and details that clutter up so many evaluation reports. Informational clutter, like physical clutter, can turn reports into difficult-to-navigate obstacle courses and render their contents virtually unusable. If you are looking for ideas on how to organize and declutter your reports, check out the Checklist for Straightforward Evaluation Reports that Lori Wingate and I developed. The checklist provides guidance on how to produce comprehensive evaluation reports that are concise, easy to understand, and easy to navigate. Main features of the checklist include:

  • Quick reference sheet: A one-page summary of content to include in an evaluation report and tips for presenting content in a straightforward manner.
  • Detailed checklist: A list and description of possible content to include in each report section.
  • Straightforward reporting tips: General and section-specific suggestions on how to present content in a straightforward manner.
  • Recommended resources: List of resources that expand on information presented in the checklist.

Evaluators, evaluation clients, or other stakeholders can use the checklist to set reporting expectations, such as what content to include and how to present information.

Straightforward Reporting Tips

Here are some tips, inspired by the checklist, on how to tidy up your reports:

  • Use short sentences: Each sentence should communicate one idea. Sentences should contain no more than 25 words. Downsize your words to only the essentials, just like you might downsize your closet.
  • Use headings: Use concise and descriptive headings and subheadings to clearly label and distinguish report sections. Use report headings, like labels on boxes, to make it easier to locate items in the future.
  • Organize results by evaluation questions: Organize the evaluation results section by evaluation question with separate subheadings for findings and conclusions under each evaluation question. Just like most people don’t put decorations for various holidays in one box, don’t put findings for various evaluation questions in one findings section.
  • Present takeaway messages: Label each figure with a numbered title and a separate takeaway message. Similarly, use callout boxes to grab readers’ attention and highlight takeaway messages. For example, use a callout in the results section to summarize the conclusion in one sentence under the evaluation question.
  • Minimize report body length: Reduce page length as much as possible without compromising quality. One way to do this is to place details that enhance understanding—but are not critical for basic understanding—in the appendices. Only information that is critical for readers’ understanding of the evaluation process and results should be included in the report body. Think of the appendices like a storage area such as a basement, attic, or shed where you keep items you need but don’t use all the time.

If you’d like to provide feedback, you can write your comments in an email or return a review form to info@evalu-ate.org. We are especially interested in getting feedback from individuals who have used the checklist as they develop evaluation reports.

Checklist: Program Evaluation Report Content

Posted on December 13, 2016

This checklist identifies and describes the elements of an evaluation report. It is intended to serve as a flexible guide for determining an evaluation report’s content. It should not be treated as a rigid set of requirements. An evaluation client’s or sponsor’s reporting requirements should take precedence over the checklist’s recommendations. This checklist is strictly focused on the content of long-form technical evaluation reports.

File: Click Here
Type: Checklist
Category: Resources
Author(s): Kelly Robertson, Lori Wingate

Webinar: Anatomy of a User-Friendly Evaluation Report

Posted on October 17, 2016 in Webinars

Presenter(s): Kelly Robertson, Lori Wingate
Date(s): December 14, 2016
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/61oKkQ4EGfg

While long-form evaluation reports are the main way information about evaluations is communicated, they are largely underutilized. Barriers to report use most often relate to user-friendliness: how easy a report is to navigate and understand, and whether it provides information that readers can use. In this webinar, participants will see first-hand what user-friendly evaluation reports look like and learn strategies for creating readable, meaningful, and useful reports. Participants will also be introduced to the Checklist for User-Friendly Evaluation Reports. Evaluators can use this checklist as a guide to developing reports, and project leaders can use it to communicate their needs and preferences for reports to their evaluators.

Resources:
Slides
Checklist for Straightforward Evaluation Reports
Example of an Infographic-Style Evaluation Report

2016 High Impact Technology Exchange Conference (HI-TEC)

Posted on July 15, 2016 in Conferences

2016 High Impact Technology Exchange Conference (HI-TEC)
Pittsburgh, PA
July 25-28, 2016

Workshop

Logic Models: The Swiss Army Knife of Project Planning and Evaluation
Kelly Robertson
Lyssa Wilson

July 27, 2016 | 3:45-4:30 p.m.

A logic model is a graphic depiction of how a project translates its resources and activities into outcomes. Logic models are useful tools for succinctly communicating a project’s goals and activities, but they have many other applications. They provide a foundation for a project evaluation plan (and subsequent reporting) and can be used to organize the content of a grant proposal. In this session, participants will learn the basics of how to create a logic model, and we will demonstrate its use for planning a project evaluation and organizing a grant proposal. Participants will receive the Evaluation Planning Checklist for ATE Proposals and ATE Project Logic Model Template.


For more information about the conference, and for conference registration, please visit http://www.highimpact-tec.org/

Resources:
Slides
Handout

Newsletter: Bridging the Gap: Using Action Plans to Facilitate Evaluation Use

Posted on April 1, 2016 in Newsletter

Senior Research Associate, The Evaluation Center at Western Michigan University

NSF requires evaluation, in part, because it is an essential tool for project improvement. Yet all too often, evaluation results are not used to inform project decision making. There is a gap in the project improvement cycle between dissemination of evaluation results and decision making. One way to bridge this gap is through the use of an action plan for project improvement informed by evaluation findings, conclusions, and recommendations. The United Nations Development Programme’s (UNDP) “Management Response Template” (http://bit.ly/undp-mrt) provides a format for such an action plan. The template was created to encourage greater use of evaluation by projects. UNDP’s template is designed for use in international development contexts, but it could be used for any type of project, including ATE centers and projects.

The template is organized around evaluation recommendations or issues. Any important issue that emerged from an evaluation would be an appropriate focus for action planning. The form allows for documentation of evaluation-based decisions and tracking of the implementation of those decisions. Including a time frame for each key action, the person responsible, and the current status encourages structure and accountability around the use of evaluation results and project improvement.
