Archive: evaluation report

Blog: The 1:3:25 Format for More Reader-Friendly Evaluation Reports

Posted on September 17, 2019 in Blog

Senior Research Associate, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m part of the EvaluATE team. I also lead evaluations as part of my work at Western Michigan University’s Evaluation Center, so I have written my fair share of evaluation reports over the years. I wanted to share a resource I’ve found to be game-changing for report writing: the Canadian Health Services Research Foundation’s 1:3:25 reader-friendly report format. Even though I don’t follow the format exactly, what I’ve taken away from the model has significantly improved the quality of my evaluation reports.

The 1:3:25 format for report writing consists of a one-page summary of main messages, a three-page executive summary, and a 25-page report body. Here’s a brief summary of each component:

1-Page Main Messages: The main-messages page should contain an easy-to-scan bulleted list of information people can use to make decisions based on what was learned from the evaluation. This is not a summary of findings, but rather a compilation of key conclusions and recommendations that have implications for decision making. Think of the main-messages page as the go-to piece of the report for answering questions about what’s next.

3-Page Executive Summary: The purpose of the three-page executive summary is to provide an overview of the evaluation and help busy readers decide if your report will be useful to them. The executive summary should read more like a news article than an academic abstract. Information readers find most interesting should go first (i.e., conclusions and findings) and the less interesting information should go at the end (i.e., methods and background).

25-Page Report Body: The 25-page report body should contain information on the background of the project and its evaluation, along with the evaluation methods, findings, conclusions, and recommendations. The order in which these sections are presented should correspond with the audience’s level of interest and familiarity with the project. Details that are critical for understanding the report belong in the body; supporting information that is not critical, or that doesn’t fit within 25 pages, belongs in the appendices.

What I’ve found to be game-changing is having a specified page count to shoot for. With this information, I’ve gone from knowing my reports needed to be shorter to actually writing shorter reports. While I don’t always keep the report body to 25 pages, the practice of trying to stay as close to 25 pages as possible has helped me shorten my reports. At first, I was worried the shorter length would compromise the quality of the reports. Now, I feel as if I can have the best of both worlds: a report that is both reader friendly and transparent. The difference is that, now, many of the additional details are located in the appendices.

For more details, check out the Canadian Health Services Research Foundation’s guide on the 1:3:25 format.

Keywords: 1:3:25, reporting, evaluation report, evaluation reporting

Do’s and Don’ts: Basic Principles of Data Visualization

Posted on March 26, 2019 in Resources

This quick guide covers 14 do’s and don’ts of data visualization. It is not intended to teach these principles but rather to serve as a reminder.

File: Click Here
Type: Doc
Category: Reporting & Use
Author(s): Emma Perk, Lyssa Wilson Becho

Webinar: Creating One-Page Reports

Posted on March 13, 2018 in Webinars

Presenter(s): Emma Perk, Lyssa Becho
Date(s): April 18, 2018
Time: 1-2 p.m. Eastern
Recording: https://youtu.be/V2TBfz24RpY

One-page evaluation reports are a great way to provide a snapshot of a project’s activities and impact to stakeholders such as advisory groups, college administrators, and NSF program officers. Summarizing key evaluation facts in a format that is easily and quickly digestible engages the busy reader and can make your project stand out.

Although traditional, long-form evaluation reports are still an excellent way to distribute evaluation results, one-page reports can increase engagement with, understanding of, and use of those results, both for the current grant and for leveraging findings in potential follow-up grants.

In this webinar, we will provide you with the tools and resources you need to create effective one-page reports and share some examples that have worked well in our practice.

One-Page Report Resources

10 steps to creating one-page reports
One-page report worksheet
Slides
South Seattle One-Page Report

Blog: What Goes Where? Reporting Evaluation Results to NSF

Posted on April 26, 2017 in Blog

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

In this blog, I provide advice for Advanced Technological Education (ATE) principal investigators (PIs) on how to include information from their project evaluations in their annual reports to the National Science Foundation (NSF).

Annual reports for NSF grants are due within the 90 days before the award’s anniversary date. That means if your project’s initial award date was September 1, your annual reports will be due between June and August each year until the final year of the grant (at which point a project outcomes report is due within 90 days after the grant expires).
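
To see the arithmetic, here is a minimal sketch using Python’s standard library. It is illustrative only (the function name and the fixed 90-day window are my assumptions based on the rule above); always confirm your actual due date in Research.gov.

    from datetime import date, timedelta

    def annual_report_window(award_date, years_in):
        """Return (window opens, report due) for a given anniversary year,
        assuming reports are accepted in the 90 days before that date."""
        anniversary = award_date.replace(year=award_date.year + years_in)
        return anniversary - timedelta(days=90), anniversary

    # Example: an award dated September 1, 2016; first annual report
    opens, due = annual_report_window(date(2016, 9, 1), years_in=1)
    print(opens, "to", due)  # 2017-06-03 to 2017-09-01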

When you prepare your first annual report for NSF at Research.gov, you may be surprised to see there is no specific request for results from your project’s evaluation or a prompt to upload your evaluation report. That’s because Research.gov is the online reporting system used by all NSF grantees, whether they are researching fish populations in Wisconsin lakes or developing technician education programs.  So what do you do with the evaluation report your external evaluator prepared or all the great information in it?

1. Report evidence from your evaluation in the relevant sections of your annual report.

The Research.gov system for annual reports includes seven sections: Cover, Accomplishments, Products, Participants, Impact, Changes/Problems, and Special Requirements. Findings and conclusions from your evaluation should be reported in the Accomplishments and Impact sections, as described in the table below. Sometimes evaluation findings will point to a need for changes in project implementation or even its goals. In this case, pertinent evidence should be reported in the Changes/Problems section of the annual report. Highlight the most important evaluation findings and conclusions in these report sections. Refer to the full evaluation report for additional details (see Point 2 below).

What to report from your evaluation, by NSF annual report section:

Accomplishments
  • Number of participants in various activities
  • Data related to participant engagement and satisfaction
  • Data related to the development and dissemination of products (Note: The Products section of the annual report is simply for listing products, not reporting evaluative information about them.)

Impact
  • Evidence of the nature and magnitude of changes brought about by project activities, such as changes in individual knowledge, skills, attitudes, or behaviors, or in larger institutional, community, or workforce conditions
  • Evidence of increased participation by members of groups historically underrepresented in STEM
  • Evidence of the project’s contributions to the development of infrastructure that supports STEM education and research, including physical resources, such as labs and instruments; institutional policies; and enhanced access to scientific information

Changes/Problems
  • Evidence of shortcomings or opportunities that point to a need for substantial changes in the project

Do you have a logic model that delineates your project’s activities, outputs, and outcomes? Is your evaluation report organized around the elements in your logic model? If so, a straightforward rule of thumb is to follow that logic model structure and report evidence related to your project activities and outputs in the Accomplishments section and evidence related to your project outcomes in the Impact section of your NSF annual report.
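
In other words, the rule of thumb is a simple mapping. Here is a toy sketch in Python (the dictionary and its names are mine, for illustration; the section names follow the list above):

    # Rule of thumb: where evidence about each logic model element belongs
    # in an NSF annual report.
    LOGIC_MODEL_TO_SECTION = {
        "activities": "Accomplishments",
        "outputs": "Accomplishments",
        "outcomes": "Impact",
    }

    print(LOGIC_MODEL_TO_SECTION["outcomes"])  # Impact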

2. Upload your evaluation report.

Include your project’s most recent evaluation report as a supporting file in the Accomplishments or Impact section of Research.gov. If the report is longer than about 25 pages, make sure it includes a 1- to 3-page executive summary that highlights key results. Your NSF program officer is very interested in your evaluation results, but probably doesn’t have time to carefully read lengthy reports from all the projects he or she oversees.

Report: Science Festival Alliance Infographic

Posted on December 16, 2016 in Resources

This report is the culmination of a three-year National Science Foundation project. Goodman Research Group was asked by its client to create a public-facing report, in contrast to a report intended for more internal use.

The authors shared this document with EvaluATE as an example of a user-friendly evaluation report. They made a concerted effort to remove charts and graphics from the report while effectively summarizing a large mass of qualitative interview data in the form of a visual narrative. The information is organized into three sections according to the evaluation goals/questions to help frame the report for the reader. The report uses the client’s colors so it could be easily incorporated into and viewed on the client’s website, sciencefestivals.org.


File: Click Here
Type: Report
Category: Reporting & Use
Author(s): Elizabeth Bachrach, Grace Bachman

Checklist: Program Evaluation Report Content

Posted on December 13, 2016 in Resources

This checklist identifies and describes the elements of an evaluation report. It is intended to serve as a flexible guide for determining an evaluation report’s content. It should not be treated as a rigid set of requirements. An evaluation client’s or sponsor’s reporting requirements should take precedence over the checklist’s recommendations. This checklist is strictly focused on the content of long-form technical evaluation reports.

File: Click Here
Type: Checklist
Category: Resources
Author(s): Kelly Robertson, Lori Wingate

Blog: Tips for Evaluation Recommendations

Posted on June 3, 2015 in Blog

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

This week I am in Atlanta at the American Evaluation Association (AEA) Summer Evaluation Institute, presenting a workshop on Translating Evaluation Findings into Actionable Recommendations. Although the art of crafting practical, evidence-based recommendations is not covered in depth in either evaluation textbooks or academic courses, most evaluators (86%, according to Fleischer and Christie’s survey of AEA members) believe that making recommendations is part of an evaluator’s job. By reading as much as I can on this topic[1] and reflecting on my own practice, I have assembled 14 tips for how to develop, present, and follow up on evaluation recommendations:

DEVELOP

  1. Determine the nature of recommendations needed or expected.  At the design stage, ask stakeholders: What do you hope to learn from the evaluation? What decisions will be influenced by the results? Should the evaluation include recommendations?
  2. Generate possible recommendations throughout the evaluation. Keep a log of ideas as you collect data and observe the program. I like Roberts-Gray, Buller, and Sparkman’s (1987) evaluation question-driven framework.
  3. Base recommendations on evaluation findings and other credible sources. Findings are important, but they’re often not sufficient for formulating recommendations.  Look to other credible sources, such as program goals, stakeholders/program participants, published research, experts, and the program’s logic model.
  4. Engage stakeholders in developing and/or reviewing recommendations prior to their finalization. Clients should not be surprised by anything in an evaluation report, including the recommendations. If you can engage stakeholders directly in developing recommendations, they will feel more ownership. (Read Adrienne Adams’s article about a great process for this).
  5. Focus recommendations on actions within the control of intended users. If the evaluation client doesn’t have control over the policy governing their programs, don’t bother recommending changes at that level.
  6. Provide multiple options for achieving desired results.  Balance consideration of the cost and difficulty of implementing recommendations with the degree of improvement expected; if possible, offer alternatives so stakeholders can select what is most feasible and important to do.

PRESENT

  1. Clearly distinguish between findings and recommendations. Evaluation findings reflect what is; recommendations are a prediction about what could be. Developing recommendations requires a separate reasoning process.
  2. Write recommendations in clear, action-oriented language. I often see words like consider, attend to, recognize, and acknowledge in recommendations. Those call the clients’ attention to an issue, but don’t provide guidance as to what to do.
  3. Specify the justification sources for each recommendation. It may not be necessary to include this information in an evaluation report, but be prepared to explain how and why you came up with the recommendations.
  4. Explain the costs, benefits, and challenges associated with implementing recommendations. Provide realistic forecasts of these matters so clients can make informed decisions about whether to implement the recommendations.
  5. Be considerate—exercise political and interpersonal sensitivity. Avoid “red flag” words like fail and lack, don’t blame or embarrass, and be respectful of cultural and organizational values.
  6. Organize recommendations, such as by type, focus, timing, audience, and/or priority. If many recommendations are provided, organize them to help the client digest the information and prioritize their actions.

FOLLOW-UP

  1. Meet with stakeholders to review and discuss recommendations in their final form.  This is an opportunity to make sure they fully understand the recommendations as well as to lay the groundwork for action.
  2. Facilitate decision making and action planning around recommendations. I like the United Nations Development Programme’s “Management Response Template” as an action planning tool.

See also my handy one-pager of these tips for evaluation recommendations.

[1] See especially Hendricks & Papagiannis (1990) and Utilization-Focused Evaluation (4th ed.) by Michael Quinn Patton.

Newsletter: How is an NSF Project Outcomes Report Different from a Final Annual Report?

Posted on April 1, 2015 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

All NSF projects awarded in January 2010 or later are required to submit a project outcomes report within 90 days of the grant’s expiration, along with a final annual report. Beyond length (a project outcomes report runs a few paragraphs, or 200-800 words, while annual reports are typically several pages long), there are three other ways a project outcomes report is distinct from a final annual report.

1. A project outcomes report is solely about outcomes. A final annual report addresses many other topics. Project outcomes reports should describe what a project developed and the changes it brought about with regard to advancing knowledge (intellectual merit) and contributing to desired social outcomes (broader impacts). The focus should be on products and results, not project implementation. Publications are important evidence of intellectual merit, and a list of publications will be generated automatically from the project’s annual reports submitted to Research.gov. Other products generated with grant funds should be listed, such as data sets, software, or educational materials. If these products are available online, links may be provided.[1] An accounting of grant products demonstrates a project’s productivity and intellectual merit. To address the project’s broader impacts, reports should highlight achievements in areas such as increasing participation in STEM by underrepresented minorities, improving teaching and learning, and developing the technical workforce.

2. A project outcomes report provides a “complete picture of the results” of a project.[2] A final annual report covers the last year of the project only. A project outcomes report is not a progress report. It is the final word on what a project achieved and produced. PIs should think carefully about how they want their work to be portrayed to the public for decades to come and craft their reports accordingly. Dr. Joan Strassmann of Washington University provides this cogent advice about crafting outcomes reports:

[A project outcomes report] is where someone … can go to see where NSF is spending its tax dollars. This document is not the plan, not the hopes, but the actual outcomes, so this potential reader can get direct information on what the researcher says she did. It pulls up along with the original funding abstracts, so see to it they coordinate as much as possible. Work hard to be clear, accurate, and compelling. (Read more at bit.ly/blog-POR)

3. A project outcomes report is a public document.[3] A final annual report goes to the project’s NSF program officer only. A big difference between these audiences is that a project’s program officer probably has expertise in the project’s content area and is certainly familiar with the overall aims of the program through which the project was funded. For the benefit of lay readers, project outcomes report authors should use plain language to ensure comprehension by the general public (see plainlanguage.gov). A good check on readability is to have a colleague from outside the project’s content area review the report; the goal is documentation that is complete yet succinct and readily understandable to readers outside the field.
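
As a final mechanical check, the 200-800 word limit is easy to verify before submitting. Here is a minimal Python sketch (the file name is hypothetical, and splitting on whitespace only approximates Research.gov’s own count):

    def word_count(text):
        """Approximate a word count by splitting on whitespace."""
        return len(text.split())

    with open("project_outcomes_report.txt") as f:
        n = word_count(f.read())
    print(f"{n} words; within the 200-800 range: {200 <= n <= 800}")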

[1] ATE grants awarded in 2014 or later are required to archive their materials with ATE Central.

[2] For NSF’s guidelines regarding project outcomes reports, see bit.ly/POR-FAQs.

[3] To access ATE project outcomes reports: (1) go to bit.ly/NSF-POR; (2) enter “ATE” in the keyword box; (3) check the box for “Show Only Awards with Project Outcomes Reports.”

Newsletter: Project Spotlight: MatEdU

Posted on April 1, 2015 in Newsletter

Principal Investigator, MatEdU, Edmonds Community College

Mel Cossette is principal investigator for the National Resource Center for Materials Technology Education at Edmonds Community College. MatEdU’s mission is to advance materials technology education nationally. 

Q: What advice would you give to a PI in their first reporting period?

A: First, check the Reporting section of Research.gov to confirm the annual report due date. Sometimes a first-time PI refers to their award date or start date, but it’s actually the due date listed on this website that is critical. Second, connect with your evaluator and inform him or her of the report due date. This helps with planning and writing and assists with identifying, early on, the information to be shared. This does not mean things cannot change, but it is essential that the evaluator and PI communicate.

Q: How do you use your evaluation results in your annual report to NSF?

A: Typically, we create a rough draft of the annual report, from our perspective, which we share with our evaluator. The evaluator reviews and provides feedback. In the meantime, we continue building our report, paying attention to the different categories within the report, such as accomplishments, significant activities, products developed, etc. During this time, our evaluator develops a draft report that is shared with us. Although the reports have a different focus and are written using different formats, we compare content from the two reports. That helps us be succinct with the data and information the reports require. We find that this collaborative process helps to keep our team focused on the task at hand.

Q: What are some things that make an evaluation report useful (from a PI’s perspective)?

A: Because the information is coming from a semi-external perspective, we get the chance to compare the evaluation report on our activities, successes, areas that may need review, etc., to our activity timeline. This helps to limit scope creep. The recommendations from our evaluator have also enabled us to identify a potential gap in our activities that needs to be addressed. PIs are usually completely focused on their projects and annual reports, so having an external evaluator point out successes, gaps, inconsistencies, and data points reinforces progress and project direction.