We EvaluATE - Reporting

Blog: Repackaging Evaluation Reports for Maximum Impact

Posted on March 20, 2019 in Blog
Emma Perk, Managing Director, EvaluATE
Lyssa Wilson Becho, Research Manager, EvaluATE

Evaluation reports take a lot of time to produce and are packed full of valuable information. To get the most out of your reports, think about “repackaging” your traditional report into smaller pieces.

Repackaging involves breaking up a long-form evaluation report into digestible pieces to target different audiences and their specific information needs. The goals of repackaging are to increase stakeholders’ engagement with evaluation findings, increase their understanding, and expand their use.

Let’s think about how we communicate data to various readers. Bill Shander from Beehive Media created the 4×4 Model for Knowledge Content, which illustrates different levels at which data can be communicated. We have adapted this model for use within the evaluation field. As you can see below, there are four levels, and each has a different type of deliverable associated with it. We are going to walk through these four levels and how an evaluation report can be broken up into digestible pieces for targeted audiences.

Figure 1. The four levels of delivering evaluative findings (image adapted from Shander’s 4×4 Model for Knowledge Content).

The first level, the Water Cooler, is for quick, easily digestible data pieces. The idea is to use a single piece of data from your report to intrigue viewers into wanting to learn more. Examples include a headline in a newspaper, a postcard, or a social media post. In a social media post, you should include a graphic (photo or graph), a catchy title, and a link to the next communication level’s document. This information should be succinct and exciting. Use this level to catch the attention of readers who might not otherwise be invested in your project.

Figure 2. Example of a social media post at the Water Cooler level.

The Café level allows you to highlight three to five key pieces of data that you really want to share. A Café level deliverable is great for busy stakeholders who need to know detailed information but don’t have time to read a full report. Examples include one-page reports, a short PowerPoint deck, and short briefs. Make sure to include a link to your full evaluation report to encourage the reader to move on to the next communication level.

Figure 3. One-page report at the Café level.

The Research Library is the level at which we find the traditional evaluation report. Deliverables at this level require the reader to have an interest in the topic and to spend a substantial amount of time to digest the information.

Figure 4. Full evaluation report at the Research Library level.

The Lab is the most intensive and involved level of data communication. Here, readers have a chance to interact with the data. This level goes beyond a static report and allows stakeholders to personalize the data for their interests. For those who have the knowledge and expertise in creating dashboards and interactive data, providing data at the Lab level is a great way to engage with your audience and allow the reader to manipulate the data to their needs.

Figure 5. Data dashboard example from Tableau Public Gallery.

We hope this blog has sparked some interest in the different ways an evaluation report can be repackaged. Different audiences have different information needs and different amounts of time to spend reviewing reports. We encourage both project staff and evaluators to consider who their intended audience is and what would be the best level to communicate their findings. Then use these ideas to create content specific for that audience.

Blog: Evaluation Reporting with Adobe Spark

Posted on March 8, 2019 in Blog

 

Ouen Hunter, Doctoral Student, The Evaluation Center
Emma Perk, Managing Director, EvaluATE
Michael Harnar, Assistant Professor, Interdisciplinary Ph.D. in Evaluation, The Evaluation Center

This blog was originally published on AEA365 on December 28, 2018: https://aea365.org/blog/evaluation-reporting-with-adobe-spark-by-ouen-hunter-and-emma-perk/

Hi! We are Ouen Hunter (student at the Interdisciplinary Ph.D. in Evaluation Program, IDPE), Emma Perk (project manager at The Evaluation Center), and Michael Harnar (assistant professor at the IDPE) from Western Michigan University. Recently, we used PhotoVoice in our evaluation of an Upward Bound program and wanted to share how we reported our PhotoVoice findings using the cost-free version of Adobe Spark.

Adobe Spark offers templates to make webpages, videos, flyers, reports, and more. It also hosts your product online for free. While there is a paid version of Adobe Spark, everything we discuss in this blog can be done using the free version. The software is very straightforward, and we were able to get our report online within an hour. We chose to create a webpage to increase accessibility for a large audience.

The free version of Adobe Spark has a lot of features, but it can be difficult to customize the layout. Therefore, we created our layouts in PowerPoint and then uploaded them to Spark. This enabled us to customize the font, alignment, and illustrations. Follow these instructions to create a similar webpage:

  • Create a slide deck in PowerPoint. Use one slide for each photo and its accompanying text from the participant. The first slide serves as a template for the rest.
  • After creating the slides, you have a few options for saving the photos for upload.
    1. Use a snipping tool (Windows’ snipping tool or Mac’s screenshot function) to take a picture of each slide and save it as a PNG file.
    2. Save each slide as a picture in PowerPoint by selecting the image and the speech bubble, right-clicking, and saving as a picture.
    3. Export as PNG in PowerPoint. Go to File > Export, then select PNG under the File Format drop-down menu. This will save all the slides as individual image files. (If you have many slides, this step can also be scripted; see the sketch after this list.)
  • Create a webpage in Adobe Spark.
    1. Once on the site, you will be prompted to start a new account (unless you’re a returning user). This will allow your projects to be stored and give you access to create in the software.
    2. You have the option to change the theme to match your program or branding by selecting the Theme button.
    3. Once you have selected your theme, you are ready to add a title and upload the photos you created in PowerPoint. To upload the photos, press the plus icon.
    4. Then select Photo.
    5. Select Upload Photo. Add all photos and confirm the arrangement.
    6. After finalizing, remember to post the page online and click Share to give out the link.
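
If your deck has many photo slides, the PNG export noted in the list above can also be scripted. The sketch below is a minimal, hypothetical example (not part of the original workflow): it assumes Windows, a local PowerPoint installation, and the pywin32 package, and it uses PowerPoint's COM automation interface to save each slide as its own PNG file. The file paths are placeholders.

    # Minimal sketch: batch-export PowerPoint slides as PNG images via COM automation.
    # Assumes Windows, a local PowerPoint installation, and the pywin32 package.
    # File paths are placeholders; adjust them to your own deck.
    import win32com.client

    app = win32com.client.Dispatch("PowerPoint.Application")
    # Arguments: FileName, ReadOnly, Untitled, WithWindow (last False keeps the window hidden)
    pres = app.Presentations.Open(r"C:\photovoice\report_slides.pptx", True, False, False)

    for i, slide in enumerate(pres.Slides, start=1):
        # Saves slide_01.png, slide_02.png, ... alongside the deck
        slide.Export(rf"C:\photovoice\slide_{i:02d}.png", "PNG")

    pres.Close()
    app.Quit()

The manual File > Export route produces the same files; the script is only a convenience for larger decks.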

Though we used Adobe Spark to share our PhotoVoice results, Spark has many other applications. We encourage you to check out Adobe Spark and see how you can use it to share your evaluation results.

Hot Tips and Features:

  • Adobe Spark adjusts automatically for handheld devices.
  • Adobe Spark also handles alignment automatically. No need to use a virtual ruler.
  • There are themes available with the free subscription, making it easy to design the webpage.
  • Select multiple photos during your upload. Adobe Spark will automatically separate each file for you.

*Disclaimer: Adobe Spark didn’t pay us anything for this blog. We wanted to share this amazing find with the evaluation community!

Blog: Creating Interactive Documents

Posted on June 20, 2018 in Blog

Executive Director, Healthy Climate Alliance


In 2016, I was an intern at the Evaluation Office (EVAL) of the International Labour Organization, where the constant question was, “How do we get people to read the reports that we spend so much time and energy on?” I had been looking for a new project that would be useful to my colleagues in EVAL, and a bolt of inspiration hit me: what if I could use the key points and information from one of the dense reports to make an interactive summary report? That project led me to the general concept of interactive documents, which can be used for reports, timelines, logic models, and more.

I recommend building interactive documents in PowerPoint and then exporting them as PDFs. I use Adobe Acrobat Pro to add clickable areas to the PDF that will lead readers to a particular section of the PDF or to a webpage. Interactive documents are not intended to be read from beginning to end. It should be easy for readers to navigate directly from the front page to the content that interests them, and back to the front page.
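
If you do not have Acrobat Pro, clickable areas can also be added programmatically. The sketch below is a minimal, hypothetical example using the open-source pypdf library rather than Acrobat; the file names, page indexes, and rectangle coordinates are placeholder assumptions. It adds one clickable rectangle to the first page (the launch page) that jumps to a later page of the exported PDF.

    # Minimal sketch: add a clickable internal link to a PDF exported from PowerPoint.
    # Assumes the pypdf library (pip install pypdf); names and coordinates are placeholders.
    from pypdf import PdfReader, PdfWriter
    from pypdf.annotations import Link

    reader = PdfReader("interactive_report.pdf")
    writer = PdfWriter()
    for page in reader.pages:
        writer.add_page(page)

    # Clickable area on the launch page (page index 0) that jumps to page index 4.
    # The rect is (x0, y0, x1, y1) in PDF points, measured from the bottom-left corner.
    launch_link = Link(rect=(72, 500, 300, 540), target_page_index=4)
    writer.add_annotation(page_number=0, annotation=launch_link)

    with open("interactive_report_linked.pdf", "wb") as output_file:
        writer.write(output_file)

The back-to-launch-page buttons on content pages follow the same pattern, with the rectangle placed wherever the button sits on each page.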

While building my interactive documents in PowerPoint, I follow Nancy Duarte’s Slidedocs principles to create visual documents that are intended to be read rather than presented. She suggests providing content that is clear and concise, using small chunks of text, and interspersing visuals. I use multiple narrow columns of text, with visuals on each page.

Interactive documents include a “launch page,” which gives a map-like overview of the whole document.

The launch page (see figure) allows readers to absorb the structure and main points of the document and to decide where they want to “zoom in” for more detail. I try to follow the wise advice of Edward Tufte: “Don’t ‘know your audience.’ Know your content and trust your audience.” He argues that we shouldn’t try to distill key points and simplify our data to make it easier for audiences to absorb. Readers will each have their own agendas and priorities, and we should make it as easy as possible for them to access the data that is most useful to them.

The launch page of an interactive document should have links all over it; every item of content on the launch page should lead readers to more detailed information on that topic. Every subsequent page should be extremely focused on one topic. If there is too much content within one topic, you can create another launch page focused on that particular topic (e.g., the “Inputs” section of the logic model).

The content pages should have buttons (i.e., links) that allow readers to navigate back to the main launch page or forward to the following page. If there’s a more detailed document that you’re building from, you may also want to link to that document on every page.

Try it out! Remember to keep your interactive document concise and navigable.

Blog: Summarizing Project Milestones

Posted on March 28, 2018 in Blog

Evaluation Specialist, Thomas P. Miller & Associates


With any initiative, it can be valuable to document and describe the implementation to understand what occurred and what shifts or changes were made to the original design (e.g., fidelity to the model). This understanding helps when replicating, scaling, or seeking future funding for the initiative.

Documentation can be done by the evaluator and be shared with the grantee (as a way to validate an evaluator’s understanding of the project). Alternatively, project staff can document progress and share this with the evaluator as a way to keep the evaluation team up to date (which is especially helpful on small-budget evaluation projects).

The documentation of progress can be extremely detailed or high level (e.g., a snapshot of the initiative’s development). When tracking implementation milestones, consider:

  1. What is the goal of the document?
  2. Who is the audience?
  3. What are the most effective ways to display and group the data?

For example, if you are interested in understanding a snapshot of milestones and modifications of the original project design, you might use a structure like the one below:

Image 1. Snapshot of quarterly milestones and modifications to the original project design.

If you are especially interested in highlighting the effect of delays on project implementation and the cause, you may adjust the visual to include directional arrows and shading:

Image 2. Snapshot with directional arrows and shading to highlight delays and their causes.

In these examples, we organized the snapshot by quarterly progress, but you can group milestones by month or even include a timeline of the events. Similarly, in Image 2 we categorized progress in buckets (e.g., curriculum, staffing) based on key areas of the grant’s goals and activities. These categories should change to align with the unique focus of each initiative. For example, if professional development is a considerable part of the grant, then perhaps placing that into a separate category (instead of combining it with staffing) would be best.

Another important consideration is the target audience. We have used this framework when communicating with project staff and leadership to show, at a high level, what is taking place within the project. This diagramming has also been valuable for sharing knowledge across our evaluation staff members, leading to discussions around fidelity to the model and any shifts or changes that may need to occur within the evaluation design, based on project implementation. Some of your stakeholders, such as project funders, may want more information than just the snapshot. In these cases, you may consider adding additional detail to the snapshot visual, or starting your report with the snapshot and then providing an additional narrative around each bucket and/or time period covered within the visual.

Also, the framework itself can be modified. If, for example, you are more concerned about showing the cause and effect instead of adjustments, you may group everything together as “milestones” instead of having separate categories for “adjustments” and “additional milestones.”

For our evaluation team, this approach has been a helpful way to consolidate, disseminate, and discuss initiative milestones with key stakeholder groups such as initiative staff, evaluators, college leadership, and funders. We hope this will be valuable to you as well.

Blog: Overcoming Writer’s Block – Strategies for Writing Your NSF Annual Report

Posted on February 14, 2018 in Blog

Supervisor, Grant Projects, Columbus State Community College


For many new grantees and seasoned principal investigators, nothing is more daunting than an email from Research.gov titled “Annual Project Report Is NOW DUE.” In this blog post, I will help tackle the challenge of writing the annual project report by highlighting the strategy Columbus State Community College has developed for effectively writing annual reports and discussing why this strategy also serves as a planning and feedback tool.

Columbus State’s strategy for Advanced Technological Education (ATE) annual reporting developed organically, with input and collaboration from faculty, staff, and external evaluators, and is based on three key components:

  • shared knowledge of reporting requirements and format,
  • a structured annual reporting timeline, and
  • best-practice sharing and learning from past experience.

This three-pronged approach was utilized by four ATE projects during 2017 and builds on the old adage that “you only get out what you put in.” The key to annual reporting, which also serves as an important planning and feedback tool, is the adoption of a structured annual reporting timeline. The 10-week timeline outlined below ensures that adequate time is dedicated to writing the annual report. The timeline is designed to be collaborative and spur discussion around key milestones, lessons learned, and next steps for revising and improving project plans.

PREPARE

Week 1: Communicating Reporting Requirements

Weeks 1-3: Planning and Data Collection

  • All team members should actively participate in the planning and data collection phase.
  • Project teams should collect a wide breadth of information related to project achievements and milestones. Common types of information collected include individual progress updates, work samples, project work plans and documentation, survey and evaluation feedback, and project metrics.

Week 4: Group Brainstorming

  • Schedule a 60- to 90-minute meeting that focuses specifically on brainstorming and discussing content for the annual report. Include all project team members and your evaluator.
  • Use the project reports template to guide the conversation.

WRITE

Weeks 5-6: First Draft Writing and Clarification Seeking

  • All information is compiled by the project team and assembled into a first draft.
  • It may be useful to mirror the format of a grant proposal or project narrative during this phase to ensure that all project areas are addressed and considered.
  • The focus of this stage is ensuring that all information is accurately captured and integrated.

REVISE

Week 7: First Draft Review

  • First drafts should be reviewed by the project team and two to three people outside of the project team.
  • Including individuals from both inside and outside of the project team will help ensure that useful content is not omitted and that content is presented in an accessible manner.

Weeks 8-9: Final Revisions

  • Feedback and comments are evaluated and final revisions are made.

Week 10: Annual Report Submission

  • The final version of the annual report, with appendices and the evaluation report, is uploaded and submitted through Research.gov.

For additional information about Columbus State’s writing tips, please view our full white paper.


Blog: One Pagers: Simple and Engaging Reporting

Posted on December 20, 2017 in Blog


Traditional, long-form reports are often used to detail the depth and specifics of an evaluation. However, many readers simply don’t have the time or bandwidth to digest a 30-page report. Distilling the key information into one page can help catch the eye of busy program staff, college administrators, or policy makers.

When we say “one pager,” we mean a single-page document that summarizes evaluation data, findings, or recommendations. It’s generally a stand-alone document that supplements a longer report, dataset, or presentation.

One pagers are a great way to get your client the data they need to guide data-driven decisions. These summaries can work well as companion documents for long reports or as a highlight piece for an interim report. We created a 10-step process to help facilitate the creation of a one pager. Additional materials are available, including detailed slides, grid layouts, videos, and more.

Ten-step process for creating a one pager:

1. Identify the audience

Be specific about who you are talking to and their information priorities. The content and layout of the document should be tailored to meet the needs of this audience.

2. Identify the purpose

Write a purpose statement that identifies why you are creating the one pager. This will help you decide what information to include or to exclude.

3. Prioritize your information

Categorize the information most relevant to your audience. Then rank each category from highest to lowest priority to help inform the layout of the document.

4. Choose a grid

Use a grid to intentionally organize elements visually for readers. Check out our free pre-made grids, which you can use for your own one pagers, and instructions on how to use them in PowerPoint (video).

5. Draft the layout

Print out your grid layout and sketch your design by hand. This will allow you to think creatively without technological barriers and will save you time.

6. Create an intentional visual path

Pay attention to how the reader’s eye moves around the page. Use elements like large numbers, ink density, and icons to guide the reader’s visual path. Keep in mind page symmetry and the need to balance visual density. For more tips, see Canva’s Design Elements and Principles.

7. Create a purposeful hierarchy

Use headings intentionally to help your readers navigate and identify the content.

8. Use white space

The brain subconsciously views content grouped together as a cohesive unit. Add white space to indicate that a new section is starting.

9. Get feedback

Run your designs by a colleague or client to help catch errors, note areas needing clarification, and ensure the document makes sense to others. You will likely need to go through a few rounds of feedback before the document is finalized.

10. Triple-check consistency

Triple-check, and possibly quadruple-check, for consistency of fonts, alignment, size, and colors. Style guides are a useful way to maintain consistency within and across documents. Take a look at EvaluATE’s style guide here.

The demand for one pagers is growing, and now you are equipped with the information you need to succeed in creating one. So, start creating your one pagers now!

Vlog: Checklist for Program Evaluation Report Content

Posted on December 6, 2017 in Blog

Senior Research Associate, The Evaluation Center at Western Michigan University


This video provides an overview of EvaluATE’s Checklist for Program Evaluation Report Content and three reasons why this checklist is useful to evaluators and clients.

Blog: Reporting Anticipated, Questionable, and Unintended Project Outcomes

Posted on August 16, 2017 in Blog

Education Administrator, Independent


Project evaluators are aware that evaluation aims to support learning and improvement. Through a series of planned interactions, event observations, and document reviews, the evaluator is charged with reporting to the project leadership team and ultimately the project’s funding agency, informing audiences of the project’s merit. This is not to suggest that reporting should only aim to identify positive impacts and outcomes of the project. Equally, there is substantive value in informing audiences of unintended and unattained project outcomes.

Evaluation reporting should discuss all aspects of the project’s outcomes, whether anticipated, questionable, or unintended. When examining project outcomes, the evaluator analyzes the information obtained and guides project leadership through reflective thinking exercises to define the project’s significance and summarize why the outcomes matter.

Let’s be clear: questionable or unintended outcomes are not to be regarded as something negative. In fact, in the projects I have evaluated over the years, outcomes have frequently served as an introspective platform, informing future curriculum decisions and directions internal to the institution receiving the funding. For example, the outcomes of one STEM project focused on renewable energy technicians provided the institution with information that prompted the development of subsequent proposals and projects targeting engineering pathways.

Discussing and reporting project outcomes also captures lessons learned and gives the evaluator the opportunity to ask questions such as:

  • Did the project increase the presence of the target group in identified STEM programs?
  • What initiatives will be sustained after funding ends to maintain an increased presence of the target group in STEM programs?
  • Did project activities contribute to the retention/completion rates of the target group in identified STEM programs?
  • Which activities seemed to have the greatest/least impact on retention/completion rates?
  • On reflection, are there activities that could have more significantly contributed to retention/completion rates that were not implemented as part of the project?
  • To what extent did the project supply regional industries with a more diverse STEM workforce?
  • What effect will this have on regional industries after project funding ends?
  • Were partners identified in the proposal realistic contributors to the funded project? Did they ensure a successful implementation enabling the attainment of anticipated outcomes?
  • What was learned about the characteristics of “good” and “bad” partners?
  • What are characteristics to look for and avoid to maximize productivity with future work?

Factors influencing outcomes include, but are not limited to:

  • Institutional changes, e.g., leadership;
  • Partner constraints or changes; and
  • Project/budgetary limitations.

It is not unusual for a proposed project to be somewhat grandiose in identifying intended outcomes. Yet, when project implementation gets underway, intended activities may be compromised by external challenges. For example, when equipment is needed to support various aspects of a project, procurement and production channels may delay equipment acquisition, adversely affecting project leadership’s ability to launch planned components of the project.

As a tip, it is worthwhile for those seeking funding to pose these outcome questions at the front end of the project, when the proposal is being developed. Doing so will assist them in conceptualizing the intellectual merit and impact of the proposed project.

Resources and Links:

Developing an Effective Evaluation Report: Setting the Course for Effective Program Evaluation. Atlanta, Georgia: Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health, Division of Nutrition, Physical Activity and Obesity, 2013.

Blog: What Goes Where? Reporting Evaluation Results to NSF

Posted on April 26, 2017 in Blog

Director of Research, The Evaluation Center at Western Michigan University


In this blog, I provide advice for Advanced Technological Education (ATE) principal investigators (PIs) on how to include information from their project evaluations in their annual reports to the National Science Foundation (NSF).

Annual reports for NSF grants are due within the 90 days leading up to the award’s anniversary date. That means if your project’s initial award date was September 1, your annual reports will be due between June and August each year until the final year of the grant (at which point an outcome report is due within 90 days after the award anniversary date).
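
If it helps to see the arithmetic, here is a quick, hypothetical check of that window using Python's standard datetime module (the September 1 award date is the example from the paragraph above).

    # Quick check of the annual report window for a hypothetical September 1 award date.
    from datetime import date, timedelta

    anniversary = date(2018, 9, 1)                    # upcoming award anniversary
    window_opens = anniversary - timedelta(days=90)   # 90 days before the anniversary
    print(window_opens, "to", anniversary)            # 2018-06-03 to 2018-09-01, i.e., June through August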

When you prepare your first annual report for NSF at Research.gov, you may be surprised to see there is no specific request for results from your project’s evaluation or a prompt to upload your evaluation report. That’s because Research.gov is the online reporting system used by all NSF grantees, whether they are researching fish populations in Wisconsin lakes or developing technician education programs.  So what do you do with the evaluation report your external evaluator prepared or all the great information in it?

1. Report evidence from your evaluation in the relevant sections of your annual report.

The Research.gov system for annual reports includes seven sections: Cover, Accomplishments, Products, Participants, Impact, Changes/Problems, and Special Requirements. Findings and conclusions from your evaluation should be reported in the Accomplishments and Impact sections, as described in the table below. Sometimes evaluation findings will point to a need for changes in project implementation or even its goals. In this case, pertinent evidence should be reported in the Changes/Problems section of the annual report. Highlight the most important evaluation findings and conclusions in these report sections. Refer to the full evaluation report for additional details (see Point 2 below).

What to report from your evaluation, by NSF annual report section:

Accomplishments
  • Number of participants in various activities
  • Data related to participant engagement and satisfaction
  • Data related to the development and dissemination of products (Note: The Products section of the annual report is simply for listing products, not reporting evaluative information about them.)

Impacts
  • Evidence of the nature and magnitude of changes brought about by project activities, such as changes in individual knowledge, skills, attitudes, or behaviors or larger institutional, community, or workforce conditions
  • Evidence of increased participation by members of groups historically underrepresented in STEM
  • Evidence of the project’s contributions to the development of infrastructure that supports STEM education and research, including physical resources, such as labs and instruments; institutional policies; and enhanced access to scientific information

Changes/Problems
  • Evidence of shortcomings or opportunities that point to a need for substantial changes in the project

Do you have a logic model that delineates your project’s activities, outputs, and outcomes? Is your evaluation report organized around the elements in your logic model? If so, a straightforward rule of thumb is to follow that logic model structure and report evidence related to your project activities and outputs in the Accomplishments section and evidence related to your project outcomes in the Impacts section of your NSF annual report.

2. Upload your evaluation report.

Include your project’s most recent evaluation report as a supporting file in the Accomplishments or Impact section of Research.gov. If the report is longer than about 25 pages, make sure it includes a 1-3 page executive summary that highlights key results. Your NSF program officer is very interested in your evaluation results, but probably doesn’t have time to carefully read lengthy reports from all the projects he or she oversees.

Blog: Declutter Your Reports: The Checklist for Straightforward Evaluation Reports

Posted on February 1, 2017 in Blog

Senior Research Associate, The Evaluation Center at Western Michigan University


Evaluation reports have a reputation for being long, overly complicated, and impractical. The recent buzz about fresh starts and tidying up for the new year got me thinking about the similarities between these infamous evaluation reports and the disastrously cluttered homes featured on reality makeover shows. The towering piles of stuff overflowing from these homes remind me of the technical language and details that clutter up so many evaluation reports. Informational clutter, like physical clutter, can turn reports into difficult-to-navigate obstacle courses and render their contents virtually unusable. If you are looking for ideas on how to organize and declutter your reports, check out the Checklist for Straightforward Evaluation Reports that Lori Wingate and I developed. The checklist provides guidance on how to produce comprehensive evaluation reports that are concise, easy to understand, and easy to navigate. Main features of the checklist include:

  • Quick reference sheet: A one-page summary of content to include in an evaluation report and tips for presenting content in a straightforward manner.
  • Detailed checklist: A list and description of possible content to include in each report section.
  • Straightforward reporting tips: General and section-specific suggestions on how to present content in a straightforward manner.
  • Recommended resources: List of resources that expand on information presented in the checklist.

Evaluators, evaluation clients, or other stakeholders can use the checklist to set reporting expectations, such as what content to include and how to present information.

Straightforward Reporting Tips

Here are some tips, inspired by the checklist, on how to tidy up your reports:

  • Use short sentences: Each sentence should communicate one idea. Sentences should contain no more than 25 words. Downsize your words to only the essentials, just like you might downsize your closet.
  • Use headings: Use concise and descriptive headings and subheadings to clearly label and distinguish report sections. Use report headings, like labels on boxes, to make it easier to locate items in the future.
  • Organize results by evaluation questions: Organize the evaluation results section by evaluation question with separate subheadings for findings and conclusions under each evaluation question. Just like most people don’t put decorations for various holidays in one box, don’t put findings for various evaluation questions in one findings section.
  • Present takeaway messages: Label each figure with a numbered title and a separate takeaway message. Similarly, use callouts to grab readers’ attention and highlight takeaway messages. For example, use a callout in the results section to summarize the conclusion in one sentence under the evaluation question.
  • Minimize report body length: Reduce page length as much as possible without compromising quality. One way to do this is to place details that enhance understanding—but are not critical for basic understanding—in the appendices. Only information that is critical for readers’ understanding of the evaluation process and results should be included in the report body. Think of the appendices like a storage area such as a basement, attic, or shed where you keep items you need but don’t use all the time.

If you’d like to provide feedback, you can send your comments by email or return a review form to info@evalu-ate.org. We are especially interested in feedback from individuals who have used the checklist as they develop evaluation reports.