This quick guide covers the 14 do's and don'ts of data visualization. It is not intended to teach these practices but rather to serve as a reminder.
One-page evaluation reports are a great way to provide a snapshot of a project’s activities and impact to stakeholders such as advisory groups, college administrators, and NSF program officers. Summarizing key evaluation facts in a format that is easily and quickly digestible engages the busy reader and can make your project stand out.
Although traditional, long-form evaluation reports are still an excellent way to distribute evaluation results, one-page reports increase the engagement, understanding, and use of evaluation findings, both for the current grant and for leveraging results in proposals for follow-up grants.
In this webinar, we will provide you with the tools and resources you need to create effective one-page reports and share some examples that have worked well in our practice.
In this blog, I provide advice for Advanced Technological Education (ATE) principal investigators (PIs) on how to include information from their project evaluations in their annual reports to the National Science Foundation (NSF).
Annual reports for NSF grants are due in the 90 days before the award's anniversary date. That means if your project's initial award date was September 1, your annual reports will be due between June and August each year until the final year of the grant (at which point an outcome report is due within 90 days after the award anniversary date).
When you prepare your first annual report for NSF at Research.gov, you may be surprised to see there is no specific request for results from your project’s evaluation or a prompt to upload your evaluation report. That’s because Research.gov is the online reporting system used by all NSF grantees, whether they are researching fish populations in Wisconsin lakes or developing technician education programs. So what do you do with the evaluation report your external evaluator prepared or all the great information in it?
1. Report evidence from your evaluation in the relevant sections of your annual report.
The Research.gov system for annual reports includes seven sections: Cover, Accomplishments, Products, Participants, Impact, Changes/Problems, and Special Requirements. Findings and conclusions from your evaluation should be reported in the Accomplishments and Impact sections, as described in the table below. Sometimes evaluation findings will point to a need for changes in project implementation or even its goals. In this case, pertinent evidence should be reported in the Changes/Problems section of the annual report. Highlight the most important evaluation findings and conclusions in these report sections. Refer to the full evaluation report for additional details (see Point 2 below).
| NSF annual report section | What to report from your evaluation |
| --- | --- |
| Accomplishments | Evidence related to your project's activities and outputs |
| Impact | Evidence related to your project's outcomes |
Do you have a logic model that delineates your project's activities, outputs, and outcomes? Is your evaluation report organized around the elements in your logic model? If so, a straightforward rule of thumb is to follow that logic model structure and report evidence related to your project activities and outputs in the Accomplishments section and evidence related to your project outcomes in the Impact section of your NSF annual report.
2. Upload your evaluation report.
Include your project’s most recent evaluation report as a supporting file in the Accomplishments or Impact section of Research.gov. If the report is longer than about 25 pages, make sure it includes a 1-3 page executive summary that highlights key results. Your NSF program officer is very interested in your evaluation results, but probably doesn’t have time to carefully read lengthy reports from all the projects he or she oversees.
This report is the culmination of a three-year National Science Foundation project. The Goodman Research Group was asked by their client to create a public-facing report, in contrast to a report for more internal use.
The authors shared this document with EvaluATE as an example of a user-friendly evaluation report. They made a concerted effort to remove charts and graphics from the report while also effectively summarizing a large mass of qualitative interview data in the form of a visual narrative. The information is organized into three sections according to the evaluation goals/questions to help frame the report for the reader. The report uses the client’s colors so it could be easily incorporated and viewed on their website, sciencefestivals.org.
This checklist identifies and describes the elements of an evaluation report. It is intended to serve as a flexible guide for determining an evaluation report’s content. It should not be treated as a rigid set of requirements. An evaluation client’s or sponsor’s reporting requirements should take precedence over the checklist’s recommendations. This checklist is strictly focused on the content of long-form technical evaluation reports.
This week I am in Atlanta at the American Evaluation Association (AEA) Summer Evaluation Institute, presenting a workshop on Translating Evaluation Findings into Actionable Recommendations. Although the art of crafting practical, evidence-based recommendations is not covered in depth in either evaluation textbooks or academic courses, most evaluators (86% according to Fleischer and Christie's survey of AEA members) believe that making recommendations is part of an evaluator's job. By reading as much as I can on this topic and reflecting on my own practice, I have assembled 14 tips for how to develop, present, and follow up on evaluation recommendations:
- Determine the nature of recommendations needed or expected. At the design stage, ask stakeholders: What do you hope to learn from the evaluation? What decisions will be influenced by the results? Should the evaluation include recommendations?
- Generate possible recommendations throughout the evaluation. Keep a log of ideas as you collect data and observe the program. I like Roberts-Gray, Buller, and Sparkman’s (1987) evaluation question-driven framework.
- Base recommendations on evaluation findings and other credible sources. Findings are important, but they’re often not sufficient for formulating recommendations. Look to other credible sources, such as program goals, stakeholders/program participants, published research, experts, and the program’s logic model.
- Engage stakeholders in developing and/or reviewing recommendations prior to their finalization. Clients should not be surprised by anything in an evaluation report, including the recommendations. If you can engage stakeholders directly in developing recommendations, they will feel more ownership. (Read Adrienne Adams's article about a great process for this.)
- Focus recommendations on actions within the control of intended users. If the evaluation client doesn’t have control over the policy governing their programs, don’t bother recommending changes at that level.
- Provide multiple options for achieving desired results. Balance consideration of the cost and difficulty of implementing recommendations with the degree of improvement expected; if possible, offer alternatives so stakeholders can select what is most feasible and important to do.
- Clearly distinguish between findings and recommendations. Evaluation findings reflect what is; recommendations are a prediction about what could be. Developing recommendations requires a separate reasoning process.
- Write recommendations in clear, action-oriented language. I often see words like consider, attend to, recognize, and acknowledge in recommendations. Those call the clients’ attention to an issue, but don’t provide guidance as to what to do.
- Specify the justification sources for each recommendation. It may not be necessary to include this information in an evaluation report, but be prepared to explain how and why you came up with the recommendations.
- Explain the costs, benefits, and challenges associated with implementing recommendations. Provide realistic forecasts of these matters so clients can make informed decisions about whether to implement the recommendations.
- Be considerate—exercise political and interpersonal sensitivity. Avoid “red flag” words like fail and lack, don’t blame or embarrass, and be respectful of cultural and organizational values.
- Organize recommendations, such as by type, focus, timing, audience, and/or priority. If many recommendations are provided, organize them to help the client digest the information and prioritize their actions.
- Meet with stakeholders to review and discuss recommendations in their final form. This is an opportunity to make sure they fully understand the recommendations as well as to lay the groundwork for action.
- Facilitate decision making and action planning around recommendations. I like the United Nations Development Programme’s “Management Response Template” as an action planning tool.
See also my handy one-pager of these tips for evaluation recommendations.
All NSF projects awarded in January 2010 or later are required to submit a project outcomes report within 90 days of the grant's expiration, along with a final annual report. Beyond length (a project outcomes report is a few paragraphs, or 200-800 words, while annual reports typically run several pages), there are three other ways a project outcomes report is distinct from a final annual report.
1. A project outcomes report is solely about outcomes. A final annual report addresses many other topics. Project outcomes reports should describe what a project developed and the changes it brought about with regard to advancing knowledge (intellectual merit) and contributing to desired social outcomes (broader impacts). The focus should be on products and results, not project implementation. Publications are important evidence of intellectual merit, and a list of publications will be generated automatically from the project’s annual reports submitted to Research.gov. Other products generated with grant funds should be listed, such as data sets, software, or educational materials. If these products are available online, links may be provided.1 An accounting of grant products demonstrates a project’s productivity and intellectual merit. To address the project’s broader impacts, reports should highlight achievements in areas such as increasing participation in STEM by underrepresented minorities, improving teaching and learning, and developing the technical workforce.
2. A project outcomes report provides a “complete picture of the results” of a project.2 A final annual report covers the last year of the project only. A project outcomes report is not a progress report. It is the final word on what a project achieved and produced. PIs should think carefully about how they want their work to be portrayed to the public for decades to come and craft their reports accordingly. Dr. Joan Strassman of Washington University provides this cogent advice about crafting outcomes reports:
[A project outcomes report] is where someone … can go to see where NSF is spending its tax dollars. This document is not the plan, not the hopes, but the actual outcomes, so this potential reader can get direct information on what the researcher says she did. It pulls up along with the original funding abstracts, so see to it they coordinate as much as possible. Work hard to be clear, accurate, and compelling. (Read more at bit.ly/blog-POR)
3. A project outcomes report is a public document.3 A final annual report goes only to the project's NSF program officer. A big difference between these audiences is that a project's program officer probably has expertise in the project's content area and is certainly familiar with the overall aims of the program through which the project was funded. For the benefit of lay readers, project outcomes report authors should use plain language to ensure comprehension by the general public (see plainlanguage.gov). One way to check a report's readability is to have a colleague from outside the project's content area review it. Aim for complete yet succinct documentation that is readily understandable by such readers.
1 ATE grants awarded in 2014 or later are required to archive their materials with ATE Central.
2 For NSF’s guidelines regarding project outcomes reports, see bit.ly/POR-FAQs.
3 To access ATE project outcomes reports: (1) Go to bit.ly/NSF-POR (2) Enter “ATE” in the keyword box; (3) Check the box for “Show Only Awards with Project Outcomes Reports.”
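The plain-language advice in point 3 can also be roughed out quantitatively. As an illustration (not part of the original guidance), here is a minimal sketch of the standard Flesch Reading Ease formula using a crude vowel-group heuristic to count syllables; the function name and the heuristic are my own assumptions, not an established tool. Scores above roughly 60 are commonly read as accessible to a general audience, while dense technical prose scores far lower.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: each run of vowels counts as one syllable;
    # every word gets at least one.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    # Flesch Reading Ease: 206.835 - 1.015*(words/sentences)
    #                      - 84.6*(syllables/words)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syllables / n_words)
```

A quick sanity check: a short, simple sentence such as "The cat sat on the mat." scores very high, while jargon-heavy phrasing scores much lower. A score like this is no substitute for a lay reviewer, but it can flag sections worth simplifying before the report becomes part of the public record.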
Mel Cossette is principal investigator for the National Resource Center for Materials Technology Education at Edmonds Community College. MatEdU’s mission is to advance materials technology education nationally.
Q: What advice would you give to a PI in their first reporting period?
A: First, check the Reporting section of Research.gov to confirm the annual report due date. Sometimes a first-time PI refers to their award date or start date, but it is the due date listed on that site that is critical. Second, connect with your evaluator and inform him or her of the report due date. This helps with the planning and writing processes and assists with identifying, early on, the information to be shared. This does not mean things cannot change, but it is essential that the evaluator and PI communicate.
Q: How do you use your evaluation results in your annual report to NSF?
A: Typically, we create a rough draft of the annual report, from our perspective, which we share with our evaluator. The evaluator reviews it and provides feedback. In the meantime, we continue building our report, paying attention to the different categories within it, such as accomplishments, significant activities, products developed, etc. During this time, our evaluator develops a draft report that is shared with us. Although the reports have different focuses and formats, we compare content from the two. That helps us to be succinct with the data and information the reports require. We find that this collaborative process helps to keep our team focused on the task at hand.
Q: What are some things that make an evaluation report useful (from a PI’s perspective)?
A: Because the information comes from a semi-external perspective, we get the chance to compare the evaluation report's account of our activities, successes, areas that may need review, etc., against our activity timeline. This helps to limit scope creep. Recommendations from our evaluator have also enabled us to identify a potential gap in our activities that needs to be addressed. PIs are usually completely focused on their projects and annual reports, so having an external evaluator point out successes, gaps, inconsistencies, and data points reinforces progress and project direction.