Archive: Formative Evaluation

Blog: Overcoming Writer’s Block – Strategies for Writing Your NSF Annual Report

Posted on February 14, 2018 in Blog

Supervisor, Grant Projects, Columbus State Community College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

For many new grantees and seasoned principal investigators, nothing is more daunting than an email from Research.gov titled “Annual Project Report Is NOW DUE.” In this blog post, I will help tackle the challenge of writing the annual project report by highlighting the strategy Columbus State Community College has developed for effectively writing annual reports and discussing why this strategy also serves as a planning and feedback tool.

Columbus State’s strategy for Advanced Technological Education (ATE) annual reporting developed organically, with input and collaboration from faculty, staff, and external evaluators, and is based on three key components:

  • shared knowledge of reporting requirements and format,
  • a structured annual reporting timeline, and
  • best-practice sharing and learning from past experience.

This three-pronged approach was used by four ATE projects during 2017 and builds on the old adage that “you only get out what you put in.” The key to annual reporting is the adoption of a structured annual reporting timeline, which doubles as an important planning and feedback tool. The 10-week timeline outlined below ensures that adequate time is dedicated to writing the annual report. It is designed to be collaborative and to spur discussion around key milestones, lessons learned, and next steps for revising and improving project plans. (A short scheduling sketch follows the timeline.)

PREPARE

Week 1: Communicating Reporting Requirements

Weeks 1-3: Planning and Data Collection

  • All team members should actively participate in the planning and data collection phase.
  • Project teams should collect a wide breadth of information related to project achievements and milestones. Common types of information collected include individual progress updates, work samples, project work plans and documentation, survey and evaluation feedback, and project metrics.

Week 4: Group Brainstorming

  • Schedule a 60- to 90-minute meeting that focuses specifically on brainstorming and discussing content for the annual report. Include all project team members and your evaluator.
  • Use the project reports template to guide the conversation.

WRITE

Weeks 5-6: Drafting and Seeking Clarification

  • All information is compiled by the project team and assembled into a first draft.
  • It may be useful to mirror the format of the grant proposal or project narrative during this phase to ensure that all project areas are addressed.
  • The focus of this stage is ensuring that all information is accurately captured and integrated.

REVISE

Week 7: First Draft Review

  • First drafts should be reviewed by the project team and two to three people outside of the project team.
  • Including individuals from both inside and outside of the project team will help ensure that useful content is not omitted and that content is presented in an accessible manner.

Weeks 8-9: Final Revisions

  • Feedback and comments are evaluated and final revisions are made.

Week 10: Annual Report Submission

  • The final version of the annual report, with appendices and the evaluation report, is uploaded and submitted through Research.gov.
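To make the 10-week plan easier to put on a calendar, here is a minimal scheduling sketch in Python. It is an illustration only, not part of Columbus State’s process or any Research.gov tool: the MILESTONES table, the reporting_schedule function, and the example due date are all hypothetical, with the week offsets taken from the timeline above.

    from datetime import date, timedelta

    # Hypothetical milestone table: (week number, task), following
    # the 10-week timeline described in this post.
    MILESTONES = [
        (1, "Communicate reporting requirements; begin data collection"),
        (4, "Group brainstorming meeting with team and evaluator"),
        (5, "Write first draft and seek clarification (weeks 5-6)"),
        (7, "First draft review by team and outside readers"),
        (8, "Final revisions (weeks 8-9)"),
        (10, "Submit annual report through Research.gov"),
    ]

    def reporting_schedule(due_date):
        """Return (start-of-week date, task) pairs, counting back ten
        weeks from the due date so week 10 ends on the due date."""
        week_one = due_date - timedelta(weeks=10)
        return [(week_one + timedelta(weeks=week - 1), task)
                for week, task in MILESTONES]

    if __name__ == "__main__":
        for starts, task in reporting_schedule(date(2018, 5, 1)):
            print(f"Week of {starts:%b %d, %Y}: {task}")

Running the sketch with a real due date prints one dated line per milestone, which teams can paste into shared calendars when booking the brainstorming meeting and review rounds.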

For additional information about Columbus State’s writing tips, please view our full white paper.


Blog: Evolution of Evaluation as ATE Grows Up

Posted on March 15, 2017 in Blog

Independent Consultant

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I attended a packed workshop by EvaluATE called “A Practical Approach to Outcome Evaluation” at the 2016 NSF ATE Principal Investigators Conference. Two lessons from the workshop reminded me that the most significant part of the evaluation process is the demystification of the process itself:

  • “Communicate early and often with human data sources about the importance of their cooperation.”
  • “Ensure everyone understands their responsibilities related to data collection.”

Stepping back, it made me reflect upon the evolution of evaluation in the ATE community. When I first started out in the ATE world in 1995, I was on the staff of one of the first ATE centers ever funded. Back then, being “evaluated” was perceived as quite a different experience, something akin to taking your first driver’s test or defending a dissertation—a meeting of the tester and the tested.

As the ATE community has matured, so has our approach to both evaluation and the integral communication component that goes with it. When we were a fledgling center, the meetings with our evaluator could have been a chance to take advantage of the evaluation team’s many years of experience with what works and what doesn’t. Yet at the start we didn’t realize that it was a two-way street where both parties learn from each other. Twenty years ago, evaluator-center/project relationships were neither designed nor explained in that fashion.

Today, my colleague, Dr. Sandra Mikolaski, and I are co-evaluators for NSF ATE clients who range from a small new-to-ATE grant (there weren’t any of those back in the day!) to a large center grant that provides resources to a number of other centers and projects and even has its own internal evaluation team. The experience of working with our new-to-ATE client was perhaps what forced us to be highly thoughtful about how we hope both parties view their respective roles and input. Because the “fish don’t talk about the water” (i.e., project teams are often too close to their own work to toot their own horn), evaluators can provide not only perspective and advice, but also connections to related work and other project and center principal investigators. This perspective can have a tremendous impact on how activities are carried out and on the goals and objectives of a project.

We use EvaluATE webinars like “User-Friendly Evaluation Reports” and “Small-Scale Evaluation” as references and resources not only for ourselves but also for our clients. These webinars help them understand that an evaluation is not meant to assess and critique, but to inform, amplify, modify, and benefit.

We have learned from being on the other side of the fence that an ongoing dialog, an ethnographic approach (on-the-ground research, participant observation, a holistic approach), and a formative, input-based partnership with our client make for a more fruitful process for everyone.

Newsletter: Formative Evaluation

Posted on April 1, 2014 in Newsletter

The most common purposes, or intended uses, of evaluations are often described by the terms formative evaluation and summative evaluation.

Formative evaluation focuses on evaluation for project improvement, in contrast with summative evaluation, which uses evaluation results to make decisions about project adoption, expansion, contraction, continuation, or cancellation.

Since formative evaluation is all about project improvement, it needs to occur while there is still time to implement change. So, the earlier a formative evaluation can begin in an ATE project cycle, the better. Formative evaluation is also a recurring activity. As such, those who will be involved in implementing change (project leaders and staff) are the ones who will be the most interested in the results of a formative evaluation.

E. Jane Davidson notes in her book, Evaluation Methodology Basics, that there are two main areas in which formative evaluation is especially useful. Adapted for the ATE context, those areas are:

  1. To help a new project “find its feet” by improving project plans early in the award cycle. Another example for a new project is collecting early evidence of project relevance from faculty and students, thus allowing changes to occur before the full rollout of a project component.
  2. To help more established projects improve their services, become more efficient with their grant dollars, or reach a larger audience. For projects seeking renewed funding, formative evaluation can help identify areas of improvement (even in long-standing activities) to better respond to changing needs.