Newsletter - Fall 2015

Newsletter: Shorten the Evaluation Learning Curve: Avoid These Common Pitfalls

Posted on October 1, 2015 in Newsletter

Executive Director, The Evaluation Center at Western Michigan University

This EvaluATE newsletter issue is focused on getting started with evaluation. It’s oriented to new ATE principal investigators who are getting their projects off the ground, but I think it holds some good reminders for veteran PIs as well. To shorten the evaluation learning curve, avoid these common pitfalls:

Searching for the truth about “what NSF wants from evaluation.” NSF is not prescriptive about what an ATE evaluation should or shouldn’t look like. So, if you’ve been concerned that you’ve somehow missed the one document that spells out exactly what NSF wants from an ATE evaluation, rest assured: you haven’t overlooked anything. There is, however, information that NSF requests from all projects in annual reports and that you are asked to report on the annual ATE survey, so it’s worthwhile to preview the annual reporting template and the ATE survey questions. And if you’re doing research, be sure to review the Common Guidelines for Education Research and Development, which lay out fairly cut-and-dried criteria for different types of research. Most importantly, put some time into thinking about what you, as a project leader, need to learn from the evaluation. If you’re still concerned about meeting expectations, talk to your program officer.

Thinking your evaluator has all the answers. Even for veteran evaluators, every evaluation is new and has to be tailored to context. Don’t expect your evaluator to produce a detailed, actionable evaluation plan on Day 1. He or she will need to work out the details of the plan with you. And if something doesn’t seem right to you, it’s OK to ask for something different.

Putting off dealing with the evaluation until you are less busy. “Less busy” is a mythical place, and you will probably never get there. I am both an evaluator and a client of evaluation services, and even I have been guilty of paying less attention to evaluation in favor of “more urgent” matters. Here are some tips for ensuring your project’s evaluation gets the attention it needs:
  • Set a recurring conference call or meeting with your evaluator (e.g., every two to three weeks).
  • Put evaluation at the top of your project team’s meeting agendas, or hold separate meetings that focus exclusively on evaluation matters.
  • Give someone other than the PI responsibility for attending to the evaluation, not to replace the PI’s attention, but to ensure the PI and other project members stay on top of the evaluation and communicate regularly with the evaluator.
  • Commit to using the evaluation results in a timely way: if you do something on a recurring basis, make sure you gather feedback from those involved and use it to improve the next activity.

Assuming you will need your first evaluation report at the end of Year 1. PIs must submit their annual reports to NSF within the 90 days prior to the end of the current budget period. So if your grant started on September 1, your first annual report is due between the beginning of June and the end of August. And it will take some time to prepare, so you should probably start writing a month or so before you plan to submit it. You’ll want to include at least some of your evaluation results, so start working with your evaluator now to figure out what information is most important to collect for your Year 1 report.

Veteran PIs: What tips do you have for shortening the evaluation learning curve? Submit a blog to EvaluATE and tell your story and lessons learned for the benefit of new PIs.

Newsletter: Collaborative Evaluation

Posted on October 1, 2015 in Newsletter

A collaborative evaluation is one “in which there is a significant degree of collaboration or cooperation between evaluators and stakeholders in planning and/or conducting the evaluation.”1

Project leaders who are new to grant project evaluation may assume that evaluation is something that is done to them, rather than something they do with an evaluator. Although the degree of collaboration may vary, it is generally advisable for project leaders to work closely with their evaluators on the following tasks:

Define the focus of an evaluation: Be clear about what you, as a project leader, need to learn from the evaluation to help improve your work and what you need to be able to report to NSF to demonstrate accountability and impact.

Minimize barriers to data collection: Inform your evaluator about the best times and places to gather data. If the evaluator needs to collect data directly from students or faculty, an advance note from you or another respected individual from your institution can help a great deal. Help your evaluator connect with your institutional research office or other sources of organizational data.

Review data collection instruments: Your evaluator has expertise in evaluation and research methods, but you know your project’s content area and audience best. Review instruments (e.g., questionnaires, interview/focus group protocols) to ensure they make sense for your audience.

To learn more, visit the website of the American Evaluation Association’s topical interest group on collaborative, participatory, and empowerment evaluation.

1Cousins, J. B., Donohue, J. J., & Bloom, G. A. (1996). Collaborative evaluation in North America: Evaluators’ self-reported opinions, practices and consequences. American Journal of Evaluation, 17(3), p. 210.

Newsletter: How can you make sure your evaluation meets the needs of multiple stakeholders?

Posted on October 1, 2015 in Newsletter

Executive Director, The Evaluation Center at Western Michigan University

We talk a lot about “stakeholders” in evaluation. These are the folks who are involved in, affected by, or simply interested in the evaluation of your project.  But what these stakeholders want or need to know from the evaluation, the time they have available for the evaluation, and their level of interest are probably quite variable.  Here is a generic guide to types of ATE evaluation stakeholders, what they might need, and how to meet those needs.

Stakeholder group: Project leaders (PI, co-PIs)
What they might need:
  • Information that will help you make improvements to the project as it is unfolding
  • Results you can include in your annual reports to NSF to demonstrate accountability and impact
Tips for meeting those needs: Communicate your needs clearly to your evaluator, including when you need the information in order to make use of it.

Stakeholder group: Advisory committees or National Visiting Committees
What they might need:
  • Results from the evaluation that show whether the project is on track to meet its goals and whether changes in direction or operations are warranted
  • Summary information about the project’s strengths and weaknesses
Tips for meeting those needs: Many advisory committee members donate their time, so they probably aren’t interested in reading lengthy reports. Provide a brief memo and/or a short presentation at meetings with key findings and invite questions about the evaluation. Be forthcoming about strengths and weaknesses.

Stakeholder group: Participants who provide data for the evaluation
What they might need:
  • Access to reports where their information was used
  • Summaries of the actions taken based on the information they provided
Tips for meeting those needs: The most important thing for this group is to demonstrate use of the information they provided. You can share reports, but a personal message from project leaders along the lines of “we heard you, and here is what we’re doing in response” is most valuable.

Stakeholder group: NSF program officers
What they might need:
  • Evidence that the project is on track to meet its goals
  • Evidence of impact (not just what was done, but what difference the work is making)
  • Evidence that the project is using evaluation results to make improvements
Tips for meeting those needs: Focus on Intellectual Merit (the intrinsic quality of the work and its potential to advance knowledge) and Broader Impacts (the tangible benefits for individuals and progress toward desired societal outcomes). If you’re not sure what your program officer needs from your evaluation, ask him or her for clarification.

Stakeholder group: College administrators (department chairs, deans, executives, etc.)
What they might need:
  • Results that demonstrate impact on students, faculty, institutional culture, infrastructure, and reputation
Tips for meeting those needs: Make full reports available upon request, but recognize that most busy administrators don’t have time to read technical reports or need fine-grained data points. Prepare memos or share presentations that focus on the information they’re most interested in.

Stakeholder group: Partners and collaborators
What they might need:
  • Information that helps them assess the return on the investment of their time or other resources
Tips for meeting those needs: As with college administrators, focus on providing the information most pertinent to this group.

In case you didn’t read between the lines: the underlying message here is to provide stakeholders with the information that is most relevant to their particular “stake” in your project. A surefire way to miss their needs is to send everyone nothing but a long, detailed technical report with every data point collected. It’s good to have a full report available for those who request it, but many stakeholders simply won’t have the time or interest to consume that quantity of evaluative information about your project. Most importantly, don’t take our word for what they might need: ask them!

Not sure which stakeholders to involve in your evaluation, or how? Check out our worksheet on Identifying Stakeholders and Their Roles in an Evaluation.

Newsletter: Creating an Evaluation Scope of Work

Posted on October 1, 2015 in Newsletter

Executive Director, The Evaluation Center at Western Michigan University

One of the most common requests we get at EvaluATE is for examples of independent contractor agreements and scope of work statements for external evaluators. First, let’s be clear about the difference between these two types of documents.

An independent contractor agreement is typically 90 percent boilerplate language required by your institution. Here at Western Michigan University, contracts are run through one of four offices (Business Services, Research and Sponsored Programs, Grants and Contracts, or Purchasing), depending on the type of contract and the nature of the work or service. We can’t tell you the name of the corresponding office at your institution, but there definitely is one, and it probably has boilerplate contract forms that you will need to use.

A scope of work statement should be attached to and referenced by the independent contractor agreement (or other type of contract). Unlike the contract, however, it should be written not in legalese but in plain language understandable to all parties involved. The key issues to cover in a scope of work statement include the following:

Evaluation questions (or objectives): Including information about the purpose of the evaluation is a good reminder to those involved about why the evaluation is being done. It may serve as a useful reference down the road if the evaluation starts to experience scope creep (or shrinkage).

Main tasks and deliverables (with timelines or deadlines): This information should make clear what services and products the evaluator will provide. Common examples include a detailed evaluation plan (what was included in your proposal probably doesn’t have enough detail), data collection instruments, reports, and presentations.

It’s critical to include timelines (generally when things will occur) and deadlines (when they must be finished) in this statement.

Conditions for payment: You most likely specified a dollar amount for the evaluation in your grant proposal, but you probably do not plan to pay it in a lump sum at the beginning or end of the evaluation, or even yearly. Specify the increments in which payments should be made and the conditions that must be met for payment. Rather than tying payments to particular dates, consider making them contingent on the completion of specific tasks or deliverables.

Be sure to come to agreement on these terms in collaboration with your evaluator. This is an opportunity to launch your working relationship from a place of open communication and shared expectations.

Newsletter: Project Spotlight: Manufacturing Associate Degree Education in Northwestern Connecticut

Posted on October 1, 2015 in Newsletter

Professor, Biology, Northwestern Connecticut Community College

A conversation with Sharon Gusky, an ATE PI at Northwestern Connecticut Community College.

Q: Your ATE project started just over a year ago. What do you know now that you wish you’d known then about project evaluation?

A: I wish I had a better understanding of what information would be useful to collect before the start of the grant, so we would have been better prepared to capture baseline and impact data. This is our first NSF grant, and it allowed us to start a new manufacturing program. The community was excited about and very supportive of it. In the first year we received many requests to speak at events, do radio and cable TV shows, and visit high schools, but we did not have a way to capture the impact of these activities.

Q: What advice do you have for new PIs with regard to working with an evaluator?

A: Start working with your evaluator early and set clear timelines for checking in and reviewing and analyzing the data as it is collected. The information that you collect along the way can help shape the program. We learned early on through student interviews that they did not like the course schedule, which required them to wait a semester or summer to take the second technical course in a sequence.  We used their feedback to revise the schedule so that each course ran for eight weeks during a semester.  If we had waited until the end of the spring semester to find this out, it would have been too late to implement the change for fall.

Q: What challenges did you face in getting the evaluation off the ground?

A: We faced a number of scheduling challenges and miscommunication with regard to data collection.  We hadn’t clearly defined the roles of the various people involved—external evaluator, institutional research director, PI, and co-PIs.  We needed to sit down together and work out a plan so that the data we needed was being collected and shared.