Newsletter - Resources

Newsletter: ATE Logic Model Template

Posted on July 1, 2016 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

A logic model is a graphic depiction of how a project translates its resources into activities and outcomes. The ATE Project Logic Model Template presents the basic format for a logic model, with question prompts and examples to guide users in distilling their project plans into succinct statements about planned activities and products and desired outcomes. Paying attention to the prompts and ATE-specific examples will help users avoid common logic model mistakes, like placing outputs (tangible products) under outcomes (changes in people, organizations, or conditions brought about through project activities and outputs).

The template is in PowerPoint, so you may use the existing elements and start creating your own logic model right away—just delete the instructional parts of the document and input your project’s information. We have found that when a document has several graphic elements, PowerPoint is easier to work in than Word. Alternatively, you could create a simple table in Word that mirrors the layout in the template.
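To give a flavor of the format, here is a highly condensed logic model for a hypothetical ATE project (the entries are our invented examples, not content from the template):

Resources: NSF ATE funding; faculty time; an industry advisory board
Activities: Develop two mechatronics courses; deliver summer workshops for high school teachers
Outputs: 2 new courses; 25 teachers trained; curriculum materials posted online
Short-term outcomes: Teachers gain mechatronics knowledge; students become aware of technician careers
Mid-term outcomes: Program enrollment grows; teachers integrate the materials into their classes
Long-term outcomes: More graduates enter advanced manufacturing careers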

Formatting tips:

  • If you find you need special paper to print the logic model and maintain its legibility, it’s too complicated.  It should be readable on an 8.5” x 11” sheet of paper.  If you simply have too much information to fit on a single page, use general summary statements or categories in the model, and put the detailed explanations in a proposal narrative or other project planning document.
  • You may wish to add arrows to connect specific activities to specific outputs or outcomes.  However, if you find that all activities are leading to all outcomes (and that is actually how the project is intended to work), there is no need to clutter your model with arrows leading everywhere.
  • Use a consistent font and font size.
  • Align, align, align! Alignment is one of the most important design principles. When logic model elements are out of alignment, the model can look messy and unprofessional.
  • Don’t worry if your logic model doesn’t capture all the subtle nuances of your project. It should provide an overview of what a project does and is intended to accomplish and convey a clear logic as to how the pieces are connected.  Your proposal narrative or project plan is where the details go.

Download the template from http://bit.ly/lm-temp.

Newsletter: Bridging the Gap: Using Action Plans to Facilitate Evaluation Use

Posted on April 1, 2016 in Newsletter

Senior Research Associate, The Evaluation Center at Western Michigan University

NSF requires evaluation, in part, because it is an essential tool for project improvement. Yet all too often, evaluation results are not used to inform project decision making. There is a gap in the project improvement cycle between dissemination of evaluation results and decision making. One way to bridge this gap is through use of an action plan for project improvement informed by evaluation findings, conclusions, and recommendations. The United Nations Development Programme’s (UNDP) “Management Response Template” (http://bit.ly/undp-mrt) provides a format for such an action plan. The template was created to encourage greater use of evaluation by projects. UNDP’s template is designed for use in international development contexts, but could be used for any type of project, including ATE centers and projects.

The template is organized around evaluation recommendations or issues: any important issue that emerged from an evaluation would be an appropriate focus for action planning. The form allows for documentation of evaluation-based decisions and tracking of their implementation. Including a time frame, responsible party, and status for each key action encourages structure and accountability around the use of evaluation results for project improvement.
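As a hypothetical illustration (the wording and entries below are our own invention, not UNDP’s), a completed portion of such an action plan might read:

Evaluation recommendation: Strengthen recruitment of women into the technician program.
Management response: Accepted; recruitment materials and outreach will be revised.
Key action 1: Revise marketing materials to feature current female students. Time frame: Fall semester. Responsible: Outreach coordinator. Status: In progress.
Key action 2: Establish peer mentoring for incoming female students. Time frame: Spring semester. Responsible: Program chair. Status: Not yet started.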


Newsletter: Communicating Results from Prior NSF Support

Posted on January 1, 2016 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

The ATE proposal deadline in early October is many months away, but if you are submitting for new funding this year, now is the time to reflect on your project’s achievements and make sure you will be able to write a compelling account of your current or past project’s results as they relate to the NSF review criteria of Intellectual Merit and Broader Impacts. A section titled Results from Prior NSF Support is required whenever a proposal PI or co-PI has received NSF funding in the past five years. A proposal may be returned without review if it does not use the specific headings of “Intellectual Merit” and “Broader Impacts” when presenting results from prior support.

Given that these specific headings are required, you should have something to say about your project’s achievements in these distinct areas. It is OK for some projects to emphasize one area over another (Intellectual Merit or Broader Impacts), but grantees should be able to demonstrate value in both areas. Descriptions of achievements should be supported with evidence. Bold statements about a proposed project’s potential broader impacts, for example, will be more convincing to reviewers if the proposer can describe tangible benefits of previously funded work.

To help with this aspect of proposal development, EvaluATE has created a Results from Prior NSF Support Checklist (see http://bit.ly/prior-check). This one-page checklist covers the NSF requirements for this section of a proposal, as well as our additional suggestions for what to include and how to present it.

Two EvaluATE blog posts include additional guidance in this area: Amy Germuth (http://bit.ly/ag-reapply) offers specific guidance regarding wording and structure, and Lori Wingate (http://bit.ly/nsf-merit) shares tips for assessing the quality and quantity of evidence of a project’s Intellectual Merit and Broader Impacts, with links to helpful resources.

The task of identifying and collecting evidence of results from prior support should not wait until proposal writing time. It should be embedded in a project’s ongoing evaluation.

Newsletter: Creating an Evaluation Scope of Work

Posted on October 1, 2015 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

One of the most common requests we get at EvaluATE is for examples of independent contractor agreements and scope of work statements for external evaluators. First, let’s be clear about the difference between these two types of documents.

An independent contractor agreement is typically 90 percent boilerplate language required by your institution. Here at Western Michigan University, contracts are run through one of several offices (Business Services, Research and Sponsored Programs, Grants and Contracts, or Purchasing), depending on the type of contract and the nature of the work or service. We can’t tell you the name of the corresponding office at your institution, but there definitely is one, and it probably has boilerplate contract forms that you will need to use.

A scope of work statement should be attached to and referenced by the independent contractor agreement (or other type of contract). Unlike the contract, though, it should be written not in legalese but in plain language understandable to all parties involved. The key issues to cover in a scope of work statement include the following:

Evaluation questions (or objectives): Including information about the purpose of the evaluation is a good reminder to those involved about why the evaluation is being done. It may serve as a useful reference down the road if the evaluation starts to experience scope creep (or shrinkage).

Main tasks and deliverables (with timelines or deadlines): This information should make clear what services and products the evaluator will provide. Common examples include a detailed evaluation plan (what was included in your proposal probably doesn’t have enough detail), data collection instruments, reports, and presentations.

It’s critical to include timelines (generally when things will occur) and deadlines (when they must be finished) in this statement.

Conditions for payment: You most likely specified a dollar amount for the evaluation in your grant proposal, but you probably do not plan on paying it in a lump sum at the beginning or end of the evaluation, or even yearly. Specify in what increments payments will be made and what conditions must be met for payment. Rather than tying payments to particular dates, consider making them contingent on the completion of certain tasks or deliverables.
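As a hypothetical illustration (the amounts and milestones are invented, not a recommended standard), a $20,000 annual evaluation contract might be structured as:

  • 25% upon delivery of the detailed evaluation plan
  • 25% upon completion of spring data collection and delivery of the data collection instruments
  • 50% upon delivery and presentation of the annual evaluation report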

Be sure to come to agreement on these terms in collaboration with your evaluator. This is an opportunity to launch your working relationship from a place of open communication and shared expectations.

Newsletter: Data Collection Planning Matrix

Posted on July 1, 2015 in Newsletter

The part of your proposal’s evaluation plan that reviewers will probably scrutinize most closely is the data collection plan. Given that the evaluation section of a proposal is typically just 1-2 pages, you have minimal space to communicate a clear plan for gathering evidence of your project’s quality and impact. An efficient way to convey this information is in a matrix format. To help with this task, we’ve created a Data Collection Planning Matrix, available from bit.ly/data-matrix.

This tool prompts the user to specify the evaluation questions that will serve as the foundation for the evaluation; what indicators¹ will be used to answer each evaluation question; how data for each indicator will be collected, from what sources, by whom, and when; and how the data will be analyzed. (The document includes definitions for each of these components to support shared understandings among members of the proposal development team.) Including details about data collection in your proposal shows reviewers that you have been thoughtful and strategic in determining how you will build a body of evidence about the effectiveness and quality of your NSF-funded work. The value of putting this information in a matrix format is that it ensures you have a clear plan for gathering data that will enable you to fully address all the evaluation questions and, conversely, that all the data you plan to collect will serve a specific purpose.
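To make this concrete, one row of a completed matrix for a hypothetical project might read (all entries are invented for illustration):

Evaluation question: To what extent did the professional development workshops increase faculty knowledge of photonics?
Indicator: Change in scores on a workshop content assessment
Data collection method: Pre/post assessment | Data source: Workshop participants | Collected by: External evaluator | When: First and last workshop sessions
Analysis: Comparison of participants’ pre- and post-assessment scores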

A good rule of thumb is to develop at least one overarching evaluation question for each main element of a project logic model (i.e., activities, outputs, and short-, mid-, and long-term outcomes). Although not required for ATE program proposals, logic models are an efficient way to convey how your project’s activities and products will lead to intended outcomes. The evaluation’s data collection plan should align clearly with your project’s activities and goals, whether you use a logic model or not. If you are interested in developing a logic model for your project and want to learn more, see our ATE Logic Model Template at bit.ly/ate-logic.
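For instance, a project might pose one question per logic model element (the questions below are invented illustrations):

Activities: Were the workshops and course development efforts implemented as planned?
Outputs: Did the project produce the intended courses, materials, and number of trained participants?
Short-term outcomes: To what extent did participants gain new knowledge and skills?
Mid-term outcomes: Are participants applying what they learned in their classrooms or workplaces?
Long-term outcomes: Is the project contributing to a larger and better-prepared technician workforce?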

If you have questions about the data collection planning matrix or the logic model template, or suggestions for improving them, let us know: email us at info@evalu-ate.org.

¹ For more on indicators and how to select ones that will serve your evaluation well, see Goldie MacDonald’s checklist, Criteria for Selection of High-Performing Indicators, available from bit.ly/indicator-eval.

Newsletter: Project Resumes: Simple, Effective Tools for Accountability

Posted on April 1, 2015 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

At EvaluATE’s first-ever National Visiting Committee meeting in 2009, Dr. Nick Smith (NVC chair) recommended that we develop a “project vita” as a convenient way to update the NVC on our center’s activities. He pointed us to a paper he coauthored in which he described the process, uses, and benefits of developing and maintaining a project vita (see bit.ly/project-resume). With this nudge, we developed our first center vita (although we call it a resume now) and have kept it updated and posted on our website ever since. We have long advocated for other ATE projects and centers to develop their own, using ours as a model if they wish (see evalu-ate.org/about/resume). We heartily concur with Smith and Florini’s statement that “few management and evaluation techniques seem as simple and effective as the project vita—surely a tool with those characteristics is worth sharing with a broader professional audience.”

Like your own resume or curriculum vita, a project resume conveys past accomplishments and capacity for future work. As such, directing proposal reviewers to an online resume is a quick way to share evidence of past performance. It also comes in handy at annual reporting time because it lists all major project activities, key personnel, collaborators, and products in one place. Checking your resume is much more efficient than retrospectively documenting a record of a year’s worth of work.

EvaluATE’s Emma Perk has developed a Project Resume Checklist as a step-by-step guide for project PIs (or others) to develop their own project resumes—check it out at bit.ly/resume-checklist and join us at our next webinar on May 13 to learn more (see p. 4).

Newsletter: Secondary Data Resources

Posted on January 1, 2015 in Newsletter

Doctoral Associate, EvaluATE, Western Michigan University

It’s easier than ever to access national-level data that may be useful to ATE projects and centers for planning, benchmarking, or evaluation. A few of these resources are listed below:

The National Center for Education Statistics (nces.ed.gov) Digest of Education Statistics provides information about U.S. students and education institutions. The Integrated Postsecondary Education Data System (IPEDS) is a tool that focuses on postsecondary students and institutions. The NCES website also contains a variety of other education data from K-12 schools and 2- and 4-year colleges.

American FactFinder (factfinder.census.gov) provides access to all variables collected by the U.S. Census and, on a more regular basis, the American Community Survey.

The Bureau of Labor Statistics (bls.gov) provides access to data on employment, earnings, and industry activity.

The National Student Clearinghouse (studentclearinghouse.org) offers a subscription-based service for tracking students’ enrollment and degree history. Your institution may already subscribe to this service.

College institutional research offices often collect information about students, including demographics, enrollment, course completions, and grades. Building a relationship with this office at your institution could help you access a range of individual student data.

Shameless self-promotion aside, EvaluATE (evalu-ate.org/annual_survey) administers an annual survey to all ATE grantees, collecting data that can be used for benchmarking against the ATE community as a whole or just within your discipline.

Newsletter: Beyond Reporting: Getting More Value Out of Your Evaluation

Posted on October 1, 2014 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

All ATE projects are required to have an evaluation of their work, and results are expected to be included in annual reports to NSF. But if that’s all a project is using its evaluation for, it’s probably not bringing a lot of value to the grant work. In our webinar, The Nuts and Bolts of ATE Evaluation Reporting, we presented ways evaluation results can be used beyond reporting to NSF. In this article, we share ways to use evaluation results for project improvement. For more details on other uses, check out the segment of the webinar at http://bit.ly/webinar-clip.

Using evaluation results to help improve your project requires more than just accepting the evaluator’s recommendations. Project team members should take time to delve into the evaluation data on their own. For example, read every comment in your qualitative data. Although you should avoid getting caught up in the less favorable remarks, they can be a valuable source of information about ways you might improve your work. Take time to consider the remarks that surprise you—they may reveal a blind spot that needs to be investigated. But don’t forget to pat yourself on the back for the stuff you’re already getting right.

Although it’s important to find out if a project is effective overall, it can be very revealing to disaggregate results by participant characteristics, such as gender, age, discipline, or enrollment status. If you find that some groups are getting more out of their experience with the project than others, you have an opportunity to adjust what you’re doing to better meet your intended audience’s needs.
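For projects whose participant data are already in electronic form, producing disaggregated results can be straightforward. Below is a minimal sketch in Python using the pandas library, assuming a hypothetical CSV file of survey responses with invented column names (“gender,” “enrollment_status,” and a 1-5 “satisfaction” rating); it illustrates the idea and is not part of the webinar materials:

    import pandas as pd

    # Load survey responses; the file name and column names are
    # hypothetical placeholders for your own evaluation data.
    responses = pd.read_csv("participant_survey.csv")

    # Overall result for all participants combined.
    print("Overall mean satisfaction:", round(responses["satisfaction"].mean(), 2))

    # The same result, disaggregated by each participant characteristic.
    for characteristic in ["gender", "enrollment_status"]:
        summary = responses.groupby(characteristic)["satisfaction"].agg(["mean", "count"])
        print(f"\nSatisfaction by {characteristic}:")
        print(summary.round(2))

The same grouped-summary approach works for any characteristic you collect, and reporting group counts alongside means helps flag subgroups too small to interpret.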

The single most important thing you can do to maximize an evaluation’s potential to bring value to your project is to make time to meet with your evaluator, review results with your project colleagues and advisors, and make decisions about how to move forward based on findings. ATE grantees are awarded about $60 million annually by the federal government. We have an ethical obligation to be self-critical, use all available information sources to assess progress and opportunities for improvement, and utilize project evaluations to help us achieve excellence in all aspects of our work.

Newsletter: Tools to Prepare a Data Management Plan for NSF

Posted on July 1, 2014 in Newsletter

EvaluATE Blog Editor

NSF requires that ALL proposals include a data management plan (DMP); FastLane will not accept submissions without one. The DMP must detail “how you will conform to NSF policy on the dissemination and sharing of research results.” The term “research results” covers essentially any information collected or produced as a result of your project. Therefore, the DMP must describe what data you will collect and how you will collect, maintain, report, and disseminate those data, as well as other resources generated by your grant. While NSF does outline requirements for what should be included in a DMP (bit.ly/dmp-ehr), it does not tell you how to write one. There are, however, a handful of resources that can help.
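As a starting point, one way to structure a DMP is around the five elements NSF’s guidance describes (our paraphrase, with hypothetical ATE-flavored examples, not official language):

1. Types of data and materials produced: e.g., survey responses, enrollment records, curriculum modules
2. Standards for data and metadata format and content: file formats, naming conventions, codebooks
3. Policies for access and sharing: who may access what, and how privacy and confidentiality will be protected
4. Policies for re-use and redistribution: the terms under which others may use or adapt project materials
5. Plans for archiving and preservation of access: where data and resources will be stored long term (e.g., ATE Central’s archive)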

The University of Wisconsin Research Data Services Unit has a webpage that provides several links to resources (http://researchdata.wisc.edu/), and the University of Michigan features extensive guidance, including templates and worksheets (bit.ly/um-dmp). The University of Minnesota also offers several resources for DMP development (bit.ly/umn-dmp).

One other tool that can be helpful is the DMP Tool available at DMPTool.org. You fill out the plan as you go through the tool, and you can save plans as well. The tool provides extensive guidance on DMP development, with instructions for each part of the plan, guidance on how to fill out the sections, and helpful links. ATE Central includes guidance, resources, and an example plan in their handbook, available at atecentral.net/handbook, and also provides archive services for resources produced by ATE projects and centers (which supports sustainability). A new requirement in the 2014 ATE program solicitation is that grantees “must provide copies of [their] resources to ATE Central for archiving purposes.”

If you can demonstrate that you followed the data management plan from a prior grant, and that you have provided access to the information and resources your project or center generated, you can even use this information in the Results from Prior NSF Support section of your next proposal.

Newsletter: Stakeholder Engagement: What Does it Mean for Your Project?

Posted on April 1, 2014 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

Who are the stakeholders in your project evaluation? How should they be engaged in the evaluation? An evaluation stakeholder is anyone who is involved in or affected by a project or its evaluation—from the student experiencing a new curriculum to an NSF program officer monitoring a project’s progress. Engagement can be anything from serving as an information source for the evaluation to participating in data interpretation and recommendation development. With such broad definitions, it can be difficult to figure out the right mix of whom to involve in evaluation activities and how.

We have created a new resource to support reflection and decision making around this issue. The Identifying Stakeholders and their Role in an Evaluation worksheet presents a series of prompts to help PIs and evaluators move from thinking generically about stakeholder engagement to identifying specific individuals and the type of involvement best suited to them.

Involving stakeholders in a project’s evaluation has many benefits. For example, when stakeholders are engaged in various aspects of an evaluation, it usually increases the evaluation’s relevance and usefulness to the project. When key stakeholders demonstrate support for the evaluation, it may enhance cooperation with data collection. Stakeholders’ knowledge of a project’s context and content typically exceeds that of an external evaluator; that knowledge can be tapped for myriad purposes throughout an evaluation. But stakeholder engagement is not a one-size-fits-all activity. It’s not necessary—and rarely feasible—to involve all stakeholders to the same degree in an evaluation. Maybe some just need to be kept abreast of evaluation activities, while others should take a more active role in decision making. The worksheet is intended to help you figure out what stakeholder engagement should look like in your project.

Download the Identifying Stakeholders and their Role in an Evaluation worksheet from EvaluATE’s website.