Newsletter - Spring 2015

Newsletter: Why Does the NSF Worry about Project/Center Evaluation?

Posted on April 1, 2015

Lead Program Director, ATE, National Science Foundation

I often use a quick set of questions that Dr. Gerhard Salinger developed in response to the question, “How do you develop an excellent proposal?” Question 4 is especially relevant to the issue of project/center evaluation:

  1. What is the need that will be addressed?
  2. How do you specifically plan to address this need?
  3. Does your project team have the necessary expertise to carry out your plan?
  4. How will you know if you succeed?
  5. How will you tell other people about the results and outcomes?

Question 4 addresses the evaluation activities of a project or center, and I hope you consider evaluation essential to conducting an effective and successful project. Formative assessment guides you and lets you know whether your strategy is working; it gives you the information to shift strategies if needed. A summative assessment then provides you and others with information on whether the overall project goals and objectives were achieved. Evaluation adds the concept of value to your project. For example, evaluation activities might tell you how participants perceived the value of a workshop, and follow-on evaluation activities might tell you how many faculty used what they learned in a course. A final step might be to evaluate the impact on student learning following the course change.

As a program officer, I can quickly scan the project facts (e.g., how many of this or that), but I tend to spend much more time on the evaluation data, as they provide the value component of your project activities. Let’s go back to the faculty professional development workshops. Program officers definitely want to know whether the workshops were held and how many people attended, but it is essential to provide information on the value of the workshops. It’s great to know that faculty “liked” a workshop, but of greater importance is its impact on their teaching practices and on student learning. Your annual reports (yes, we do read them carefully) can include the entire evaluation report as an attachment, but it would be really helpful if you, the PI, provided an overview of your project’s added value within the body of the report.

There are several reasons evaluation information is important to NSF program officers. First, each federal dollar that you expend carrying out your project is one that the taxpayers expect both you and the NSF to be accountable for. Second, within the NSF, program portfolios are scrutinized to determine programmatic impact and effectiveness. Third, the ATE program is congressionally mandated and program data and evaluation are often used to respond to congressional questions. Put more concisely, NSF wants to know if the investment in your project/center was a wise one and if value was generated from this investment.

Newsletter: Survey Says


Doctoral Associate, EvaluATE, Western Michigan University

Each year, ATE PIs are asked what types of reports their evaluators provide and how they use the information. The majority of ATE PIs receive both oral and written reports from their evaluators.

[Figures: types of evaluation reports PIs receive from evaluators (left) and rates of evaluation use by report type (right)]

PIs who receive reports in both oral and written forms report higher rates of evaluation use, as shown in the figure on the right, above.

You can find more at evalu-ate.org/annual_survey/

Newsletter: Dashboards


EvaluATE Blog Editor

Dashboards are a way to present data about the “trends of an organization’s key performance indicators.”1 They are designed to give decision makers real-time information about important trends and outcomes related to key program activities. Think of a car’s dashboard: it tells you how much gas the car has, the condition of the engine, and your speed, all of which lets you pay more attention to what is going on around you. Dashboards work best by combining data from a number of sources into one document (or web page) that focuses the user on the “big picture” and keeps them from getting lost in the details. For example, a single dashboard could present data on event attendance, participant demographics, web analytics, and student outcomes, giving the user important information about project reach as well as potential avenues for growth.
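The combining step can be sketched in a few lines of Python. All data sources, field names, and numbers below are hypothetical stand-ins; a real dashboard would pull from a registration system, a web analytics service, and student records.

```python
# Hypothetical data from separate sources; names and numbers are
# illustrative only, not drawn from any real project.
events = [
    {"name": "Workshop A", "attendees": 42},
    {"name": "Workshop B", "attendees": 35},
]
web_analytics = {"monthly_visits": 1800, "resource_downloads": 240}
student_outcomes = {"completed_course": 58, "enrolled": 70}

def build_dashboard(events, web, outcomes):
    """Merge data from several sources into one 'big picture' summary."""
    return {
        "events_held": len(events),
        "total_attendance": sum(e["attendees"] for e in events),
        "monthly_visits": web["monthly_visits"],
        "resource_downloads": web["resource_downloads"],
        "completion_rate": round(
            outcomes["completed_course"] / outcomes["enrolled"], 2
        ),
    }

for metric, value in build_dashboard(events, web_analytics, student_outcomes).items():
    print(f"{metric}: {value}")
```

The design choice is the point: each source keeps its own shape, and one small function reduces everything to the handful of indicators a decision maker actually scans.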

As a project or center’s complexity increases, it’s easy to lose sight of the big picture. By using a dashboard that is designed to integrate many pieces of information about the project or center, staff and stakeholders can make well-balanced decisions and can see the results of their work in a more tangible way. Evaluators can also take periodic readings from the dashboard to inform their own work, providing formative feedback to support good decisions.

For some real-world examples, check out bit.ly/db-examples

1 bit.ly/what-is-db

Newsletter: How is an NSF Project Outcomes Report Different from a Final Annual Report?


Director of Research, The Evaluation Center at Western Michigan University

All NSF projects awarded in January 2010 or later are required to submit a project outcomes report within 90 days of the grant’s expiration, along with a final annual report. In addition to the fact that a project outcomes report is a few paragraphs (200-800 words) and annual reports are typically several pages long, there are three other ways a project outcomes report is distinct from a final annual report.

1. A project outcomes report is solely about outcomes. A final annual report addresses many other topics. Project outcomes reports should describe what a project developed and the changes it brought about with regard to advancing knowledge (intellectual merit) and contributing to desired social outcomes (broader impacts). The focus should be on products and results, not project implementation. Publications are important evidence of intellectual merit, and a list of publications will be generated automatically from the project’s annual reports submitted to Research.gov. Other products generated with grant funds should be listed, such as data sets, software, or educational materials. If these products are available online, links may be provided.1 An accounting of grant products demonstrates a project’s productivity and intellectual merit. To address the project’s broader impacts, reports should highlight achievements in areas such as increasing participation in STEM by underrepresented minorities, improving teaching and learning, and developing the technical workforce.

2. A project outcomes report provides a “complete picture of the results” of a project.2 A final annual report covers the last year of the project only. A project outcomes report is not a progress report. It is the final word on what a project achieved and produced. PIs should think carefully about how they want their work to be portrayed to the public for decades to come and craft their reports accordingly. Dr. Joan Strassman of Washington University provides this cogent advice about crafting outcomes reports:

[A project outcomes report] is where someone … can go to see where NSF is spending its tax dollars. This document is not the plan, not the hopes, but the actual outcomes, so this potential reader can get direct information on what the researcher says she did. It pulls up along with the original funding abstracts, so see to it they coordinate as much as possible. Work hard to be clear, accurate, and compelling. (Read more at bit.ly/blog-POR)

3. A project outcomes report is a public document.3 A final annual report goes to the project’s NSF program officer only. A big difference between these audiences is that a project’s program officer probably has expertise in the project’s content area and is certainly familiar with the overall aims of the program through which the project was funded. For the benefit of lay readers, project outcomes report authors should use plain language to ensure comprehension by the general public (see plainlanguage.gov). Authors may check the report’s readability by having a colleague from outside the project’s content area review it. It’s important to include complete, yet succinct documentation that is readily understandable by individuals outside the project’s content area.

1 ATE grants awarded in 2014 or later are required to archive their materials with ATE Central.

2 For NSF’s guidelines regarding project outcomes reports, see bit.ly/POR-FAQs.

3 To access ATE project outcomes reports: (1) Go to bit.ly/NSF-POR (2) Enter “ATE” in the keyword box; (3) Check the box for “Show Only Awards with Project Outcomes Reports.”
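A colleague’s review is the best plain-language check, but a quick automated estimate can supplement it. The sketch below uses the standard Flesch Reading Ease formula; the syllable counter is a crude vowel-group heuristic of my own, so treat the scores as ballpark figures only.

```python
import re

def rough_syllables(word):
    # Crude heuristic: count vowel groups; real syllabification is harder.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease: roughly, 60+ is plain enough for a general
    audience; long sentences and long words drive the score down."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (sum(rough_syllables(w) for w in words) / len(words)))

# Short words and short sentences score high; jargon scores low.
print(round(flesch_reading_ease("The cat sat on the mat."), 1))
print(round(flesch_reading_ease(
    "Utilization of multisyllabic terminology diminishes comprehensibility."), 1))
```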

Newsletter: Project Resumes: Simple, Effective Tools for Accountability


Director of Research, The Evaluation Center at Western Michigan University

At EvaluATE’s first-ever National Visiting Committee meeting in 2009, Dr. Nick Smith (NVC chair) recommended that we develop a “project vita” as a convenient way to update the NVC on our center’s activities. He pointed us to a paper he coauthored that describes the process, uses, and benefits of developing and maintaining a project vita (see bit.ly/project-resume). With this nudge, we developed our first center vita (although we call it a resume now) and have kept it updated and posted on our website ever since. We have long encouraged other ATE projects and centers to develop their own, using ours as a model if they wish (see evalu-ate.org/about/resume). We heartily concur with Smith and Florini’s statement that “few management and evaluation techniques seem as simple and effective as the project vita—surely a tool with those characteristics is worth sharing with a broader professional audience.”

Like your own resume or curriculum vita, a project resume conveys past accomplishments and capacity for future work. As such, directing proposal reviewers to an online resume is a quick way to share evidence of past performance. It also comes in handy at annual reporting time because it lists all major project activities, key personnel, collaborators, and products in one place. Checking your resume is much more efficient than retrospectively documenting a record of a year’s worth of work.

EvaluATE’s Emma Perk has developed a Project Resume Checklist as a step-by-step guide for project PIs (or others) to develop their own project resumes. Check it out at bit.ly/resume-checklist and join us at our next webinar on May 13 to learn more (see p. 4).

Newsletter: Project Spotlight: MatEdu


Principal Investigator, MatEdu, Edmonds Community College

Mel Cossette is principal investigator for the National Resource Center for Materials Technology Education at Edmonds Community College. MatEdU’s mission is to advance materials technology education nationally. 

Q: What advice would you give to a PI in their first reporting period?

A: First, check the Reporting section of Research.gov to confirm the annual report due date. Sometimes a first-time PI refers to their award date or start date, but it is the due date listed on this site that is critical. Second, connect with your evaluator and inform him or her of the report due date. This helps with the planning and writing processes and assists with identifying, early on, the information to be shared. This does not mean things cannot change, but it is essential that the evaluator and PI communicate.

Q: How do you use your evaluation results in your annual report to NSF?

A: Typically, we create a rough draft of the annual report from our perspective and share it with our evaluator, who reviews it and provides feedback. In the meantime, we continue building our report, paying attention to the different categories within it, such as accomplishments, significant activities, and products developed. During this time, our evaluator develops a draft report that is shared with us. Although the two reports have different focuses and formats, we compare their content, which helps us be succinct with the data and information the reports require. We find that this collaborative process helps keep our team focused on the task at hand.

Q: What are some things that make an evaluation report useful (from a PI’s perspective)?

A: Because the information comes from a semi-external perspective, we get the chance to compare the evaluation report on our activities, successes, areas that may need review, etc., against our activity timeline. This helps limit scope creep. Recommendations from our evaluator have also enabled us to identify a potential gap in our activities that needs to be addressed. PIs are usually completely focused on their projects and annual reports, so having an external evaluator point out successes, gaps, inconsistencies, and data points reinforces progress and project direction.