Webinar: Your ATE Proposal: Got Evaluation? (8/26/14)

Posted on August 26, 2014 in Webinars

Presenter(s): Asa Bradley, Gerhard Salinger, Krystin Martens, Lori Wingate, Terryll Bailey
Date(s): August 26, 2014
Time: 1:00 p.m. EDT
Recording: http://youtu.be/kpn1XtvVQ_A

A strong evaluation plan that is well integrated into your grant proposal will strengthen your submission and maybe even give you a competitive edge. In this webinar, long-time members of the ATE community will provide insights on ways to enhance your proposal and avoid common pitfalls with regard to evaluation. We’ll walk through EvaluATE’s Evaluation Planning Checklist for ATE Proposals, which provides detailed guidance on how to address evaluation throughout a proposal—from the project summary to the budget justification. We’ll share examples of how to incorporate results from previous evaluations in the Results of Prior NSF Support section, present a coherent evaluation plan linked to project activities and goals, and budget for an external evaluation (among other things). We’ll have plenty of time for questions and discussion with our knowledgeable and experienced panel.

Resources:
Slide PDF
Evaluation Planning Checklist for ATE Proposals
Tools to Prepare a Data Management Plan for NSF
ATE Central Video

Webinar: Your ATE Proposal: Got Evaluation? (8/20/14)

Posted on August 20, 2014 in Webinars

Presenter(s): Asa Bradley, Gerhard Salinger, Krystin Martens, Lori Wingate, Terryll Bailey
Date(s): August 20, 2014
Time: 1:00 p.m.
Recording: http://youtu.be/aGU9fgIOfYk

A strong evaluation plan that is well integrated into your grant proposal will strengthen your submission and maybe even give you a competitive edge. In this webinar, long-time members of the ATE community will provide insights on ways to enhance your proposal and avoid common pitfalls with regard to evaluation. We’ll walk through EvaluATE’s Evaluation Planning Checklist for ATE Proposals, which provides detailed guidance on how to address evaluation throughout a proposal—from the project summary to the budget justification. We’ll share examples of how to incorporate results from previous evaluations in the Results of Prior NSF Support section, present a coherent evaluation plan linked to project activities and goals, and budget for an external evaluation (among other things). We’ll have plenty of time for questions and discussion with our knowledgeable and experienced panel.

Resources:
Slide PDF
Evaluation Planning Checklist for ATE Proposals
Tools to Prepare a Data Management Plan for NSF
ATE Central Video

Newsletter: 20 Years of ATE Evaluation

Posted on October 1, 2013 in Newsletter

Evaluation has been required of ATE projects and centers since the program began in 1993. Many evaluations were concerned more with the numbers of students and faculty impacted than with the effectiveness of the intervention. The sophistication of evaluation expectations has increased over time. Early in the program, there was a shortage of evaluators who understood both the disciplinary content and the methods of evaluation. Through a separate grant, Arlen Gullickson at the Western Michigan University Evaluation Center provided an internship program in which novice evaluators spent six months evaluating a component of an ATE project. Several ATE evaluators got their start in this program, and several PIs learned what evaluation could do for them and their projects.

The ATE program responded to the Government Performance and Results Act by developing a project monitoring survey that provided a snapshot of the program. The survey is still administered annually by EvaluATE (see p. 3). Although this monitoring system also emphasized “body counts,” over time the survey was modified, with input from program officers, to include questions that encouraged evaluation of project effectiveness.

For example, following the ideas of the Kirkpatrick model of evaluation, questions asked whether the project’s evaluation investigated the extent to which participants in professional development actually implemented the content correctly, and the resulting impact on student learning. Even so, the evaluations reported in renewal proposals still concentrate on “body counts,” and proposal reviewers ask, “What happened as a result?” To develop project evaluations that could be aggregated to determine how the ATE program was meeting its goals, a workshop was held with evaluators from centers. The participants suggested that projects could be evaluated along eight dimensions: impact on students, faculty, the college, the community, industry, interaction among colleges, the region, and the nation. A review of several project and center annual reports found that all categories were addressed and that very few items could not be accommodated in this scheme.

Following the evaluation in NSF’s Math Science Partnerships program, I have encouraged project and center leaders to make a FEW claims about the effectiveness of their projects. The evaluator should then provide evidence of the extent to which those claims are justified. This view is consistent with the annual report template in Research.gov, which asks for the major goals of the project. It also limits summative evaluation to a few major issues. Much of the emphasis, both here and in general, has been on summative evaluation focused on impact and effectiveness. Projects should also engage in formative evaluation to inform project improvements. This requires a short feedback cycle that is usually not possible with only external evaluation. An internal evaluator working with an external evaluator may be useful for collecting data and providing timely feedback to the project. A grant has recently been awarded to strengthen the practice and use of formative evaluation by ATE grantees; Arlen Gullickson, EvaluATE’s co-PI, is leading this work in cooperation with EvaluATE.

Gerhard Salinger is a founding program officer of the ATE program. The ideas expressed here are his alone and may not reflect the views of the National Science Foundation.

Webinar: Claims + Evidence: Assessing ATE Grant Outcomes

Posted on March 16, 2011 in Webinars

Presenter(s): Gerhard Salinger, Judith Monsaas, Lori Wingate, Mark Viquesney, Stephanie Evergreen
Date(s): March 16, 2011
Recording: https://vimeo.com/21245222

The 2010 ATE program solicitation says that PIs “should establish claims as to the project’s effectiveness, and the evaluative activities should provide evidence on the extent to which the claims are realized.” This webinar will walk ATE evaluators and PIs through a five-step process, which includes
- identifying claims worthy of evaluative investigation
- defining how to measure impact in meaningful, yet practical ways
- determining how to make a strong case that the ATE project caused the observed impact
- setting up performance standards to aid in interpreting evaluation results

Resources:
Slide PDF
Handout PDF