Archive: ATE

Blog: Successful Practices in ATE Evaluation Planning

Posted on July 19, 2018 in Blog

President, Mullins Consulting, Inc.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

In this essay, I identify what helps me create a strong evaluation plan when working with new Advanced Technological Education (ATE) program partners. I hope my notes add value to current and future proposal-writing conversations.

Become involved as early as possible in the proposal-planning process. With ATE projects, as with most evaluation projects, the sooner an evaluator is included in the project planning, the better. Even if the evaluator just observes the initial planning meetings, that involvement builds familiarity with the project’s framework, the community partnerships, and the way in which project objectives are taking shape. It also acquaints the evaluator with the language used to frame project components and with the new or established relationships expected for project implementation.

Get to know your existing and anticipated partners. Establishing or strengthening partnerships is a core component of ATE planning, as ATE projects often engage with multiple institutions through the creation of new certifications, development of new industry partnerships, and expansion of outreach efforts in public schools. The evaluator should take detailed notes on the internal and external partnerships involved with the project. Sometimes, to support my own understanding as an evaluator, it helps to visually map these relationships. The evaluator should also prepare for the unexpected: partners can change during the planning process as partner roles and program purposes become more clearly defined.

Integrate evaluation thinking into conversations early on. Once the team gets through the first couple of proposal drafts, it helps if the evaluator creates an evaluation plan and the team makes time to review it as a group. This will help the planning team clarify the evaluation questions to be addressed and outcomes to be measured. This review also allows the team to see how their outcomes can be clearly attached to program activities and measured through specific methods of data collection. Sometimes during this process, I speak up if a component could use further discussion (e.g., cohort size, mentoring practices). If an evaluator has been engaged from the beginning and has gotten to know the partners, they have likely built the trust necessary to add value to the discussion of the proposal’s central components.

Operate as an illuminator. A colleague I admire once suggested that evaluation be used as a flashlight, not as a hammer. This perspective of prioritizing exploration and illumination over determination of cause and effect has informed my work. Useful evaluations certainly require sound evaluation methodology, but they also require the crafting of results into compelling stories, told with data guiding the way. This requires working with others as interpretations unfold, discovering how findings can be communicated to different audiences, and listening to what stakeholders need to move their initiatives forward.

ATE programs offer participants critical opportunities to be a part of our country’s future workforce. Stakeholders are passionate about their programs. Careful, thoughtful engagement throughout the proposal-writing process builds trust while contributing to a quality proposal with a strong evaluation plan.

Newsletter: Survey Says Summer 2015

Posted on July 1, 2015 in Newsletter

Doctoral Associate, EvaluATE, Western Michigan University

On average, ATE grantees spend 7 percent of their budgets on evaluation. Smaller projects spend smaller proportions of their awards on evaluation than larger projects. In this figure, grants are split into quartiles by the size of their annual budgets and the average budget allocation for evaluation is shown for each quartile.

[Figure 2015-Summer-Survey: average percentage of annual budget allocated to evaluation, by annual budget quartile]

For more ATE survey findings, visit www.evalu-ate.org/annual_survey.
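
For anyone who wants to reproduce this kind of breakdown with their own grant records, the computation is simple: express each grant's evaluation spending as a percentage of its annual budget, split the grants into quartiles by budget size, and average the percentages within each quartile. Below is a minimal Python (pandas) sketch of that calculation; the dollar figures are made-up placeholders, not actual ATE survey data.

    # Illustrative sketch: placeholder numbers, not actual ATE survey data
    import pandas as pd

    grants = pd.DataFrame({
        "annual_budget":    [150_000, 220_000, 300_000, 480_000,
                             650_000, 900_000, 1_400_000, 2_000_000],
        "evaluation_spend": [6_000, 11_000, 15_000, 29_000,
                             42_000, 63_000, 112_000, 170_000],
    })

    # Evaluation spending as a percentage of each grant's annual budget
    grants["eval_pct"] = grants["evaluation_spend"] / grants["annual_budget"] * 100

    # Quartile membership by annual budget size (Q1 = smallest awards)
    grants["budget_quartile"] = pd.qcut(
        grants["annual_budget"], 4, labels=["Q1", "Q2", "Q3", "Q4"]
    )

    # Average evaluation share within each budget quartile
    print(grants.groupby("budget_quartile", observed=True)["eval_pct"].mean().round(1))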

Report: An Exploratory Test of a Model for Enhancing the Sustainability of NSF’s Advanced Technological Education (ATE) Program

Posted on February 25, 2015 in Resources

The purpose of this research is to examine the effectiveness of a model that purports to improve the sustainability of ATE projects and centers. According to Lawrenz, Keiser, & Lavoie (2003), several models for sustainability have been proposed in the organizational change literature. However, for the most part, the models are advocacy statements based on author experience rather than on empirical studies. These authors concluded there was little research directly related to sustainability.

Type: Report
Category: ATE Research & Evaluation
Author(s): Wayne Welch

Blog: Reflections on the 2014 ATE PI Conference

Posted on November 6, 2014 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Well, the 2014 ATE PI conference has come and gone. First, let us say thank you to AACC for again hosting a wonderful conference. It was great to meet and share ideas with such an amazing group of professionals. We truly look forward to this conference every year. This blog entry contains the EvaluATE team’s reflections on our personal highlights from this year’s conference and hopes for next year.

Jason

Robots! Guitars! Technology in action! My biggest highlight was getting to see the variety of work that is being done within the ATE program. It can be easy in our day-to-day work to forget how important the work of the ATE program is, but we are truly at the forefront of technological education in the United States. I also enjoy getting to see people that I know from within the program. I hope that next year continues to see even more new amazing tech bits!

Lori

For me, the highlight of the conference was when ATE Program Co-Lead David Campbell quoted EvaluATE’s Emma Perk to a room of 100+ people: “The most important purpose of evaluation is not to prove, but to improve.” She shared this quotation from Daniel Stufflebeam in her portion of the Getting Started workshop the day before. (View Emma and Jason’s Getting Started slides). At next year’s conference, I hope there will be more presentations in the research and evaluation conference track. Participants in the preconference workshop on evaluation appreciated hearing about real-world evaluations and practical tips from seasoned ATE evaluators.  We need more of this at every ATE PI conference! (Check out the workshop slides by Candiya Mann, Amy Nisselle, and Bruce Nash).

Emma

This year was my first time attending the ATE PI conference. The showcase sessions were the highlight of the conference for me. I really enjoyed interacting with the different PIs and staff from all the projects and centers. It was great to learn more about the ATE community and how we can expand on what we offer to them as a resource center. EvaluATE’s showcase booth was situated between ATE Central and Mentor-Connect, so we were able to reinforce our great relationship with them and refer people to their useful resources. My hope for next year is to do an evaluation session or roundtable, focusing on identifying the needs of the ATE community.

Corey

Unfortunately, I was unable to be at the ATE conference this year, so I missed the opportunity to put faces to names. As the annual survey coordinator, I communicate with many of you over the course of the year, and the conference is a welcome chance to meet some of you face-to-face. I enjoy being able to talk in person with individuals about the ATE annual survey, to hear concerns, listen to suggestions, and talk data. If you didn’t see the latest reports based on the 2014 survey—like our data snapshots on the representation of women and underrepresented minorities in ATE—check them out here: http://www.evalu-ate.org/annual_survey/

 

We look forward to seeing you all at the conference next year! For more highlights from this year’s conference, including pictures, please visit our

Newsletter: 20 Years of ATE Evaluation

Posted on October 1, 2013 in Newsletter

Evaluation has been required of ATE projects and centers since the program began in 1993. Many early evaluations were concerned more with the numbers of students and faculty impacted than with the effectiveness of the intervention. The sophistication of evaluation expectations has increased over time. Early in the program, there was a shortage of evaluators who understood both the disciplinary content and the methods of evaluation. Through a separate grant, Arlen Gullickson at the Western Michigan University Evaluation Center provided an internship program in which novice evaluators spent six months evaluating a component of an ATE project. Several ATE evaluators got their start in this program, and several PIs learned what evaluation could do for them and their projects.

The ATE program responded to the Government Performance and Results Act by developing a project monitoring survey that provided a snapshot of the program. The survey is still administered annually by EvaluATE (see p. 3). Although this monitoring system also emphasized “body counts,” over time the survey was modified with the input of program officers to include questions that encouraged evaluation of project effectiveness.

For example, following the ideas of the Kirkpatrick model of evaluation, questions asked whether the project’s evaluation investigated the extent to which participants in professional development actually implemented the content correctly and what the resulting impact was on student learning. Even so, the evaluations reported in renewal proposals still concentrate on “body counts,” prompting proposal reviewers to ask, “What happened as a result?” To develop project evaluations that could be aggregated to determine how the ATE program was meeting its goals, a workshop was held with evaluators from centers. The participants suggested that projects could be evaluated along eight dimensions: impact on students, faculty, the college, the community, industry, interaction among colleges, the region, and the nation. A review of several project and center annual reports found that all categories were addressed and very few items could not be accommodated in this scheme.

Following the evaluation in NSF’s Math and Science Partnership program, I have encouraged project and center leaders to make a FEW claims about the effectiveness of their projects. The evaluator should then provide evidence for the extent to which the claims are justified. This view is consistent with the annual report template in Research.gov, which asks for the major goals of the project. It also limits summative evaluation to a few major issues. Much of the emphasis, both here and in general, has been on summative evaluation focused on impact and effectiveness. However, projects should also engage in formative evaluation to inform project improvements. This requires a short feedback cycle that is usually not possible with only external evaluation. An internal evaluator working with an external evaluator may be useful for collecting data and providing timely feedback to the project. A grant has recently been awarded to strengthen the practice and use of formative evaluation by ATE grantees. Arlen Gullickson, EvaluATE’s co-PI, is leading this work in cooperation with EvaluATE.

Gerhard Salinger is a founding program officer of the ATE program. The ideas expressed here are his alone and may not reflect the views of the National Science Foundation.

Newsletter: ATE Sustainability

Posted on October 1, 2013 in Newsletter

Sustainability is about ensuring that at least some aspects of a project or center’s work—such as faculty positions, partnerships, or curricula—have “a life beyond ATE funding” (nsf.gov/ate). By definition, sustainability “happens” after NSF funding ends—and thus, after the project or center’s evaluation has concluded. So how can sustainability be addressed in an evaluation? There are three sources of information that can help with a prospective assessment of sustainability, whether for external evaluation purposes or to support project planning and implementation:

(1) Every ATE proposal is supposed to include a sustainability plan that describes what aspects of the grant will be sustained beyond the funding period and how.

(2) Every proposal submitted in 2012 or later required a data management plan. This plan should have described how the project’s data and other products would be preserved and made available to others. Both the sustainability and data management plans should be reviewed to determine if the project will be able to deliver on what was promised.

(3) Developed by Wayne Welch, the Checklist for Assessing the Sustainability of ATE Projects and Centers can be used to determine a project’s strengths and weaknesses in regard to sustainability. The checklist addresses diverse dimensions of sustainability related to program content and delivery, collaboration, materials, facilities, revenue, and other issues. See bit.ly/18l2Fcb.