Arlen Gullickson

Co-Principal Investigator, EvaluATE – Western Michigan University

One of four children, Arlen Gullickson was born and raised in a farming family in the state of Iowa. His education includes baccalaureate, master's, and Ph.D. degrees in mathematics, physics, and education, respectively. He has 30 years of teaching experience at the high school and college levels and, altogether, more than 40 years of experience working in education. In the past, Arlen was the director of The Evaluation Center and chair of the Joint Committee on Standards for Educational Evaluation. Currently, he is supposed to be retired, but he serves as a Co-Principal Investigator for EvaluATE (after serving as its PI) and fishes whenever he can.


Blog: Strengthening Post Hoc Professional Development Evaluations

Posted on February 11, 2015 by Arlen Gullickson in Blog

Co-Principal Investigator, EvaluATE – Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I am an educator; I’ve taught and worked in the field for more than 50 years. In recent years, much of my work has centered on dissemination of evaluation information to serve educators engaged in classroom teaching, as well as those evaluating education programs. My comments here pertain to professional development intended to enhance classroom instruction.

Results from EvaluATE’s annual surveys of ATE program grantees indicate that professional development (PD) providers within ATE do evaluate some aspects of their PD programs. However, most do not follow up to assess gains among their participants’ students. Our reflections and discussions with PIs and evaluators suggest a good reason for this shortcoming. It is costly in time and effort to do a post hoc evaluation with participants, and PIs cannot easily gain access to information about the students of PD participants. Also, the strictures on sharing student interest and achievement information are substantial. So an important question is, how can we manage our PD and evaluation to overcome these hurdles?

I think an important part of the answer involves engaging participant teachers in the assessment and evaluation processes. Such engagement requires willingness on their part; preparation and practice to develop adequate knowledge and skill; support and encouragement to do the work; and follow-up exchanges of feedback about individual and collective effects.

Here are five practices that I believe are associated with strong evaluations of PD programs intended to enhance classroom instruction. How many of these practices are part of your PD efforts? If you do not currently take these actions, give them a try. I’d appreciate your thoughts and suggestions once you try them.

At the time participants are recruited, they agree to provide post-PD feedback on:
1. the impact of the PD on their own instruction, and
2. the impact of the PD on their students' learning.

During the PD program:
3. participants demonstrate what they learned during the training (not including self-report);
4. participants receive instruction on student assessment; and
5. participants are provided tools, protocols, etc., for both gathering and reporting information on student impacts.

Report: Assessing the Impact and Effectiveness of the ATE Program

Posted on October 9, 2014

This report was prepared as an analysis of the first annual status report. It was intended to build understanding of the ATE program and to inform preparations for upcoming surveys and site visits.

File: Click Here
Type: Report
Category: ATE Research & Evaluation
Author(s): Arlen Gullickson, Frances Lawrenz, Gloria Tressler, Sharon Barbour

Report: ATE Program Evaluation: Contributors and inhibitors influencing program improvement

Posted on October 8, 2014

This brief focuses on project/center evaluation and is divided into four sections. Section 1 provides an overview of ATE expectations for evaluation and principal investigators' responses describing how they meet those requirements: who conducts the evaluations, how much money is spent on evaluations, and the extent to which these evaluations vary by characteristics such as the type of grant and type of evaluator conducting the evaluation. Section 2 describes PI perceptions of the utility of their evaluations and the extent to which those perceptions are related to the evaluation characteristics described in Section 1. Section 3 focuses on the activities of external evaluators: PI satisfaction with these evaluators, the relationship between PI ratings and standards for sound program evaluations, whether the PIs view their evaluations as meeting ATE intellectual merit requirements, and PIs' characterizations of the attributes of their external evaluators. Section 4 draws together findings reported in Sections 1 to 3 to identify strengths and weaknesses of project-level evaluations and to suggest changes that appear likely to improve on current evaluation practices.

File: Click Here
Type: Report
Category: ATE Research & Evaluation
Author(s): Arlen Gullickson, Chris Coryn, Liesel Ritchie

Report: ATE Program Evaluation: Project level evaluation practices

Posted on October 8, 2014

This study analyzes project-level evaluation practices in the Advanced Technological Education program of the National Science Foundation. Of special interest were factors thought to affect the quality and utility of evaluations, such as evaluation cost, who engaged in evaluation planning, and the use of external evaluators. The ATE program requires project-level evaluations and provides guidelines regarding what evaluations can and should do. The report closes with a discussion of discrepancies between those expectations and project-level actions, the apparent strengths and weaknesses of project evaluations, and suggestions for improving evaluation practices.

File: Click Here
Type: Report
Category: ATE Research & Evaluation
Author(s): Arlen Gullickson, Chris Coryn, Liesel Ritchie

Report: Assessing the Impact and Effectiveness of the ATE Program, Volume 3 (2004)

Posted on October 8, 2014

This report, Volume 3 of the 2004 Annual Survey Report, addresses the following fundamental questions about the ATE program:

1) What is the size and scope of work for ATE projects?

2) To what degree do ATE projects apply rigorous internal practices in their operations?

3) How extensive are ATE project collaborations?

4) How productive are ATE projects in terms of the primary ATE work categories?

5) What impact are ATE projects having on students?

File: Click Here
Type: Report
Category: ATE Research & Evaluation
Author(s): Arlen Gullickson, Carl Hanssen, Chris Coryn