Many Advanced Technological Education (ATE) grants hold professional development events for faculty. As the lead for several ATE grants, I have been concerned that while data obtained from faculty surveys immediately after these events are useful, they do not gauge the impact of the training on students. The National Convergence Technology Center’s (CTC) approach, described below, uses longitudinal survey data from the faculty attendees to begin to provide evidence on student impact. I believe the approach is applicable to any discipline.
The CTC provides information technology faculty with a free intensive professional development event titled Working Connections Faculty Development Institute. The institute is held twice per year: five days in the summer and two and a half days in the winter. The summer institute gives faculty members the skills needed to create a new course or perform major updates on an existing course; the shorter winter institute provides enough to update a course. Over the years, more than 1,700 faculty have enrolled in the training. From the beginning, we have gathered attendee feedback via two surveys at the end of each event. One survey focuses on the specific topic track, asking to what extent attendees feel the track's three learning outcomes were mastered, along with questions about the instructor's pacing, classroom management, and so on. The other survey asks about the overall event, including attendees' reactions to the focused lunch programs and how many new courses have been created or enhanced as a result of past attendance.
The CTC educates faculty members as a vehicle for educating students. To learn how the training impacts students and programs, we also send out longitudinal surveys at 6, 18, 30, 42, and 54 months after each summer Working Connections training. These surveys ask faculty members to report on what they did with what they learned at each training, including how many students they educated as a result of what they learned. Faculty are also asked to report how many certificates and degrees were created or enhanced. Each Working Connections cohort receives a separate survey invitation (i.e., someone who attended two Working Connections will get two separate invitations) that includes a link to the survey as well as a roster to help attendees remember which track they took that year. Participation is voluntary, but over the years, we have consistently and strongly emphasized the importance of getting this longitudinal data so that we can provide some evidence of student impact to the National Science Foundation. Our response rate from surveys sent in January 2016 was 59%.
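A fixed follow-up schedule like ours (6, 18, 30, 42, and 54 months after each summer training) lends itself to simple automation. As an illustrative sketch only, not the CTC's actual tooling, the mailing dates for a given cohort could be computed like this:

```python
from datetime import date

# Follow-up offsets, in months, for the longitudinal surveys.
FOLLOW_UP_MONTHS = [6, 18, 30, 42, 54]

def add_months(start: date, months: int) -> date:
    """Return the first day of the month `months` after `start`'s month."""
    total = start.month - 1 + months
    return date(start.year + total // 12, total % 12 + 1, 1)

def survey_schedule(training_end: date) -> list[date]:
    """Mailing dates for each follow-up survey after a training."""
    return [add_months(training_end, m) for m in FOLLOW_UP_MONTHS]

# Hypothetical example: a summer institute ending in July 2015
# yields follow-up mailings in January 2016, 2017, 2018, 2019, and 2020.
for d in survey_schedule(date(2015, 7, 17)):
    print(d.isoformat())
```

Note how a July cohort's six-month follow-up lands in January, which is consistent with the January survey mailings described above.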
Responses to surveys from 2008-2016 indicate the following:
- Number of students who were taught the subjects trained
- Number of new/enhanced courses
- Number of sections taught
- Number of new/enhanced certificates and degrees
While these data still do not tell us how the students themselves absorbed what attendees learned, they do provide evidence one step closer to measuring student impact than counting faculty feedback after each training alone. We are considering what else we can do to further unpack the impact on students, but the Family Educational Rights and Privacy Act (FERPA) prevents the CTC from contacting affected students directly without their permission.
Tip: A longitudinal survey effort must be intentional and consistent. It is also extremely important to repeatedly promote the need for attendees to complete the surveys, both during the professional development events and via emails preceding the annual survey invitations. It is all too easy for attendees to simply delete a longitudinal survey if they do not see the point of filling it out.