Director of Evaluation & Assessment, NC State Industry Expansion Solutions
John Dorris is the Director of Evaluation & Assessment at NC State Industry Expansion Solutions. He provides leadership on strategy, research, and data analysis efforts related to workforce development and engagement, including organizational learning and emerging trends in evaluation. He also manages the development of grant concepts, oversees proposal development, and designs and implements program evaluation projects. Dr. Dorris holds an Ed.D. from the University of Tennessee, as well as master’s degrees in business administration and statistics, both from Pennsylvania State University.
Designing a rigorous and informative evaluation depends on communication with program staff to understand planned activities and how those activities relate to the program sponsor’s objectives and to the evaluation questions that reflect those objectives (see white paper related to communication). At NC State Industry Expansion Solutions, we have worked on evaluation projects long enough to know that such communication is not always easy, because program staff and the program sponsor often view the program from two different perspectives: the program staff focus on work plan activities (WPAs), while the program sponsor may be more focused on the evaluation questions (EQs). So, to help facilitate communication at the beginning of an evaluation project and to assist in design and implementation, we developed a simple matrix technique to link the WPAs and the EQs (see below).
For each WPA, we link one or more EQs and indicate what types of data collection events will take place during the evaluation. During project planning and management, this crosswalk of WPAs and EQs is used to plan qualitative and quantitative data collection events.
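For readers who like to work with the crosswalk programmatically, the matrix can be represented as a simple mapping from each WPA to its linked EQs and planned data collection events. The sketch below is purely illustrative: the WPA labels, EQ identifiers, and event types are hypothetical placeholders, not drawn from an actual project.

```python
# Hypothetical sketch of a WPA-to-EQ crosswalk matrix.
# All WPA, EQ, and data-collection-event names below are illustrative.

crosswalk = {
    "WPA1: Recruit participants": {
        "eqs": ["EQ1", "EQ3"],
        "events": ["enrollment records", "staff interviews"],
    },
    "WPA2: Deliver training modules": {
        "eqs": ["EQ1", "EQ2"],
        "events": ["attendance logs", "pre/post surveys"],
    },
    "WPA3: Conduct employer site visits": {
        "eqs": ["EQ2", "EQ3"],
        "events": ["site-visit observations", "employer focus groups"],
    },
}

def events_for_eq(matrix, eq):
    """List every data collection event linked to one evaluation question."""
    return sorted({event
                   for row in matrix.values() if eq in row["eqs"]
                   for event in row["events"]})

if __name__ == "__main__":
    # Print the crosswalk row by row: each WPA, its EQs, and its events.
    for wpa, row in crosswalk.items():
        print(f"{wpa} -> {', '.join(row['eqs'])} "
              f"via {', '.join(row['events'])}")
    # Reading across the matrix: all events that inform one EQ.
    print("EQ2 events:", events_for_eq(crosswalk, "EQ2"))
```

Reading the structure by WPA supports work-plan monitoring, while a helper like `events_for_eq` reads across the matrix to show which data collection events will inform a given evaluation question.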
The framework above may be most helpful for the formative assessment (process questions and activities). However, it can also enrich the knowledge gained from the participant outcomes analysis in the summative evaluation in the following ways:
Understanding how the program has been implemented helps determine fidelity to the program as planned, which in turn indicates the degree to which participant outcomes can be attributed to the program design.
Details on program implementation that are gathered during the formative assessment, when combined with evaluation of participant outcomes, can suggest hypotheses regarding factors that would lead to program success (positive participant outcomes) if the program is continued or replicated.
Details regarding the data collection process that are gathered during the formative assessment will help assess the quality and limitations of the participant outcome data, and the reliability of any conclusions based on that data.
So, for us, this matrix approach is a quality check on our evaluation design that also helps during implementation. Perhaps you will find it helpful, too.