Whether you’ve been analyzing and collecting evaluation data for decades or you are brand new to this work, the beginning of a new year is a good time to take stock of this aspect of your work. In this issue of EvaluATE’s newsletter, we encourage you to take a moment to reflect on your data collection practices and look for opportunities to freshen and strengthen your data.
Surveys, interviews, and focus groups are probably the most common data collection methods across ATE project evaluations. But these may not always meet your data needs or be optimal for those providing information. Photolanguage, dotmocracy, and reputational monitoring are examples of nontraditional techniques for gathering information. You’ll find an inventory of 51 data collection methods on Better Evaluation’s website. The list includes short descriptions with links to detailed guidance. It may inspire you to go beyond traditional methods and get creative and innovative with your data collection.
Developing a sound data collection instrument from scratch is time-intensive. You might be able to conserve resources by using an existing instrument that fits your context. Check out the instrument collection curated by the STEM Learning and Research Center (STELAR). STELAR supports the National Science Foundation’s Innovative Technology Experiences for Students and Teachers program, and several of the instruments are relevant to the ATE context. Examples include the STEM Semantics Survey, STEM Career Interest Questionnaire, Pre-College Annual Self-Efficacy Survey, Grit Scale, and 21st Century Skills Assessment.
In the midst of data collection and analysis, it’s easy to lose sight of the big picture – why you collected the data in the first place. Use EvaluATE’s Data Collection Planning Matrix to align your data with your evaluation questions. This template also prompts you to record your plan for analyzing and interpreting data in ways that will help you answer your evaluation questions.
Regardless of how you obtain your data or what you plan to do with them, it’s essential that you make sure they’re clean before you begin analysis. A systematic process of data cleaning involves identifying and correcting data entry mistakes, duplicate records, format inconsistencies, and other problems that detract from the accuracy of the information or impair your ability to make sense of it. Check out Aleata Hubbard’s Six Data Cleaning Checks for guidance on how to make sure your data are ready for analysis.
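If you work with your data in Python, the kinds of checks described above can be sketched with the pandas library. This is a minimal illustration, not a substitute for a systematic cleaning process; the column names, respondent IDs, and 1–5 rating scale below are all hypothetical:

```python
import pandas as pd

def clean_survey_data(df: pd.DataFrame) -> pd.DataFrame:
    """Apply a few basic cleaning checks before analysis."""
    df = df.copy()
    # Format inconsistencies: strip stray whitespace and unify capitalization
    df["institution"] = df["institution"].str.strip().str.title()
    # Duplicate records: keep only the first response per respondent ID
    df = df.drop_duplicates(subset="respondent_id", keep="first")
    # Data entry mistakes: drop ratings outside the (hypothetical) 1-5 scale
    df = df[df["rating"].between(1, 5)]
    return df.reset_index(drop=True)

# Example: raw responses with a duplicate, messy text, and an invalid rating
raw = pd.DataFrame({
    "respondent_id": [101, 101, 102, 103],
    "institution": ["  acme college", "  acme college", "Tech CC ", "tech cc"],
    "rating": [4, 4, 6, 5],
})
clean = clean_survey_data(raw)
```

Each check here corresponds to one of the common problems named above; in practice you would also document what was changed so the cleaning steps are reproducible.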
Meet EvaluATE’s friendly new ATE survey coordinator
Check out this short video to meet Lyssa Wilson Becho, your one-stop shop for questions about the annual survey of Advanced Technological Education (ATE) principal investigators, and to get the scoop on this year’s survey. In this video, Lyssa gives a quick introduction to the 2018 survey and how the findings are used throughout the ATE community.
Check out the new ATE evaluation report repository
EvaluATE is building a repository of ATE evaluation reports. Check it out to get a sense of how ATE projects and centers are evaluating their work and what they’re learning. If you’re an ATE principal investigator or evaluator, let us know if you have a report that should be added to the collection.