At this time of year (early spring), we at FLATE stop to review, tally, analyze, and aggregate our collected data from the previous calendar year. We look for new comparative data sets from other organizations to see how we are doing in specific activities or groups of activities related to FLATE’s outreach, curriculum reform, and professional development efforts. We look for ways to refine our data collection by identifying unclear survey questions or gaps in the information we retrieve from various surveys. Here’s how we do it and how it all comes together:

We start with a thorough review and scrub of the raw data that has been collected and recorded. Although we regularly review data and trends during the year to prepare for our year-end summaries, we take a final look to be sure all the data we use is “good data.” Once this somewhat tedious stage is complete, the treasure hunt for “hidden data nuggets” begins.

Often we are simply looking for the summary data we have to report, and for trends (hopefully positive ones) in the impact of our activities, resources, and events. Beyond just reporting, this kind of information tells us whether we are still reaching the same kinds of participants and the same numbers of stakeholders across different kinds of events, resources, and activities. Reviewing this information carefully helps us target specific “missed” audiences at various events in the coming year, as in the sketch below.
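
As a purely illustrative sketch (the records, stakeholder groups, and counts below are invented, not FLATE data), one simple way to spot “missed” audiences is to match event attendance records against post-event survey responses and compare response rates by group:

```python
# Hypothetical illustration: which stakeholder groups at an event are
# underrepresented among survey respondents? All ids, groups, and counts
# are placeholders, not actual FLATE data.

attendance = [
    {"id": "a1", "group": "educator"},
    {"id": "a2", "group": "student"},
    {"id": "a3", "group": "industry"},
    {"id": "a4", "group": "student"},
]

survey_responses = {"a1", "a4"}  # ids of attendees who completed the survey

# Count attendees and respondents per stakeholder group.
by_group = {}
for person in attendance:
    counts = by_group.setdefault(person["group"], {"attended": 0, "responded": 0})
    counts["attended"] += 1
    if person["id"] in survey_responses:
        counts["responded"] += 1

for group, counts in sorted(by_group.items()):
    rate = counts["responded"] / counts["attended"]
    print(f"{group}: {counts['responded']}/{counts['attended']} responded ({rate:.0%})")
# educator: 1/1 responded (100%)
# industry: 0/1 responded (0%)
# student: 1/2 responded (50%)
```

A low response rate for a particular group is one signal that an audience is being “missed” and may deserve targeted follow-up the next year.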

After we finish reviewing, cleaning, organizing, and summarizing our annual reporting data, we continue to probe what else we can learn from it. At various stages of data review and summarizing, we often find ourselves asking: “I wonder if…?”; “How could I know if this is connected to that?”; or “Wouldn’t it be great if we also could know…?” These are questions we revisit by bringing data together from different sources and looking at it from different perspectives. The exercise becomes a game of puzzling together different results, trying to reveal more impact.

We move from the observation stage of “wows,” “oh my’s,” and “ah-ha’s” to filtering which of these ideas or questions will give us the most “bang for our buck,” as well as help us better answer the questions that NSF and our valued stakeholders ask. The cycle of continuous improvement underlies this whole process. How can we do what we do better by being more strategic in our activities, events, and resources? Can we ask better survey questions that will reveal more and better information? Can we completely change our approach to get more impactful data?

Here is an example using our website and blog data: FLATE collects monthly data using Google Analytics for its websites and newsletter and reviews this information quarterly with staff and the leadership team. The objective is to compare website and newsletter activity and growth (as measured by visits) against benchmarks and evaluate usage trends. The process of collecting and reviewing this information led to a further question and action item: In addition to increasing use of these FLATE products, we now wish to increase the percentage of returning visitors, providing evidence that our community of practice not only uses, but relies on, FLATE as an information resource.
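
To make the quarterly roll-up concrete, here is a minimal sketch (in Python) of the kind of summary described above. The monthly visit counts, the benchmark, and the field names are all hypothetical placeholders, not FLATE’s actual figures or reporting code:

```python
# Hypothetical quarterly roll-up of monthly analytics exports: compare
# average monthly visits against a benchmark and track the share of
# returning visitors. All numbers and field names are invented.

monthly_data = [
    {"month": "2015-01", "visits": 4200, "new_visitors": 3100},
    {"month": "2015-02", "visits": 4550, "new_visitors": 3200},
    {"month": "2015-03", "visits": 4900, "new_visitors": 3300},
]

VISIT_BENCHMARK = 4500  # hypothetical monthly target


def summarize_quarter(months):
    """Compute total visits, benchmark status, and returning-visitor share."""
    total_visits = sum(m["visits"] for m in months)
    total_new = sum(m["new_visitors"] for m in months)
    returning_share = (total_visits - total_new) / total_visits
    avg_monthly_visits = total_visits / len(months)
    return {
        "total_visits": total_visits,
        "avg_monthly_visits": avg_monthly_visits,
        "met_benchmark": avg_monthly_visits >= VISIT_BENCHMARK,
        "returning_visitor_pct": round(100 * returning_share, 1),
    }


print(summarize_quarter(monthly_data))
# {'total_visits': 13650, 'avg_monthly_visits': 4550.0,
#  'met_benchmark': True, 'returning_visitor_pct': 29.7}
```

Tracking the returning-visitor percentage quarter over quarter is one way to gather evidence that a community of practice is coming back to a resource rather than just discovering it once.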

About the Authors

Marilyn Barger


Executive Director, FLATE - Florida Advanced Technological Education Center of Excellence

Dr. Marilyn Barger is the Executive Director and Principal Investigator for the Florida Advanced Technological Education Center of Excellence (FLATE). This center focuses on high-technology manufacturing. Dr. Barger is also on EvaluATE's Community College Liaison Panel.



EvaluATE is supported by the National Science Foundation under grant number 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.