Archive: data interpretation

Blog: Evaluating Impact: How I Moved From Pipeline to Interstate

Posted on December 10, 2014 by John Sener in Blog

CKO of Sener Knowledge, LLC

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

John Sener of Sener Knowledge LLC, an external evaluator for over a dozen ATE and other NSF grants over the past eight years, discusses how to move beyond the “pipeline” metaphor for describing technician education and adopt the more useful “interstate” metaphor instead.

Evaluate an NSF ATE center or project long enough, and eventually you’ll hear the word pipeline—as in, “closing gaps in the education pipeline,” or “increasing the cybersecurity workforce pipeline.” The pipeline metaphor describes the ATE technician education process reasonably well, up to a point:

A) Students proceed through an NSF-funded ATE program “pipeline” in well-defined cohorts on a relatively uniform timetable.

B) The pipeline feeds program graduates to employers, often a relatively small number of local or regional employers in the related field.

C) Evaluators aim to document that B is the result of A.

However, I have long been dissatisfied with the pipeline metaphor because it traps its users into limited thinking, making it easy to overlook important sources of project impact. As noted in my book, The Seven Futures of American Education, many students exhibit characteristics that reflect a greater degree of choice than the pipeline metaphor implies: “stopouts” who drop in and out of school, “swirlers” who attend multiple institutions, and “stay-longers” who exceed the prescribed time period for program completion. I’ve found that an interstate highway is a much more useful metaphor for understanding learners who:

  • Choose among multiple entry and exit points rather than following a prescribed path;
  • Travel at different speeds and on different schedules through their education programs;
  • Sometimes seek alternate routes and multiple destinations during their educational journeys.

Here are some ways I use the interstate metaphor to find sources of project impact.

Pipelines move their cargo from Point A to Point B, while interstates support two-way traffic; look in both directions for indications of project impact, for instance:

  • Career changers with bachelor’s or graduate degrees returning to community colleges for additional training or certification
  • Four-year students mentoring two-year students to prepare for student competitions
  • Community college students mentoring high school students or serving as judges for high school student competitions
  • Business and industry practitioners getting involved on community college curriculum advisory boards or forming three-way partnerships with faculty and students to enhance the learning/assessment experience or create knowledge collaboratively

Expand the realm of acceptable outcomes. ATE projects have significant impact beyond program completion or employment placement. Students sometimes find jobs before completing a program. Intermediate outcomes, such as completing a certificate or multiple courses, may also indicate progress, especially for students returning to college to enhance their existing professional prospects.

Look in other places for impact. Extracurricular activities, such as student competitions, clubs, or organizations, are one good place to look. I sometimes think of such places as “toll booths”—activities where impact can be measured more easily as students pass through them.

Blog: This is How “We” Do Data: Collect, Unravel, Summarize!

Posted on November 19, 2014 in Blog

Executive Director, FLATE – Florida Advanced Technological Education Center of Excellence

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

At this time of year (early spring), we at FLATE stop to review, tally, analyze, and aggregate our collected data from the previous calendar year. We look for new comparative data sets from other organizations to see how we are doing in specific activities, or groups of activities related to FLATE’s outreach, curriculum reform and professional development efforts. We look for ways to refine our data collection by identifying unclear survey questions or gaps in information we retrieve from various surveys. Here’s how we do it and how it all comes together:

We start with a good review and scrub of the raw data that has been collected and recorded. Although we regularly review data and trends during the year to prepare for our year-end summaries, we take a final look to be sure all the data we use is “good data.” Once this somewhat tedious stage is complete, the treasure hunt for “hidden data nuggets” begins.
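For readers who do this scrub in code rather than by hand, a minimal pandas sketch of a year-end cleaning pass might look like the following. The file name and columns (event_date, event_type, participants) are illustrative placeholders, not our actual data schema:

    # A minimal sketch of a year-end "data scrub" using pandas.
    # File name and column names are hypothetical placeholders.
    import pandas as pd

    raw = pd.read_csv("event_records_2014.csv", parse_dates=["event_date"])

    clean = (
        raw.drop_duplicates()                                # drop double-entered records
           .dropna(subset=["event_date", "event_type"])      # require the key fields
           .assign(event_type=lambda d: d["event_type"].str.strip().str.lower())
    )

    # Keep only records from the calendar year being summarized
    clean = clean[clean["event_date"].dt.year == 2014]
    clean.to_csv("event_records_2014_clean.csv", index=False)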

Much of the time we are simply looking for the summary data we have to report and for trends, hopefully positive ones, in the impact of our activities, resources, and events. Beyond just reporting, this kind of information tells us whether we are still reaching the same kinds of participants and the same numbers of stakeholders for different kinds of events, resources, and activities. Reviewing this information carefully helps us target specific “missed” audiences at various events in the coming year.
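A rough sketch of the kind of summary and trend table we have in mind, again using assumed column names rather than our actual schema, could be built like this:

    # Tally participation by event type and compare years side by side.
    # Column names (event_date, event_type, participants) are assumptions.
    import pandas as pd

    records = pd.read_csv("event_records_all_years.csv", parse_dates=["event_date"])
    records["year"] = records["event_date"].dt.year

    summary = (
        records.groupby(["year", "event_type"])["participants"]
               .agg(events="count", total_participants="sum")
               .reset_index()
    )

    # Pivot so each event type shows a year-over-year trend at a glance
    trend = summary.pivot(index="event_type", columns="year", values="total_participants")
    print(trend)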

After we finish reviewing, cleaning, organizing, and summarizing our annual reporting data, we continue to probe and assess what else we can learn from it. In various stages of data review and summarizing, we often find ourselves asking: “I wonder if…?”, “How could I know whether this is connected to that?”, or “Wouldn’t it be great if we also could know…?” These are questions we revisit by bringing data together from different sources and looking at the data from different perspectives. The exercise becomes a game of puzzling together different results, trying to reveal more impact.
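As a hypothetical illustration of that puzzling-together, joining workshop attendance records to a follow-up survey on a shared participant ID might look like this; the file names, columns, and the idea of a shared ID are assumptions for the sketch, not a description of our actual data sources:

    # Join attendance records to follow-up survey responses and ask a
    # simple impact question across the combined data.
    import pandas as pd

    attendance = pd.read_csv("workshop_attendance.csv")   # participant_id, workshop
    follow_up = pd.read_csv("follow_up_survey.csv")        # participant_id, used_materials (0/1)

    combined = attendance.merge(follow_up, on="participant_id", how="left")

    # What share of attendees of each workshop later reported using the materials?
    impact = combined.groupby("workshop")["used_materials"].mean()
    print(impact)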

We move from the observation stage, the “wows,” “oh my’s,” and “ah-ha’s,” to filtering which of the ideas or questions will give us the most “bang for our buck,” as well as help us better answer the questions that NSF and our valued stakeholders ask. The cycle of continuous improvement underlies this whole process. How can we do what we do better by being more strategic in our activities, events, and resources? Can we ask better survey questions that will reveal more and better information? Can we totally change our approach to get more impactful data?

Here is an example using our website and blog data: FLATE collects monthly data using Google Analytics for its websites and newsletter and reviews this information quarterly with staff and the leadership team. The objective is to compare website and newsletter activity and growth (as measured by visits) against benchmarks and to evaluate usage trends. Collecting and reviewing this information led to a further question and action item: in addition to increasing use of these FLATE products, we now wish to increase the percentage of returning visitors, providing evidence that our community of practice not only uses, but relies on, FLATE as an information resource.
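As a rough sketch of that quarterly check, one could compute the share of returning visitors from a monthly analytics export and compare it to a target; the CSV layout and the 40 percent benchmark below are assumptions for illustration only, not our actual figures or Google Analytics’ export format:

    # Compute the percentage of returning visitors per month and flag
    # whether it meets an assumed benchmark.
    import pandas as pd

    ga = pd.read_csv("ga_monthly_export.csv")   # assumed columns: month, new_visits, returning_visits

    ga["total_visits"] = ga["new_visits"] + ga["returning_visits"]
    ga["pct_returning"] = ga["returning_visits"] / ga["total_visits"] * 100

    BENCHMARK = 40.0   # assumed target share of returning visitors
    ga["meets_benchmark"] = ga["pct_returning"] >= BENCHMARK

    print(ga[["month", "total_visits", "pct_returning", "meets_benchmark"]])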