Archive: evaluation capacity building

Blog: Building Capacity for High-Quality Data Collection

Posted on May 13, 2020 in Blog

Director of Evaluation, Thomas P. Miller & Associates, LLC 

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

As I, like everyone else, am adjusting to working at home and practicing social distancing, I have been thinking about how to conduct my evaluation projects remotely. One thing that’s struck me as I’ve been retooling evaluation plans and data collection timelines is the need for even more evaluation capacity building around high-quality data collection for our clients. We will continue to rely on our clients to collect program data, and now that they’re working remotely too, a refresher on how to collect data well feels timely.  

Below are some tips and tricks for increasing your clients' capacity to collect their own high-quality data for use in evaluation and informed decision making.

Identify who will need to collect the data.  

Especially with multiple-site programs or programs with multiple collectors, identifying who will be responsible for data collection and ensuring that all data collectors use the same tools is key to collecting consistent data across the program.

Determine what is going to be collected.  

Examine your tool. Consider the length of the tool, the types of data being requested, and the language used in the tool itself. When creating a tool that will be used by others, be certain that your tool will yield the data that you need and will make sense to those who will be using it. Test the tool with a small group of your data collectors, if possible, before full deployment.  

Make sure data collectors know why the data is being collected.  

When those collecting data understand how the data will be used, they’re more likely to be invested in the process and more likely to collect and report their data carefully. When you emphasize the crucial role that stakeholders play in collecting data, they see the value in the time they are spending using your tools. 

Train data collectors on how to use your data collection tools.  

Walking data collectors through the step-by-step process of using your data collection tool, even if the tool is a basic intake form, will ensure that all collectors use the tool in the same way. It will also ensure they have had a chance to walk through the best way to use the tool before they actually need to implement it. Provide written instructions, too, so that they can refer to them in the future.  

Determine an appropriate schedule for when data will be reported.  

To ensure that your data reporting schedule is not overly burdensome, consider the time commitment that the data collection may entail, as well as what else the collectors have on their plates.  

Conduct regular quality checks of the data collected.

Checking the data regularly allows you to employ a quality control process and promptly identify when data collectors are having issues. Catching these errors quickly will allow for easier course correction.  
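For clients comfortable with a bit of scripting, regular quality checks like these can be partly automated. The sketch below is a minimal, hypothetical example: the field names (`site`, `date`, `participants`) and the validation rules are illustrative assumptions, not part of any particular program's tool.

```python
# Hypothetical quality-control pass over collected program records.
# Field names and rules below are illustrative; adapt them to your own tool.

def check_record(record):
    """Return a list of problems found in one data record."""
    problems = []
    if not record.get("site"):
        problems.append("missing site identifier")
    if not record.get("date"):
        problems.append("missing collection date")
    participants = record.get("participants")
    if not isinstance(participants, int) or participants < 0:
        problems.append("participant count is not a non-negative integer")
    return problems

def quality_report(records):
    """Map record index -> list of problems, for records with any problems."""
    return {i: probs
            for i, rec in enumerate(records)
            if (probs := check_record(rec))}

records = [
    {"site": "Campus A", "date": "2020-05-01", "participants": 12},
    {"site": "", "date": "2020-05-01", "participants": -3},
]
print(quality_report(records))
# -> {1: ['missing site identifier',
#         'participant count is not a non-negative integer']}
```

Running a report like this on each batch of submitted data makes it easy to spot which collectors are struggling and follow up with them promptly.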

Resource: Evaluation Process

Posted on March 14, 2018 in Resources

Highlights the four main steps of an ATE evaluation and provides detailed activities for each step. This example is an excerpt from the Evaluation Basics for Non-evaluators webinar, from which slides, a recording, a handout, and additional resources are also available.

Type: Doc
Category: Getting Started
Author(s): Emma Leeburg, Lori Wingate

Blog: Engaging Faculty in Evaluative Inquiry

Posted on September 2, 2015 in Blog

Executive Director, InSites

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

As the sun is rising here in the beautiful Pacific Northwest, I want to draw your attention to an evaluation capacity building approach called CLIPs, which you may want to incorporate into an ATE evaluation. My colleagues and I developed the process through an NSF grant (grant #0335581), and it has since been used in several community colleges.

We developed Communities of Learning, Inquiry, and Practice (CLIPs) at Bakersfield College in California. CLIPs are self-determined groups of faculty and staff who learn together about their professional practice by gathering and analyzing data about a topic of importance to them. For example, one CLIP investigated the question, “Do peer study groups enhance student learning?” Another CLIP asked, “What assessment methods are most effective in computer studies courses?” Yet another explored “Are students who take developmental education courses successful in subsequent courses?” You might be asking similar questions in an ATE evaluation.

Each CLIP consisted of three to seven faculty and staff with one person as the group facilitator. We set it up so CLIP members learned a basic evaluative inquiry process with three steps: (1) design the inquiry, (2) collect data, and (3) make meaning and shape practice.

Within a given CLIP, the members simultaneously answered important evaluative questions and built their capacity to collaboratively address issues about their work on an ongoing basis. This set the stage for continual renewal of their teaching practices and ongoing inquiry about instructional processes and student learning and success.

An overview video and modules about the CLIP process are available to you on our InSites website.

The seven modules feature video vignettes in which CLIP team facilitators and members share their CLIP experiences and observations. The modules include downloadable resources to support individual and collaborative inquiry. These include examples of CLIP documents and in-depth reference materials. Many of these resources may be useful to you in any evaluation work you are doing, whether or not you are using the CLIP process. For example, there is a tip sheet on conducting focus groups and another on writing questionnaires.

Since its development, the process has been used in several other community colleges and in a medical school setting.

All in all, CLIPs are a great way to embed evaluation into the learning process for faculty and accomplish many of your evaluation tasks. You can read more about the theory behind CLIPs in an OD Practitioner article, Evaluative Inquiry for Complex Times, which can also be accessed through our website.