Archive: evaluation quality

Blog: Five Questions All Evaluators Should Ask Their Clients

Posted on July 8, 2015 in Blog

Senior Research Analyst, Hezel Associates

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

One of the things I love about program evaluation is the diversity of models and methods you must consider when analyzing a program. But even before you can develop and solidify your evaluation design, there is a lot of legwork to do up front. Successful evaluations start with asking the right questions. Here are five you can use to get a conversation rolling with your client and gain confidence that your evaluation is moving in the right direction.

1. What do you hope to achieve with this program?

Goal setting is a common challenge for all organizations, and in an evaluation setting, clear and measurable goals are essential. Too often, goals are defined but not actually matched to participant or organizational needs. As evaluators, we should pay close attention to these mismatches: spotting them enables us to help clients improve the implementation of their programs and guide them toward their anticipated outcomes.

2. What’s the history of this program?

New program or old, you will need to know the background of the initiative: the funding, core stakeholders, requirements, and any other information necessary to evaluate the program. You might learn why the program has struggled, which can inform your evaluation design and research questions. It is also a great way to get to know a client, learn about their past pain points, and understand their objectives for the evaluation.

3. What kind of data do you plan on collecting or do you have access to?

Every program evaluator has faced the challenge of getting the data needed to conduct an evaluation. Identify early on what data you will need and what your client already has access to. Don't wait to have those conversations with your clients; if you put them off until you are ready to conduct your analyses, it may very well be too late.

4. What challenges do you foresee with program implementation?

Program designs often change as challenges to delivery arise. But if you can spot red flags early on, you may be able to help your client navigate implementation challenges and avoid roadblocks. The key is staying flexible: work with your client to understand and anticipate implementation issues and address them in advance.

5. What excites you about this program?

This question lets you get to know the client a bit more and understand their interests. I love it because it reinforces the idea of the evaluator as a partner in the program. By acting as a partner, you can provide your clients with the right kind of evaluation and build a lasting relationship along the way.

Program evaluation presents some very challenging and complex questions for evaluators. Starting with these five questions will allow you to focus the evaluation and set your client and the evaluation team up for success.


Newsletter: Evaluation that Seriously Gets to the Point – and Conveys It Brilliantly

Posted on April 1, 2013 in Newsletter

Evaluation, much as we love it, has a reputation among nonevaluators for being overly technical and academic, lost in the details, hard work to wade through, and, in the end, not particularly useful. Why is this? Many evaluators were originally trained in the social sciences, where we added numerous useful frameworks and methodologies to our toolkits. But along the way, we were inculcated with several approaches, habits, and ways of communicating that are absolutely killing our ability to deliver the value we could be adding. Here are the worst of them:

  1. Writing question laundry lists – asking long lists of evaluation questions that are far too narrow and detailed (often at the indicator level)
  2. Leaping to measurement – diving into identifying intended outcomes and designing data collection instruments without a clear sense of who or what the evaluation is for
  3. Going SMART but unintelligent – focusing on what’s most easily measurable rather than making intelligent choices to go after what’s most important (SMART = specific, measurable, achievable, relevant, and time-based)
  4. Rorschach inkblotting – assuming that measures, metrics, indicators, and stories are the answers; they are not!
  5. Shirking valuing – treating evaluation as an opinion-gathering exercise rather than actually taking responsibility for drawing evaluative conclusions based on needs, aspirations, and other relevant values
  6. Getting lost in the details – leaving the reader wading through data instead of clearly and succinctly delivering the answers they need
  7. Burying the lede – losing the most important messages by loading far too many “key points” into the executive summary, not to mention the report itself, or by using truly awful data visualization techniques
  8. Speaking in tongues – using academic and technical language that just makes no sense to normal people

Thankfully, hope is at hand! Breakthrough thinking and approaches are all around us, but many evaluators just aren't aware of them. Some have been around for decades. Here's a challenge for 2013: seek out and get really serious about infusing the following into your evaluation work:

  • Evaluation-Specific Methodology (ESM) – the methodologies that are distinctive to evaluation, i.e., the ones that go directly after values. Examples include needs and values assessment; merit determination methodologies; importance weighting methodologies; evaluative synthesis methodologies; and value-for-money analysis
  • Actionable Evaluation – a pragmatic, utilization-focused framework for evaluation that asks high-level explicitly evaluative questions, and delivers direct answers to them using ESM
  • Data Visualization & Effective Reporting – the best of the best of dataviz, reporting, and communication to deliver insights that are not just understandable but unforgettable