Patrick Fiorenza

Senior Research Analyst, Hezel Associates

As a senior research analyst with Hezel Associates, Patrick Fiorenza contributes his extensive experience in qualitative and quantitative data collection and analysis, which he gained from his work in both the private and public sectors. Examples of his current projects include two National Science Foundation-funded science, technology, engineering, and mathematics (STEM) evaluations for Syracuse University and Southern Illinois University Edwardsville, as well as a U.S. Department of Labor Trade Adjustment Assistance Community College and Career Training (TAACCCT) grant-funded program evaluation with Onondaga Community College. Mr. Fiorenza holds a master’s in public administration from Syracuse University.


Tips and Tricks When Writing Interview Questions

Posted on January 2, 2018 in Blog


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Developing a well-constructed interview protocol is by no means an easy task. For ideas on how to formulate well-designed interview questions, Michael Patton (2015) dedicates an entire chapter of his book, Qualitative Research & Evaluation Methods, to question formulation. As with any skill, the key to improving your craft is practice. That's why I wanted to share a few ideas from Patton, along with some of my own thoughts, to help you improve how you formulate interview questions.

One approach I find useful is to consider the category of question you are asking. In qualitative research, the categories of questions can sometimes seem infinite. However, Patton provides a few overarching categories that can frame your thinking, allowing you to ask questions with more precision and intention. Patton (2015, p. 444) suggests general categories and provides example questions, presented below. So, when formulating a question, consider which type you are interested in asking:

  • Experience and behavior questions: If I had been in the program with you, what would I have seen you doing?
  • Opinion and value questions: What would you like to see happen?
  • Feeling questions: How do you feel about that?
  • Knowledge questions: Who is eligible for this program?
  • Sensory questions (about stimuli experienced): What does the counselor ask you when you meet with her? What does she actually say?
  • Background and demographic questions: How old are you?

Once you know the category and start writing or editing questions, some additional strategies are to double-check that you are writing truly open-ended questions and to avoid jargon. For instance, don't assume that your interviewee knows the acronyms you're using. As evaluators, we sometimes know the program better than the informants do! That makes it all the more important to write questions with clarity. Everyone wins when you take the time to be intentional and design a clear question: you get better data, and you won't confuse your interviewee.

Another interesting point from Patton is to make sure you are asking a singular question. Think about when you're conducting quantitative research and writing a questionnaire item: a red flag is a double-barreled item (i.e., one that asks more than one question simultaneously). For example, a poorly framed questionnaire item about experiences in a mentorship program might read: To what extent do you agree with the statement, "I enjoyed this program and would do it again"? You simply wouldn't put that item in a questionnaire, since a person might enjoy the program but not necessarily want to do it again. Although you have more latitude during an interview, it's always best to write your questions with precision. It's also a good chance to flex your skills when conducting the interview, knowing when to probe effectively, shift the conversation, or dive deeper based on what you hear.

It is important to keep in mind that there is no single right way to formulate interview questions. However, by keeping multiple tools in your tool kit, you can draw on different strategies as appropriate, allowing you to develop stronger and more rigorous qualitative studies.

 

Reference:

Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice. Thousand Oaks, CA: SAGE.

Five Questions All Evaluators Should Ask Their Clients

Posted on July 8, 2015 in Blog



One of the things I love about program evaluation is the diversity of models and methods you must consider when analyzing a program. But even before you develop and solidify your evaluation design, there's a lot of legwork to do up front. Successful evaluations start that process by asking the right questions. Here are a few questions you can use to get a conversation rolling with your client and gain confidence that your evaluation is moving in the right direction.

1. What do you hope to achieve with this program?

Goal setting is a common challenge for all organizations, and in an evaluation setting, clear and measurable goals are absolutely essential. Too often, goals are defined but not actually matched to participant or organizational needs. As evaluators, we should pay close attention to these distinctions; they enable us to help clients improve the implementation of their programs and guide them toward their anticipated outcomes.

2. What’s the history of this program?

New program or old, you need to know the background of the initiative. That understanding will lead you to the funding, core stakeholders, requirements, and any other information needed to evaluate the program. You might learn interesting stories about why the program has struggled, which can help you design your evaluation and craft research questions. It's also a great way to get to know a client, learn about past pain points, and really understand their objectives for the evaluation.

3. What kind of data do you plan on collecting or do you have access to?

Every program evaluator has faced the challenge of getting the data needed to conduct an evaluation. Find out early what data you'll need and what your client can actually provide. Don't wait to have those conversations with your clients; if you put them off until you are ready to conduct your analyses, it may very well be too late.

4. What challenges do you foresee with program implementation?

Program designs might change as challenges to design and delivery arise. But if you can spot red flags early on, you may be able to help your client navigate implementation challenges and avoid roadblocks. The key is staying flexible and working with your client to understand and anticipate implementation issues and address them in advance.

5. What excites you about this program?

This question lets you get to know the client a bit more, understand their interests, and start building a relationship. I love this question because it reinforces the idea of the evaluator as a partner in the program. By acting as a partner, you can provide your clients with the right kind of evaluation and build a lasting partnership along the way.

Program evaluation presents some very challenging and complex questions for evaluators. Starting with these five questions will help you focus the evaluation and set your client and the evaluation team up for success.