Lyssa Becho

Research Associate, Western Michigan University

Lyssa Wilson contributes to various EvaluATE projects including the ATE annual survey report, survey snapshots, conference presentations, and blog posts. She is a student in the Interdisciplinary Ph.D. in Evaluation program at Western Michigan University. She has worked on a number of different evaluations, both big and small. Her interests lie in improving the way we conduct evaluation through research on evaluation methods and theories, as well as creating useful and understandable evaluation reports through data visualization.


Blog: One Pagers: Simple and Engaging Reporting

Posted on December 20, 2017
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
 
Lyssa Wilson Becho
Research Associate, EvaluATE
Emma Perk
Co-Principal Investigator, EvaluATE

Traditional, long-form reports are often used to detail the depth and specifics of an evaluation. However, many readers simply don’t have the time or bandwidth to digest a 30-page report. Distilling the key information into one page can help catch the eye of busy program staff, college administrators, or policy makers.

When we say “one pager,” we mean a single-page document that summarizes evaluation data, findings, or recommendations. It’s generally a stand-alone document that supplements a longer report, dataset, or presentation.

One pagers are a great way to give your client the data they need to make informed decisions. These summaries work well as companion documents for long reports or as a highlight piece for an interim report. We created a ten-step process to guide the creation of a one pager. Additional materials are available, including detailed slides, grid layouts, videos, and more.

Ten-step process for creating a one pager:

1. Identify the audience

Be specific about who you are talking to and their information priorities. The content and layout of the document should be tailored to meet the needs of this audience.

2. Identify the purpose

Write a purpose statement that identifies why you are creating the one pager. This will help you decide what information to include or to exclude.

3. Prioritize your information

Categorize the information most relevant to your audience. Then rank each category from highest to lowest priority to help inform the layout of the document.

4. Choose a grid

Use a grid to intentionally organize elements visually for readers. Check out our free pre-made grids, which you can use for your own one pagers, and instructions on how to use them in PowerPoint (video).

5. Draft the layout

Print out your grid layout and sketch your design by hand. This will allow you to think creatively without technological barriers and will save you time.

6. Create an intentional visual path

Pay attention to how the reader’s eye moves around the page. Use elements like large numbers, ink density, and icons to guide the reader’s visual path. Keep in mind page symmetry and the need to balance visual density. For more tips, see Canva’s Design Elements and Principles.

7. Create a purposeful hierarchy

Use headings intentionally to help your readers navigate and identify the content.

8. Use white space

The brain subconsciously views content grouped together as a cohesive unit. Add white space to indicate that a new section is starting.

9. Get feedback

Run your designs by a colleague or client to help catch errors, note areas needing clarification, and ensure the document makes sense to others. You will likely need to go through a few rounds of feedback before the document is finalized.

10. Triple-check consistency

Triple-check, and possibly quadruple-check, for consistency of fonts, alignment, size, and colors. Style guides are a useful way to keep track of consistency within and across documents. Take a look at EvaluATE’s style guide.

The demand for one pagers is growing, and you are now equipped with the information you need to create one. So start creating your one pagers now!

Report: Annual Survey: 2017 ATE Survey Report

Posted on December 6, 2017

This report summarizes data gathered in the 2017 survey of ATE program grantees. Conducted by EvaluATE—the evaluation support center for the ATE program, located at The Evaluation Center at Western Michigan University—this was the 18th annual ATE survey. Included here are findings about funded projects and their activities, accomplishments, and impacts during the 2016 calendar year (2016 fiscal year for budget-related questions).

Type: Report
Category: ATE Annual Survey
Author(s): Arlen Gullickson, Emma Perk, Lori Wingate, Lyssa Becho

Communication Plan Checklist for ATE Principal Investigators and Evaluators

Posted on October 17, 2017

Creating a clear communication plan at the beginning of an evaluation can help project personnel and evaluators avoid confusion, misunderstandings, or uncertainty. The communication plan should be an agreement between the project’s principal investigator and the evaluator, and it should be followed by members of their respective teams. This checklist highlights the decisions that need to be made when developing a clear communication plan.

Type: Checklist
Category: Checklist, Evaluation Design
Author(s): Lori Wingate, Lyssa Becho

Engagement of Business and Industry in Program Evaluation: 2014

Posted on July 3, 2017

According to the results of the 2015 survey of ATE grantees, 74 percent of grantees collaborated with business and industry in 2014. These 171 respondents were asked a series of questions about how they involve their industry partners in various aspects of their ATE project and center evaluations. Their responses are described in this report.

Type: Report
Category: ATE Annual Survey
Author(s): Corey Smith, Lyssa Wilson

Blog: Designing a Purposeful Mixed Methods Evaluation

Posted on March 1, 2017

Lyssa Wilson
Research Associate, Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

A mixed methods evaluation involves collecting, analyzing, and integrating data from both quantitative and qualitative sources. Sometimes I find that while I plan evaluations with mixed methods, I do not think purposefully about how or why I am choosing and ordering these methods. Intentionally planning a mixed methods design can help strengthen evaluation practices and the evaluative conclusions reached.

Here are three common mixed methods designs, each with its own purpose. Use these designs when you need to (1) see the whole picture, (2) dive deeper into your data, or (3) know what questions to ask.

1. When You Need to See the Whole Picture

The convergent parallel design allows evaluators to view the same aspect of a project from multiple perspectives, creating a more complete understanding. In this design, quantitative and qualitative data are collected simultaneously and then brought together in the analysis or interpretation stage.

For example, in an evaluation of a project whose goal is to attract underrepresented minorities into STEM careers, a convergent parallel design might include a student survey with Likert-scale questions about future career plans, as well as focus groups exploring students’ career motivations and aspirations. These data collection activities would occur at the same time. The two sets of data would then be brought together to inform a final conclusion.
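To make the integration step concrete, here is a minimal, hypothetical sketch in Python (using pandas; the data, column names, and themes are invented for illustration, not drawn from an actual ATE evaluation) of one way the two strands could be placed side by side in a simple joint display:

import pandas as pd

# Hypothetical quantitative strand: mean Likert-scale ratings from the student survey
survey = pd.DataFrame({
    "career_area": ["Engineering", "Biotech", "IT"],
    "mean_interest": [4.1, 3.2, 3.8],  # 1-5 Likert scale
})

# Hypothetical qualitative strand: coded focus group themes and how often they came up
focus_groups = pd.DataFrame({
    "career_area": ["Engineering", "Biotech", "IT"],
    "top_theme": ["hands-on projects", "unclear job options", "flexible schedules"],
    "mentions": [14, 9, 11],
})

# Convergent parallel integration: each strand is analyzed on its own, then merged
# side by side (a simple joint display) so findings can be interpreted together.
joint_display = survey.merge(focus_groups, on="career_area")
print(joint_display)

The point is not the code itself but the structure: each strand is summarized separately and only brought together at the interpretation stage.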

2. When You Need to Dive Deeper into Data

The explanatory sequential design uses qualitative data to further explore quantitative results. Quantitative data is collected and analyzed first. These results are then used to shape instruments and questions for the qualitative phase. Qualitative data is then collected and analyzed in a second phase.

For example, instead of conducting both a survey and focus groups at the same time, the survey would be conducted and its results analyzed before the focus group protocol is created. The focus group questions can then be designed to enrich understanding of the quantitative results. For instance, while the quantitative data might tell evaluators how many Hispanic students are interested in pursuing engineering, the qualitative data could follow up on students’ motivations behind those responses.

3. When You Need to Know What to Ask

The exploratory sequential design allows an evaluator to investigate a situation more closely before building a measurement tool, giving guidance on what questions to ask, what variables to track, or what outcomes to measure. It begins with qualitative data collection and analysis to investigate unknown aspects of a project. These results are then used to inform quantitative data collection.

If an exploratory sequential design were used to evaluate our hypothetical project, focus groups would first be conducted to explore themes in students’ thinking about STEM careers. After analysis of this data, conclusions would be used to construct a quantitative instrument to measure the prevalence of these discovered themes in the larger student body. The focus group data could also be used to create more meaningful and direct survey questions or response sets.

Intentionally choosing a design that matches the purpose of your evaluation will help strengthen evaluative conclusions. Studying different designs can also generate ideas for new ways to approach your evaluations.

For further information on these designs and more about mixed methods in evaluation, check out these resources:

Creswell, J. W. (2013). What is Mixed Methods Research? (video)

Frechtling, J., & Sharp, L. (Eds.). (1997). User-Friendly Handbook for Mixed Method Evaluations. National Science Foundation.

Watkins, D., & Gioia, D. (2015). Mixed methods research. Pocket guides to social work research methods series. New York, NY: Oxford University Press.

2016 High Impact Technology Exchange Conference (HI-TEC)

Posted on July 15, 2016

2016 High Impact Technology Exchange Conference (HI-TEC)
Pittsburgh, PA
July 25-28, 2016

Workshop

Logic Models: The Swiss Army Knife of Project Planning and Evaluation
Kelly Robertson
Lyssa Wilson

July 27, 2016 | 3:45-4:30 p.m.

A logic model is a graphic depiction of how a project translates its resources and activities into outcomes. Logic models are useful tools for succinctly communicating a project’s goals and activities, but they have many other applications. They provide a foundation for a project evaluation plan (and subsequent reporting) and can be used to organize the content of a grant proposal. In this session, participants will learn the basics of how to create a logic model, and we will demonstrate its use for planning a project evaluation and organizing a grant proposal.

Participants will receive the Evaluation Planning Checklist for ATE Proposals and ATE Project Logic Model Template.

For more information about the conference and to register, please visit http://www.highimpact-tec.org/

Resources:
Slides
Handout