Lyssa Wilson Becho

Research Associate, Western Michigan University

Lyssa leads the training elements of EvaluATE, including webinars, workshops, resources, and evaluation coaching. She also works with Valerie on strategy and reporting for the ATE annual survey. Lyssa is a senior research associate at The Evaluation Center at Western Michigan University and co-principal investigator for EvaluATE. She holds a Ph.D. in evaluation and has 7 years of experience conducting evaluations for a variety of local, national, and international programs.


Webinar: Making the Most of Your Evaluation: How to Use Evaluation Findings to Benefit Your Project

Posted on November 9, 2020 in Webinars

Presenter(s): Emma Leeburg, Lyssa Wilson Becho
Date(s): December 2, 2020
Time: 1:00 p.m. – 2:00 p.m. Eastern

Webinar Title: Making the Most of Your Evaluation: How to Use Evaluation Findings to Benefit Your Project

Join this webinar to learn how evaluation findings can be put to use for the benefit of ATE projects. We will address how evaluations can help projects adjust in uncertain times, how to integrate evaluation findings into your annual report to NSF, and how evaluation can help you think more intentionally about your long-term goals. Additionally, we will share real-world examples of the use and utility of evaluation. Whether you’re an evaluator, PI, project staff member, or grants professional, you’ll leave this webinar with new ideas for how to translate evaluation results into tangible benefits for ATE projects.

Resources:

Use It or Lose It: How to Get the Most Out of Your Project Evaluation

Posted on October 16, 2020 in Conferences

ATE PI Conference 2020

Monday, October 19 | 1:00–3:00 p.m. | Virtual

Join this workshop to learn how evaluation can be put to use for the benefit of ATE projects. We will address how evaluations can help projects adjust in uncertain times, how to integrate evaluation findings into your annual report to NSF, and how evaluation can help you think more intentionally about your long-term goals. Additionally, we will share real-world examples of the use and utility of evaluation. This interactive, virtual workshop will include group discussions, question-and-answer sessions, and small group activities. Whether you’re an evaluator, PI, project staff member, or grants professional, you’ll leave this workshop with new ideas for how to translate evaluation results into tangible benefits for ATE projects.

Download Resources

Handout

Checklist for Using Evaluative Findings

Slides

Other Relevant Resources

Three Questions to Spur Action from Your Evaluation Report (blog)

Beyond Reporting: Getting More Value Out of Your Evaluation (blog)

How Can You Make Sure Your Evaluation Meets the Needs of Multiple Stakeholders? (blog)

Tips for Evaluation Recommendations (blog)

Getting Ready to Reapply – Highlighting Results of Prior Support (blog)

How Is an NSF Project Outcomes Report Different from a Final Annual Report? (blog)

 

View Other ATE PI Conference Activities

Blog: Making the Most of Virtual Conferences: An Exercise in Evaluative Thinking

Posted on September 2, 2020 in Blog

Lyssa Wilson Becho, Research Associate, Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

We at EvaluATE affectionately call the fall “conference season.” Both the ATE PI Conference and the American Evaluation Association’s annual conference usually take place between October and November every year. This year, both conferences will be virtual events. Planning how our project will engage in this new virtual venue got me thinking: What makes a virtual conference successful for attendees? What would make a virtual conference successful for me?

I started by considering what makes an in-person conference successful, and I quickly realized that this was an exercise in evaluative thinking. The concept of evaluative thinking has been defined in a variety of ways—as a “type of reflective practice” (Baker & Bruner, 2012, p. 1), a combination of “critical thinking, creative thinking, inferential thinking, and practical thinking” (Patton, 2018, p. 21), and a “problem-solving approach” (Vo, 2013, p. 105). In this case, I challenged myself to consider what my personal evaluation criteria would be for a successful conference and what my ideal outcomes would look like.

In my reflection process, I came up with a list of key outcomes for attending a conference. Specifically, at conferences, I hope to:

  • build new relationships with peers;
  • grow relationships with existing partners;
  • learn about new trends in research and practice;
  • learn about future research opportunities (places I might be able to fill in the gaps); and
  • feel part of a community and re-energized about my work.

I realized that many of these outcomes are typically achieved through happenstance. For example, at previous conferences, most of my new relationships with peers occurred because of a hallway chat or because I sat next to someone in a session and we struck up a conversation and exchanged information. It’s unlikely these situations would occur organically in a virtual conference setting. I would need to be intentional about how I participated in a virtual conference to achieve the same outcomes.

I began to work backwards to determine what actions I could take to ensure I achieved these outcomes in a virtual conference format. In true evaluator fashion, I constructed a logic model for my virtual conference experience (shown in Figure 1). I realized I needed to identify specific activities—agreements with myself—to get the most out of the experience and have a successful virtual conference.

For example, one of my favorite parts of a conference is feeling like I am part of a larger community and becoming re-energized about my work. Being at home, it can be easy to become distracted and not fully engage with the virtual platform, potentially threatening these important outcomes. To address this, I have committed to blocking off time on my schedule during both conferences to authentically engage with the content and attendees.

How do you define a successful conference? What outcomes do you want to achieve in upcoming conferences that have gone virtual? While you don’t have to make a logic model out of your thoughts, I would challenge you to think evaluatively about upcoming conferences, asking yourself what you hope to achieve and how you can ensure that it happens.

Figure 1. Lyssa’s Logic Model to Achieve a Successful Virtual Conference

Webinar: How to Avoid Common Pitfalls When Writing Evaluation Plans for ATE Proposals

Posted on July 28, 2020 in Webinars

Presenter(s): Anastasia Councell, Emma Leeburg, Lyssa Wilson Becho
Date(s): August 19, 2020
Time: 1:00 p.m. – 2:00 p.m. Eastern
Recording: https://youtu.be/LTMShY2tM0o

Join this webinar to learn what pitfalls to watch out for when writing evaluation plans for grant proposals! In this webinar, we will share some of the biggest mistakes made in evaluation plans for ATE proposals and how to fix them. This webinar will go beyond EvaluATE’s current checklist for writing evaluation plans to highlight the good and the bad from real-world examples. Grant writers, project staff, and evaluators are encouraged to attend! Those completely new to grant writing may want to review the basic elements of an evaluation plan in our short video series prior to attending this webinar.

Resources:
Slides
Toolkit for Writing Evaluation Plans for ATE Proposals
Blog: Kirkpatrick Model for ATE Evaluation
Blog: Three Questions to Spur Action from Your Evaluation Report
Video Series: Evaluation: The Secret Sauce

Webinar: Adapting Evaluations in the Era of Social Distancing

Posted on April 27, 2020 in Webinars

Presenter(s): Anastasia Councell, Lyssa Wilson Becho, Michael Lesiecki
Date(s): May 27, 2020
Time: 1:00 p.m. – 2:00 p.m. Eastern
Recording: https://youtu.be/Ylo9p111Mcc

As we continue to social distance to keep our communities safe, evaluators and project stakeholders must think about and conduct evaluations in new ways. In this webinar, we will share 10 strategies for adapting to this new evaluation reality. These strategies will help participants rethink evaluation plans amidst project changes and disruptions, engage stakeholders virtually, and adapt to remote data collection. Participants will have a chance to hear from other evaluators and share their own successes and struggles with adjusting evaluation practices in the era of social distancing. This webinar will provide practical tools to apply to evaluation work during this time of uncertainty and change. 

Resources:
Slides
Chat Transcript
Handout
Additional Resources

Checklist: Communication Plan for ATE Principal Investigators and Evaluators

Posted on March 31, 2020 in Checklist

Creating a clear communication plan at the beginning of an evaluation can help project personnel and evaluators avoid confusion, misunderstandings, and uncertainty. The communication plan should be an agreement between the project’s principal investigator and the evaluator and should be followed by members of their respective teams. This checklist highlights the decisions that need to be made when developing a clear communication plan.

  • Designate one primary contact person from the project staff and one from the evaluation team. Clearly identify who should be contacted regarding questions, changes, or general updates about the evaluation. The project staff person should be someone who has authority to make decisions or approve small changes that might occur during the evaluation, such as the principal investigator or project manager.
  • Set up recurring meetings to discuss evaluation matters. Decide on the meeting frequency and platform for the project staff and evaluation team to discuss updates on the evaluation. These regular meetings should occur throughout the life of a project.
    • Frequency — At minimum, plan to meet monthly. Increase the frequency as needed to maintain momentum and meet key deadlines.
    • Platform — Real-time interaction via phone calls, web meetings, or in-person meetings will help ensure those involved give adequate attention to the matters being discussed. Do not rely on email or other asynchronous communication platforms.
    • Agenda — Tailor the agendas to reflect the aspects of the evaluation that need attention. In general, the evaluator should provide a status update, identify challenges, and explain what the project staff can do to facilitate the evaluation. The project staff should share important changes or challenges in the project, such as delays in timelines or project staff turnover. Conversations should close with clear action items and deadlines.
  • Agree on a process for reviewing and finalizing data collection instruments, data collection procedures, and evaluation reports. Determine the project staff’s role in providing input on instruments (such as questionnaires or interview protocols), the mechanisms by which data will be collected, and reports. Establish a turnaround time for feedback to avoid delays in implementing the evaluation.
  • Clarify who is responsible for disseminating reports. As a rule of thumb, responsibility and authority for the distribution of evaluation reports lie with the project’s principal investigator. Make it clear whether the evaluator may use the reports for their own purposes and under what conditions.

Downloads

Communication Checklist (PDF)

 

Webinar: Impact Evaluation: Why, What, and How

Posted on October 31, 2019 in Webinars

Presenter(s): Lyssa Wilson Becho, Michael Lesiecki
Date(s): December 11, 2019
Time: 1:00 p.m. – 2:00 p.m. EST
Recording: https://youtu.be/mRSoGtHQa7Q

Impact evaluation can be a powerful way to assess the long-term or broader effects of a project. Attention to causal inference, which attributes change to the project and its activities, sets impact evaluation apart from other types of evaluation. Impact evaluation can support deeper learning and direction for project scaling and future sustainability.

This webinar is an introduction to impact evaluation and how it can be realistically implemented in ATE projects. ATE principal investigators, project and center staff, and evaluators who attend this webinar will learn:
(1) the basic tenets of impact evaluation,
(2) strategies for determining causal attribution, and
(3) the resources needed to implement impact evaluation for your project.

Further Reading:
Impact Evaluation Video Series by UNICEF
Understanding Causes of Outcomes and Impacts by Better Evaluation
Strategies for Causal Attribution by Patricia Rogers and UNICEF
Establishing Cause and Effect by Web Center for Social Research Methods

Resources:
Slides
Three Questions to Determine Causality (handout)

Webinar: Evaluation: The Secret Sauce in Your ATE Proposal

Posted on July 3, 2019 in Webinars

Presenter(s): Emma Perk, Lyssa Wilson Becho, Michael Lesiecki
Date(s): August 21, 2019
Time: 1:00 p.m. – 2:30 p.m. Eastern
Recording: https://youtu.be/XZCfd7m6eNA

Planning to submit a proposal to the National Science Foundation’s Advanced Technological Education (ATE) program? Then this is a webinar you don’t want to miss! We will cover the essential elements of an effective evaluation plan and show you how to integrate them into an ATE proposal. We will also provide guidance on how to budget for an evaluation, locate a qualified evaluator, and use evaluative evidence to describe the results from prior NSF funding. Participants will receive the Evaluation Planning Checklist for ATE Proposals and other resources to help integrate evaluation into their ATE proposals.

An extended 30-minute Question and Answer session will be included at the end of this webinar. So, come prepared with your questions!

 

Resources:
Slides
External Evaluator Visual
External Evaluator Timeline
ATE Evaluation Plan Checklist
ATE Evaluation Plan Template
Guide to Finding and Selecting an ATE Evaluator
ATE Evaluator Map
Evaluation Data Matrix
NSF Evaluator Biosketch Template
NSF ATE Program Solicitation
Question and Answer Panel Recording

Webinar: Getting Everyone on the Same Page: Practical Strategies for Evaluator-Stakeholder Communication

Posted on May 1, 2019 in Webinars

Presenter(s): Kelly Robertson, Lyssa Wilson Becho, Michael Lesiecki
Date(s): May 22, 2019
Time: 1:00 p.m. – 2:00 p.m. Eastern
Recording: https://youtu.be/vld5Z9ZLxD4

To ensure high-quality evaluation, evaluators and project staff must collaborate on evaluation planning and implementation. Whether at the proposal stage or the official start of the project, setting up a successful dialog begins at the very first meeting between evaluators and project staff and continues throughout the duration of the evaluation. Intentional conversations and planning documents can help align expectations for evaluation activities, deliverables, and findings. In this webinar, participants will learn about innovative and practical strategies to improve communication between those involved in evaluation planning, implementation, and use. We will describe and demonstrate strategies developed from our own evaluation practice for

  • negotiating evaluation scope
  • keeping project staff up-to-date on evaluation progress and next steps
  • ensuring timely report development
  • establishing and maintaining transparency
  • facilitating use of evaluation results.

Resources:
Slides
Handouts

Checklist: Do’s and Don’ts: Basic Principles of Data Visualization

Posted on March 26, 2019

This quick guide covers 14 do’s and don’ts of data visualization. It is not intended to teach these principles but rather to serve as a reminder of them.

File: Click Here
Type: Doc
Category: Reporting & Use
Author(s): Emma Leeburg, Lyssa Wilson Becho