Emma Leeburg

Project Manager, Western Michigan University

Emma Leeburg is a project manager at The Evaluation Center at Western Michigan University and the Managing Director and co-principal investigator for EvaluATE, the evaluation hub for the National Science Foundation’s Advanced Technological Education (ATE) program. She is a co-creator of the one-page report format and specializes in data communication and visualization. She has over seven years of evaluation experience, presenting webinars and workshops for national and international audiences and developing resources, newsletters, and reports.


Webinar: Making the Most of Your Evaluation: How to Use Evaluation Findings to Benefit Your Project

Posted on November 9, 2020 in Webinars

Presenter(s): Emma Leeburg, Lyssa Wilson Becho
Date(s): December 2, 2020
Time: 1 p.m. – 2 p.m. Eastern

Webinar Title: Making the Most of Your Evaluation: How to Use Evaluation Findings to Benefit Your Project

Join this webinar to learn how evaluation findings can be put to use for the benefit of ATE projects. We will address how evaluations can help projects adjust in uncertain times, how to integrate evaluation findings into your annual report to NSF, and how evaluation can help you think more intentionally about your long-term goals. Additionally, we will share real-world examples of evaluation use and utility. Whether you’re an evaluator, PI, project staff member, or grants professional, you’ll leave this webinar with new ideas for how to translate evaluation results into tangible benefits for ATE projects.


Webinar: How to Avoid Common Pitfalls When Writing Evaluation Plans for ATE Proposals

Posted on July 28, 2020 in Webinars

Presenter(s): Anastasia Councell, Emma Leeburg, Lyssa Wilson Becho
Date(s): August 19, 2020
Time: 1 p.m. – 2 p.m. Eastern
Recording: https://youtu.be/LTMShY2tM0o

Join this webinar to learn what pitfalls to watch out for when writing evaluation plans for grant proposals! In this webinar, we will share some of the biggest mistakes made in evaluation plans for ATE proposals and how to fix them. This webinar will go beyond EvaluATE’s current checklist for writing evaluation plans to highlight the good and the bad from real-world examples. Grant writers, project staff, and evaluators are encouraged to attend! Those completely new to grant writing may want to review the basic elements of an evaluation plan in our short video series prior to attending this webinar.

Resources:
Slides
Toolkit for Writing Evaluation Plans for ATE Proposals
Blog: Kirkpatrick Model for ATE Evaluation
Blog: Three Questions to Spur Action from Your Evaluation Report
Video Series: Evaluation: The Secret Sauce

Checklist: Project Vita

Posted on April 6, 2020

This checklist is designed to help with the creation of a project vita. Similar to an individual’s professional vita or resume, a project vita is a comprehensive index of factual information about a project’s activities and achievements. It documents past performance and demonstrates capacity for future endeavors. Tracking this information over the life of a project will make it easier to complete annual reports to sponsors, respond to information requests, and document achievements in funding applications. If the document is easy to find on the project’s website, stakeholders and other interested parties can easily see how productive (or not) the project has been. For a more dynamic vita, include links to supporting documents, staff biographies, or related web pages; this will allow users to quickly locate items referenced in the vita. As an example, here is our project vita.

This checklist suggests what to include in a vita and how to organize the information. Projects should tailor their vitae to their specific needs.

FRONT MATTER
This section should provide the basic details of the project at a glance.

  • Project name
  • Project logo
  • Website address
  • Phone number
  • Institutional name
  • Institutional logo(s)
  • Grant number(s)
  • Funder’s logo(s)

 

PURPOSE
Use this section to convey the project’s overall purpose.

  • Mission
  • Vision
  • Goals

 

FUNDING
Identify each grant, contract, or donation.

  • Total amount the project received
  • Years funded
  • Value per award
  • Sponsor/funder for each award

 

FACILITIES, EQUIPMENT, AND OTHER RESOURCES
List any specialized facilities and equipment that were purchased/upgraded with project funds.

  • Technical instruments
  • Lab facilities

 

ACTIVITIES AND PRODUCTS
List all key project activities and products, such as publications, events, courses, and presentations. Use a consistent reference style. Use subheadings to group similar items.

  • Presenter(s)/author(s)
  • Date
  • Title
  • Publisher information
  • Event venue and location

 

PEOPLE
List all individuals who have served as project staff, as well as advisors, consultants, and contributors.

Staff

    • Name
    • Position (e.g., principal investigator, data analyst, doctoral associate)
    • Dates on project

Advisory Committee Members

    • Name
    • Institution
    • Dates on project

Contributors and Consultants

    • Name
    • Institution
    • Date(s) of contributions
    • Type of contribution (e.g., blog author, external evaluator, industry partner)

To learn more about project vitae and their uses, see:

Smith, N. L., & Florini, B. M. (1993). The project vita as a documentation and evaluation tool for large-scale research and development projects. Evaluation and Program Planning, 16(1), 49–53.

Download

Checklist: Project Vita

Checklist: Getting Started with your Evaluation

Posted on March 31, 2020

Updated March 31, 2020

This checklist covers the basics of getting started with your ATE evaluation.

Evaluation Basics

    • Ask important questions about the project’s processes and outcomes
    • Gather evidence that will help answer those questions
    • Interpret findings and answer the evaluation questions
    • Use the information for accountability, improvement, and planning
    • Continue this process throughout the life of your project

Resources
Evaluation Primer
Evaluation Basics Video Series
Evaluation: The Secret Sauce To Your ATE Proposal (Video 1 – Why Evaluation?, Video 3 – Evaluation Questions, Video 4 – Data)
Data Collection Planning Matrix
Evaluation Process

Using Evaluation

    • Improve your project
    • Inform stakeholders
    • Fulfill grant requirements (annual report)

Resources
Evaluation: The Secret Sauce To Your ATE Proposal (Video 5 – Communication, Use, and Timeline)
Expectations to Change (E2C)
Stakeholder Engagement
Overcoming Writer’s Block – Strategies for Writing Your NSF Annual Report

Working with Your Evaluator

    • Make sure your evaluator’s contract is in place
    • Assign a point-person on your project team for evaluation matters
    • Schedule a recurring meeting with your evaluator
    • Make an appointment with your college’s data person
    • Set up a timeline for your evaluation
    • Commit to using your evaluation results

Resources
Principal Investigator “To Do” Checklist: Before Launching Your Project Evaluation
Communication Plan Checklist for ATE Principal Investigators and Evaluators
Evaluation: The Secret Sauce To Your ATE Proposal (Video 5 – Communication, Use, and Timeline)

Webinar: Evaluation: The Secret Sauce in Your ATE Proposal

Posted on July 3, 2019 in Webinars

Presenter(s): Emma Perk, Lyssa Wilson Becho, Michael Lesiecki
Date(s): August 21, 2019
Time: 1:00 p.m. – 2:30 p.m. Eastern
Recording: https://youtu.be/XZCfd7m6eNA

Planning to submit a proposal to the National Science Foundation’s Advanced Technological Education (ATE) program? Then this is a webinar you don’t want to miss! We will cover the essential elements of an effective evaluation plan and show you how to integrate them into an ATE proposal. We will also provide guidance on how to budget for an evaluation, locate a qualified evaluator, and use evaluative evidence to describe the results from prior NSF funding. Participants will receive the Evaluation Planning Checklist for ATE Proposals and other resources to help integrate evaluation into their ATE proposals.

An extended 30-minute Question and Answer session will be included at the end of this webinar. So, come prepared with your questions!

 

Resources:
Slides
External Evaluator Visual
External Evaluator Timeline
ATE Evaluation Plan Checklist
ATE Evaluation Plan Template
Guide to Finding and Selecting an ATE Evaluator
ATE Evaluator Map
Evaluation Data Matrix
NSF Evaluator Biosketch Template
NSF ATE Program Solicitation
Question and Answer Panel Recording

Checklist: Do’s and Don’ts: Basic Principles of Data Visualization

Posted on March 26, 2019

This quick guide covers 14 do’s and don’ts of data visualization. It is not intended to teach these do’s and don’ts but rather to serve as a reminder of them.

File: Click Here
Type: Doc
Category: Reporting & Use
Author(s): Emma Leeburg, Lyssa Wilson Becho

Blog: Repackaging Evaluation Reports for Maximum Impact

Posted on March 20, 2019 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Emma Perk, Managing Director, EvaluATE
Lyssa Wilson Becho, Research Manager, EvaluATE

Evaluation reports take a lot of time to produce and are packed full of valuable information. To get the most out of your reports, think about “repackaging” your traditional report into smaller pieces.

Repackaging involves breaking up a long-form evaluation report into digestible pieces to target different audiences and their specific information needs. The goals of repackaging are to increase stakeholders’ engagement with evaluation findings, increase their understanding, and expand their use.

Let’s think about how we communicate data to various readers. Bill Shander from Beehive Media created the 4×4 Model for Knowledge Content, which illustrates different levels at which data can be communicated. We have adapted this model for use within the evaluation field. As you can see below, there are four levels, and each has a different type of deliverable associated with it. We are going to walk through these four levels and how an evaluation report can be broken up into digestible pieces for targeted audiences.

Figure 1. The four levels of delivering evaluative findings (image adapted from Shander’s 4×4 Model for Knowledge Content).

The first level, the Water Cooler, is for quick, easily digestible data pieces. The idea is to use a single piece of data from your report to intrigue your viewer into wanting to learn more. Examples include a newspaper headline, a postcard, or a social media post. In a social media post, you should include a graphic (photo or graph), a catchy title, and a link to the next communication level’s document. This information should be succinct and exciting. Use this level to catch the attention of readers who might not otherwise be invested in your project.

Figure 2. Example of social media post at the Water Cooler level.

The Café level allows you to highlight three to five key pieces of data that you really want to share. A Café level deliverable is great for busy stakeholders who need to know detailed information but don’t have time to read a full report. Examples include one-page reports, a short PowerPoint deck, and short briefs. Make sure to include a link to your full evaluation report to encourage the reader to move on to the next communication level.

Figure 3. One-page report at the Café level.

The Research Library is the level at which we find the traditional evaluation report. Deliverables at this level require the reader to have an interest in the topic and to spend a substantial amount of time to digest the information.

Figure 4. Full evaluation report at the Research Library level.

The Lab is the most intensive and involved level of data communication. Here, readers have a chance to interact with the data. This level goes beyond a static report and allows stakeholders to personalize the data for their interests. For those who have the knowledge and expertise in creating dashboards and interactive data, providing data at the Lab level is a great way to engage with your audience and allow the reader to manipulate the data to their needs.

Figure 5. Data dashboard example from Tableau Public Gallery (click image to interact with the data).
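To make the Lab level concrete, here is a minimal sketch of an interactive chart, our own illustration rather than anything from the dashboard pictured above. It assumes Python with the pandas and plotly packages installed, and the data values are invented placeholders.

  import pandas as pd
  import plotly.express as px

  # Placeholder data for illustration only (not from a real evaluation).
  df = pd.DataFrame({
      "cohort": ["2017", "2018", "2019"] * 2,
      "outcome": ["Completed"] * 3 + ["Enrolled"] * 3,
      "students": [42, 55, 61, 120, 134, 150],
  })

  # Build a grouped bar chart; viewers can hover, zoom, and toggle series
  # from the legend, which is what separates the Lab level from a static
  # Research Library chart.
  fig = px.bar(df, x="cohort", y="students", color="outcome", barmode="group")
  fig.write_html("lab_level_example.html")  # self-contained HTML file to share or embed

Even a single interactive chart like this, embedded on a project webpage, lets readers explore the data at their own pace without the full apparatus of a dashboard.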

We hope this blog has sparked some interest in the different ways an evaluation report can be repackaged. Different audiences have different information needs and different amounts of time to spend reviewing reports. We encourage both project staff and evaluators to consider who their intended audience is and what would be the best level to communicate their findings. Then use these ideas to create content specific for that audience.

Blog: Evaluation Reporting with Adobe Spark

Posted on March 8, 2019 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Ouen Hunter, Doctoral Student, The Evaluation Center
Emma Leeburg, Project Manager, EvaluATE
Michael Harnar, Assistant Professor, Interdisciplinary Ph.D. in Evaluation, The Evaluation Center

This blog was originally published on AEA365 on December 28, 2018: https://aea365.org/blog/evaluation-reporting-with-adobe-spark-by-ouen-hunter-and-emma-perk/

Hi! We are Ouen Hunter (student at the Interdisciplinary Ph.D. in Evaluation Program, IDPE), Emma Perk (project manager at The Evaluation Center), and Michael Harnar (assistant professor at the IDPE) from Western Michigan University. Recently, we used PhotoVoice in our evaluation of an Upward Bound program and wanted to share how we reported our PhotoVoice findings using the cost-free version of Adobe Spark.

Adobe Spark offers templates to make webpages, videos, flyers, reports, and more. It also hosts your product online for free. While there is a paid version of Adobe Spark, everything we discuss in this blog can be done using the free version. The software is very straightforward, and we were able to get our report online within an hour. We chose to create a webpage to increase accessibility for a large audience.

The free version of Adobe Spark has a lot of features, but it can be difficult to customize the layout. Therefore, we created our layouts in PowerPoint and then uploaded them to Spark. This enabled us to customize the font, alignment, and illustrations. Follow these instructions to create a similar webpage:

  • Create a slide deck in PowerPoint, using one slide for each photo and its participant text. The first slide serves as a template for the rest.
  • After creating the slides, you have a few options for saving the photos for upload.
    1. Use a snipping tool (Windows’ snipping or Mac’s screenshot function) to take a picture of each slide and save it as a PNG file.
    2. Save each slide as a picture in PowerPoint by selecting the image and the speech bubble, right-clicking, and saving as a picture.
    3. Export as a PNG in PowerPoint. Go to File > Export, then select PNG under the File Format drop-down menu. This will save all the slides as individual image files. (For a scripted alternative, see the sketch after this list.)
  • Create a webpage in Adobe Spark.
    1. Once on the site, you will be prompted to start a new account (unless you’re a returning user). This will allow your projects to be stored and give you access to create in the software.
    2. You have the option to change the theme to match your program or branding by selecting the Theme button.
    3. Once you have selected your theme, you are ready to add a title and upload the photos you created from PowerPoint. To upload the photos, press the plus icon.
    4. Then select Photo.
    5. Select Upload Photo. Add all photos and confirm the arrangement.
    6. After finalizing, remember to post the page online and click Share to give out the link.
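If you have more than a handful of slides, the manual PNG export described in step 3 of the photo-saving options can be scripted. Below is a minimal sketch, our own addition rather than part of the original workflow: it assumes LibreOffice (the soffice command) and the pdf2image Python package (which requires poppler) are installed, and the file name photovoice_layouts.pptx is a hypothetical placeholder.

  import subprocess
  from pathlib import Path

  from pdf2image import convert_from_path  # pip install pdf2image

  def slides_to_pngs(pptx_file: str, out_dir: str = "slide_images") -> None:
      """Batch-convert a PowerPoint deck to one PNG per slide."""
      out = Path(out_dir)
      out.mkdir(exist_ok=True)
      # Step 1: render the whole deck to a single PDF with LibreOffice.
      subprocess.run(
          ["soffice", "--headless", "--convert-to", "pdf",
           "--outdir", str(out), pptx_file],
          check=True,
      )
      pdf_file = out / (Path(pptx_file).stem + ".pdf")
      # Step 2: rasterize each PDF page (one page per slide) to a PNG.
      for i, page in enumerate(convert_from_path(str(pdf_file), dpi=200), start=1):
          page.save(out / f"slide_{i:02d}.png", "PNG")

  slides_to_pngs("photovoice_layouts.pptx")  # hypothetical file name

The resulting PNGs can then be uploaded to Adobe Spark as described in the webpage steps above.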

Though we used Adobe Spark to share our PhotoVoice results, there are many applications for using Spark. We encourage you to check out Adobe Spark to see how you can use it to share your evaluation results.

Hot Tips and Features:

  • Adobe Spark adjusts automatically for handheld devices.
  • Adobe Spark also aligns elements automatically. No need to use a virtual ruler.
  • There are themes available with the free subscription, making it easy to design the webpage.
  • Select multiple photos during your upload. Adobe Spark will automatically separate each file for you.

*Disclaimer: Adobe Spark didn’t pay us anything for this blog. We wanted to share this amazing find with the evaluation community!

Blog: PhotoVoice: A Method of Inquiry in Program Evaluation

Posted on January 25, 2019 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

 

Ouen Hunter, Doctoral Student, The Evaluation Center
Emma Perk, Managing Director, EvaluATE
Michael Harnar, Assistant Professor, Interdisciplinary Ph.D. in Evaluation, The Evaluation Center

Hello, EvaluATE! We are Ouen Hunter (student at the Interdisciplinary Ph.D. in Evaluation, IDPE), Emma Perk (co-PI of EvaluATE at The Evaluation Center), and Michael Harnar (assistant professor at the IDPE) from Western Michigan University. We recently used PhotoVoice in our evaluation of a Michigan-based Upward Bound (UB) program (a college preparation program focused on 14- to 19-year-old youth living in low-income families in which neither parent has a bachelor’s degree).

PhotoVoice is a method of inquiry that engages participants in creating photographs and short captions in response to specific prompts. The photos and captions provide contextually grounded insights that might otherwise be unreachable by those not living that experience. We opted to use PhotoVoice because the photos and narratives could provide insights into participants’ perspectives that cannot be captured using close-ended questionnaires.

We created two prompts, in the form of questions, and introduced PhotoVoice in person with the UB student participants (see the instructional handout below). Students used their cell phones to take one photo per prompt. For confidentiality reasons, we also asked the students to avoid taking pictures of human faces. Students were asked to write a two- to three-sentence caption for each photo. The caption was to include a short description of the photo, what was happening in the photo, and the reason for taking the photo.


Figure 1: PhotoVoice Handout

PhotoVoice participation was part of the UB summer programming and was overseen by UB staff. Participants had two weeks to complete the tasks. After receiving the photographs and captions, we analyzed them using MAXQDA 2018. We coded the pictures and the narratives using an inductive thematic approach.

After the preliminary analysis, we went back to our student participants to see if our themes resonated with them. Each photo and caption was printed on a large sheet of paper (see figure 2 below) and posted on the wall. During a gallery walk, students were asked to review each photo and caption combination and to indicate whether they agreed or disagreed with our theme selections (see figure 3). We gave participants stickers and asked them to place the stickers in either the “agree” or “disagree” section at the bottom of each poster. After the gallery walk, we discussed the participants’ ratings to understand their photos and write-ups better.

Figure 2: Gallery walk layout (photo and caption on large pieces of paper)

Figure 3: Participants browsing the photographs

Using the participants’ insights, we finalized the analysis, created a webpage, and developed a two-page report for the program staff. To learn more about our reporting process, see our next blog. Below is a diagram of the activities that we completed during the evaluation.

Figure 4: Activities conducted in the Upward Bound evaluation

The PhotoVoice activity provided us with rich insights that we would not have received from the survey that was previously used. The UB student participants enjoyed learning about and being a part of the evaluation process. The program staff valued the reports and insights the method provided. The exclusion of faces in the photographs enabled us to avoid having to obtain parental permission to release the photos for use in the evaluation and by UB staff. Having the students use cell phone cameras kept costs low. Overall, the evaluation activity went over well with the group, and we plan to continue using PhotoVoice in the future.