Emma Perk

Project Manager Research, The Evaluation Center at Western Michigan University

Emma Perk is co-principal investigator and project manager for EvaluATE, the evaluation support center for the National Science Foundation’s Advanced Technological Education (ATE) program. She has five years of evaluation experience, including presenting webinars and workshops for national and international audiences and developing resources, newsletters, and reports. She oversees production of the newsletter, coordinates our webinars, works with both our CCLP and NVC, and coordinates outreach with our contributing authors. She has a degree in anthropology and specializes in graphic design, data visualization, and project coordination.


Blog: PhotoVoice: A Method of Inquiry in Program Evaluation

Posted on January 25, 2019 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Hello, EvaluATE! We are Ouen Hunter (student at the Interdisciplinary Ph.D. in Evaluation, IDPE), Emma Perk (co-PI of EvaluATE at The Evaluation Center), and Michael Harnar (assistant professor at the IDPE) from Western Michigan University. We recently used PhotoVoice in our evaluation of a Michigan-based Upward Bound (UB) program (a college preparation program focused on 14- to 19-year-old youth living in low-income families in which neither parent has a bachelor’s degree).

PhotoVoice is a method of inquiry that engages participants in creating photographs and short captions in response to specific prompts. The photos and captions provide contextually grounded insights that might otherwise be unreachable by those not living that experience. We opted to use PhotoVoice because the photos and narratives could provide insights into participants’ perspectives that cannot be captured using closed-ended questionnaires.

We created two prompts, in the form of questions, and introduced PhotoVoice in person with the UB student participants (see the instructional handout below). Students used their cell phones to take one photo per prompt. For confidentiality reasons, we also asked the students to avoid taking pictures of human faces. Students were asked to write a two- to three-sentence caption for each photo. The caption was to include a short description of the photo, what was happening in the photo, and the reason for taking the photo.


Figure 1: PhotoVoice Handout

PhotoVoice participation was part of the UB summer programming and overseen by the UB staff. Participants had two weeks to complete the tasks. After receiving the photographs and captions, we analyzed them using MAXQDA 2018. We coded the pictures and the narratives using an inductive thematic approach.

After the preliminary analysis, we went back to our student participants to see if our themes resonated with them. Each photo and caption was printed on a large sheet of paper (see figure 2 below) and posted on the wall. During a gallery walk, students were asked to review each photo-and-caption combination and to indicate whether they agreed or disagreed with our theme selections (see figure 3). We gave participants stickers and asked them to place the stickers in either the “agree” or “disagree” section at the bottom of each poster. After the gallery walk, we discussed the participants’ ratings to better understand their photos and write-ups.

Figure 2: Gallery walk layout (photo and caption on large pieces of paper)

Figure 3: Participants browsing the photographs

Using the participants’ insights, we finalized the analysis, created a webpage, and developed a two-page report for the program staff. To learn more about our reporting process, see our next blog. Below is a diagram of the activities that we completed during the evaluation.

Figure 4: Activities conducted in the Upward Bound evaluation

The PhotoVoice activity provided us with rich insights that we would not have received from the survey that was previously used. The UB student participants enjoyed learning about and being a part of the evaluation process. The program staff valued the reports and insights the method provided. The exclusion of faces in the photographs enabled us to avoid having to obtain parental permission to release the photos for use in the evaluation and by UB staff. Having the students use cell phone cameras kept costs low. Overall, the evaluation activity went over well with the group, and we plan to continue using PhotoVoice in the future.

Evaluation Process

Posted on March 14, 2018

This resource highlights the four main steps of an ATE evaluation and provides detailed activities for each step. It is an excerpt from the Evaluation Basics for Non-evaluators webinar. Access the slides, recording, handout, and additional resources at bit.ly/mar18-webinar.

Type: Doc
Category: Getting Started
Author(s): Emma Perk, Lori Wingate

Webinar: Creating One-Page Reports

Posted on March 13, 2018 in Webinars

Presenter(s): Emma Perk, Lyssa Becho
Date(s): April 18, 2018
Time: 1-2 p.m. Eastern
Recording: https://youtu.be/V2TBfz24RpY

One-page evaluation reports are a great way to provide a snapshot of a project’s activities and impact to stakeholders such as advisory groups, college administrators, and NSF program officers. Summarizing key evaluation facts in a format that is easily and quickly digestible engages the busy reader and can make your project stand out.

Although traditional, long-form evaluation reports remain an excellent way to distribute evaluation results, one-page reports increase the engagement, understanding, and use of evaluation findings, both for the current grant and for leveraging findings in potential follow-up grants.

In this webinar, we will provide you with the tools and resources you need to create effective one-page reports and share some examples that have worked well in our practice.

One-Page Report Resources

10 steps to creating one-page reports
One-page report worksheet
Slides
South Seattle One-Page Report

Blog: One Pagers: Simple and Engaging Reporting

Posted on December 20, 2017 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


Traditional, long-form reports are often used to detail the depth and specifics of an evaluation. However, many readers simply don’t have the time or bandwidth to digest a 30-page report. Distilling the key information into one page can help catch the eye of busy program staff, college administrators, or policy makers.

When we say “one pager,” we mean a single-page document that summarizes evaluation data, findings, or recommendations. It’s generally a stand-alone document that supplements a longer report, dataset, or presentation.

One pagers are a great way to give your client the information they need to make data-driven decisions. These summaries work well as companion documents for long reports or as a highlight piece for an interim report. We created a 10-step process to help facilitate the creation of a one pager. Additional materials are available, including detailed slides, grid layouts, videos, and more.

Ten-step process for creating a one pager:

1. Identify the audience

Be specific about who you are talking to and their information priorities. The content and layout of the document should be tailored to meet the needs of this audience.

2. Identify the purpose

Write a purpose statement that identifies why you are creating the one pager. This will help you decide what information to include or to exclude.

3. Prioritize your information

Categorize the information most relevant to your audience. Then rank each category from highest to lowest priority to help inform the layout of the document.

4. Choose a grid

Use a grid to intentionally organize elements visually for readers. Check out our free pre-made grids, which you can use for your own one pagers, and instructions on how to use them in PowerPoint (video).

5. Draft the layout

Print out your grid layout and sketch your design by hand. This will allow you to think creatively without technological barriers and will save you time.

6. Create an intentional visual path

Pay attention to how the reader’s eye moves around the page. Use elements like large numbers, ink density, and icons to guide the reader’s visual path. Keep in mind page symmetry and the need to balance visual density. For more tips, see Canva’s Design Elements and Principles.

7. Create a purposeful hierarchy

Use headings intentionally to help your readers navigate and identify the content.

8. Use white space

The brain subconsciously views content grouped together as a cohesive unit. Add white space to indicate that a new section is starting.

9. Get feedback

Run your designs by a colleague or client to help catch errors, note areas needing clarification, and ensure the document makes sense to others. You will likely need to go through a few rounds of feedback before the document is finalized.

10. Triple-check consistency

Triple-check, and possibly quadruple-check, for consistency of fonts, alignment, sizes, and colors. Style guides are a useful way to maintain consistency within and across documents. Take a look at EvaluATE’s style guide.

The demand for one pagers is growing, and now you are equipped with the information you need to succeed in creating one. So, start creating your one pagers now!

Report: 2017 ATE Annual Survey

Posted on December 6, 2017 in Annual Survey

Report: Advanced Technological Education 2017 Annual Survey

This report summarizes data gathered in the 2017 survey of ATE program grantees. Conducted by EvaluATE—the evaluation support center for the ATE program, located at The Evaluation Center at Western Michigan University—this was the 18th annual ATE survey. Included here are findings about funded projects and their activities, accomplishments, and impacts during the 2016 calendar year (2016 fiscal year for budget-related questions).

Type: Report
Category: ATE Annual Survey
Author(s): Arlen Gullickson, Emma Perk, Lori Wingate, Lyssa Becho

Resource: NVC Handbook

Posted on November 28, 2017 in NVC

The purpose of this handbook is to help those who are responsible for organizing, planning, or conducting NVC meetings. It is mainly intended for principal investigators (PIs), but other audiences include center staff, committee chairs, committee members, and others with responsibilities related to NVCs. The NVC Handbook is not intended to establish policy, nor does it necessarily apply to other NSF programs.

Type: Doc
Category: Resources
Author(s): Emma Perk, Lori Wingate, Wayne Welch

Checklist: Project Vita

Posted on July 12, 2017

This checklist is designed to help with the creation of a project vita. Similar to an individual’s professional vita or resume, a project vita is a comprehensive index of factual information about a project’s activities and achievements. It documents past performance and demonstrates capacity for future endeavors. Tracking this information over the life of a project will make it easier to complete annual reports to sponsors, respond to information requests, and document achievements in funding applications. If the document is easy to find on the project’s website, stakeholders and other interested parties can easily see how productive (or not) the project has been. For a more dynamic vita, include links to supporting documents, staff biographies, or related web pages; this will allow users to quickly locate items referenced in the vita. For an example of a project vita, see evalu-ate.org/vita. This checklist suggests what to include in a vita and how to organize the information. Projects should tailor their vitae to their specific needs.

Type: Checklist
Category: General
Author(s): Emma Perk

Resource: How to create a retrospective pre-post survey in Google Forms

Posted on March 21, 2017

We had a great question come in through Twitter:

“Q: Any clue how to structure items for retrospective pre-post surveys in ?”

A: Use a multiple choice grid.

See below for the steps or view the PDF.

1. Add a new question and select the multiple choice grid question type

2. Enter pre-post variables in row 1 and row 2

3. Enter measurement variables as columns

4. Preview survey question

5. Test the survey form

6. Launch survey and get results!

See it in Google Forms: https://goo.gl/forms/2si0dwb69foaTdKF3
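Once responses come in, Google Forms exports each grid row as its own column (e.g., “Question [Pre]” and “Question [Post]”), so pre and post ratings for each respondent sit side by side. Below is a minimal sketch of summarizing that export in Python; the column names and scale labels are hypothetical, so adjust them to match your actual form.

```python
# Sketch: summarizing a retrospective pre-post grid exported from Google Forms.
# The question text, row labels ("Pre"/"Post"), and rating scale below are
# hypothetical examples -- substitute your own.
import csv
import io

# Map the scale labels (grid columns) to numbers for analysis.
SCALE = {"Poor": 1, "Fair": 2, "Good": 3, "Excellent": 4}

# Stand-in for the exported responses file.
sample = io.StringIO(
    "Study skills [Pre],Study skills [Post]\n"
    "Poor,Good\n"
    "Fair,Excellent\n"
    "Good,Good\n"
)

pre, post = [], []
for row in csv.DictReader(sample):
    pre.append(SCALE[row["Study skills [Pre]"]])
    post.append(SCALE[row["Study skills [Post]"]])

mean_pre = sum(pre) / len(pre)
mean_post = sum(post) / len(post)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"Mean pre: {mean_pre:.2f}, mean post: {mean_post:.2f}, "
      f"mean change: {mean_change:.2f}")
```

Because each row of the export is one respondent, the pre and post ratings stay paired, which is what makes the retrospective design easy to analyze.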

Have questions? Feel free to contact Emma (emma.perk@wmich.edu).