
Newsletter: 2017 Spring

Posted on April 26, 2017 in Newsletter


This issue of EvaluATE’s newsletter is all about reporting. It highlights a blog on how to include evaluation results in Advanced Technological Education (ATE) annual reports to NSF, resources on alternative report formats, data visualization resources, and tips for throwing a party (a data party, that is).

Reporting Project Evaluation Results in NSF Annual Reports

National Science Foundation grantees are required to submit annual reports through Research.gov. ATE principal investigators should include information from their project’s or center’s evaluation in these reports. But when you look at the required sections, you will not see one that says, “Evaluation Results.” That would be too easy! Lori Wingate’s recent blog, “What Goes Where? Reporting Evaluation Results to NSF,” offers straightforward guidance for this task.

Reimagining Evaluation Reports

A detailed technical report is by far the most common means of communicating evaluation results. If you need something different to share with your project’s stakeholders, get some fresh ideas from BetterEvaluation’s overview and resources on alternative reporting media. These include newsletters, postcards, web conferences, posters, videos, cartoons, and infographics, to name just a few. If you simply need an efficient way to report project facts in a no-nonsense manner, try naked reporting, a way of communicating essential project information with minimal descriptive text.

Data Visualization Resources

Great data visualization can make a report more readable and easier to understand. Poor data visualization can make it confusing and look unprofessional. Whether you are reporting within your organization or to external audiences, good charts help communicate project achievements. Check out Ann Emery’s instructional videos to boost your visualization skills in Excel. To make sure you are getting it right, review your work against the Data Visualization Checklist by Stephanie Evergreen and Ann Emery. Both of their websites include lots of other helpful information on data visualization.
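These principles are not Excel-specific. As a purely illustrative sketch (not from EvaluATE or the checklist authors), the short Python/matplotlib example below uses made-up survey percentages to show a few habits commonly recommended in data visualization guidance: a title that states the takeaway, labels placed directly on the bars instead of a legend, and removal of gridlines and tick marks that the labels make redundant.

import matplotlib.pyplot as plt

# Hypothetical example data: percentage of participants reporting each outcome
outcomes = ["Gained technical skills", "Earned a credential", "Changed jobs"]
percentages = [82, 54, 31]

fig, ax = plt.subplots(figsize=(6, 3))
bars = ax.barh(outcomes, percentages, color="#4a7ebb")
ax.invert_yaxis()  # list outcomes top to bottom in the order given above

# Label each bar directly so readers do not have to trace values back to an axis
for bar, value in zip(bars, percentages):
    ax.text(value + 1, bar.get_y() + bar.get_height() / 2, f"{value}%", va="center")

# Remove clutter the direct labels make unnecessary: x-axis ticks and most spines
ax.set_xlim(0, 100)
ax.set_xticks([])
for spine in ["top", "right", "bottom"]:
    ax.spines[spine].set_visible(False)

# State the takeaway in the title rather than repeating the variable names
ax.set_title("Most participants reported gaining technical skills")

plt.tight_layout()
plt.savefig("outcomes_chart.png", dpi=150)

The same ideas carry over directly to Excel: delete the legend and gridlines, add data labels, and replace the default chart title with a short sentence that states the finding.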

Party Time!

That’s right, time for a “data party.” Sharing data with stakeholders doesn’t have to be boring. In “Have a Party to Share Evaluation Results,” an AEA365 blog by Kendra Lewis, you’ll learn about gallery walks, data placemats, and other fun and memorable ways to engage stakeholders in reviewing and making sense of evaluation data.

Evaluation Reporting Checklist

Are you writing an evaluation report? Check out EvaluATE’s Evaluation Reporting Checklist, Version 1.1 (a new version is being developed; stay tuned). This checklist is full of helpful guidance for developing comprehensive and straightforward evaluation reports. EvaluATE is currently looking for a few volunteers to pilot Version 1.2 and provide feedback. If you are interested, please send an email to kelly.robertson@wmich.edu; you’ll be among the first to review the new and improved checklist and help shape the final product. Just want to learn more about evaluation reporting? Watch the recording of EvaluATE’s December 2016 webinar, Anatomy of a User-Friendly Evaluation Report.

 


American Evaluation Association (AEA) Summer Evaluation Institute

This year’s AEA Summer Evaluation Institute is June 4-7 in Atlanta, Georgia. EvaluATE’s director, Lori Wingate, is giving two workshops on Identifying Evaluation Questions. You can also learn about evaluation theory, survey design, logic modeling, evaluating collaborations, data visualization, and much more from a great line-up of instructors.

Did you miss our last webinar? Get the slides, recording, and handout here.

Recent EvaluATE Blogs:

See more blogs at evalu-ate.org/blog.

Newsletter: 2017 Winter

Posted on January 18, 2017 in Newsletter


ADAPTING EVALUATION DESIGN TO DATA REALITIES

“What is your biggest challenge working as an ATE evaluator?” Twenty-three evaluators who applied for funding from EvaluATE to attend the 2016 Advanced Technological Education Principal Investigators Conference answered that question. One of the most common responses was along the lines of “insufficient data.” In this issue of EvaluATE’s newsletter, we highlight resources that evaluators and project staff can turn to when plans need to be adjusted to ensure an evaluation has adequate data. (Another common theme was “communication between project and evaluation personnel,” but that’s a topic for a future issue.)

Scavenge Data

One of the biggest challenges many evaluators encounter is getting people to participate in data collection efforts, such as surveys and focus groups. In her latest contribution to EvaluATE’s blog, Lori Wingate discusses Scavenging Evaluation Data. She identifies two ways to get useful data that don’t require the cooperation of project participants.

Get Real

RealWorld Evaluation is a popular text among evaluators because the authors recognize that evaluations are often conducted under less-than-ideal circumstances with limited resources. Check out the companion website, which includes a free 20-page PDF summary of the book.

Check Timing When Changing Plans

For ATE projects, it is OK to use data collection methods that were not included in the original evaluation plan—as long as there is a good rationale. But be realistic about how much time it takes to develop new data collection instruments and protocols. For a reality check, see the Time Frame Estimates for Common Data Collection Activities in Guidelines for Working with Third-Party Evaluators.

Repurpose Existing Data

Having trouble getting data from project participants? Try using secondary data to supplement your primary evaluation data. In Look No Further: Potential Sources of Institutional Data, institutional research professionals from the University of Washington Bothell describe several types of institutional data that can be used in project evaluations at colleges and universities.

Upcoming Webinars

Did you miss our recent webinars?

Check out the slides, handouts, and recordings from our August and December webinars:

Want to receive our newsletter via email?

Join our mailing list

Newsletter: 2016 Fall

Posted on October 19, 2016 in Newsletter


Happy New Year!

The calendar year may be coming to a close, but a new academic year just started and many ATE program grantees recently received their award notifications from the National Science Foundation. ‘Tis the season to start up or revisit evaluation plans for the coming year. This digital-only issue of EvaluATE’s newsletter is all about helping project leaders and evaluators get the new evaluation year off on the right track.

Don’t launch (or relaunch) your evaluation before taking these steps


Mentor-Connect’s one-page checklist tells project leaders what they need to do to set the stage for a successful evaluation.

You won’t hear this from anyone else


EvaluATE’s director, Lori Wingate, shares Three Inconvenient Truths about ATE Evaluation in her latest contribution to the EvaluATE blog. You may find them unsettling, but ignorance is not bliss when it comes to these facts about evaluation.

Is your evaluation on track?


Use the Evaluation Progress Checklist to make sure your evaluation is on course. It’s on pages 26-28 in Westat’s Guidelines for Working with Third Party Evaluators, which also includes guidance for resolving problems and other tips for nonevaluators.

Myth: All evaluation stakeholders should be engaged equally


Monitor, facilitate, consult, or co-create? Use our stakeholder identification worksheet to figure out the right way to engage different types of stakeholders in your evaluation.

EvaluATE at the ATE PI Conference: October 26-29

A Practical Approach to Outcome Evaluation: Step-by-Step
WORKSHOP: Wednesday 1-4 p.m.
DEMONSTRATION: Thursday 4:45-5:15 p.m.

SHOWCASES: We will be at all three showcase sessions.

Check out the conference program.

Next Webinar


Did you miss our recent webinars?

Check out slides, handouts, and recordings


Shape the future of EvaluATE

EvaluATE has been funded for another four years! Let us know how you would like us to invest our resources to advance evaluation in the ATE program.

Complete our two-minute survey today.

Want to receive our newsletter via email?


Checklist: The Common Guidelines for Education Research and Development

Posted on January 28, 2015

This document includes a series of six checklists—one for each of the six types of research outlined in the Common Guidelines for Education Research and Development. The Guidelines, developed by the Institute of Education Sciences at the U.S. Department of Education and the National Science Foundation, explains those agencies’ shared expectations for education research and development. The checklists, created by EvaluATE, are distillations of key points from the Guidelines. The checklists are intended to support use of the Guidelines, enabling users to quickly reference a type of research and determine whether they are following the Guidelines’ expectations. As such, they provide an overview and orientation to the Guidelines. They do not replace that report, nor do they expand or elaborate on the report’s content. The checklists’ content has been extracted (usually verbatim) from the full report. All checklist users are strongly encouraged to read the complete Guidelines, available from http://bit.ly/nsf-ies_guide.

A graphic overview of the Common Guidelines is available here.

You may download the entire guidelines checklist or go directly to the checklist for each type of research by clicking on the links below:

1. Foundational Research to advance the frontiers of education and learning; develop and refine theory and methodology; and provide fundamental knowledge about teaching and/or learning

2. Early-Stage or Exploratory Research to investigate approaches to education problems to establish the basis for design and development of new interventions or strategies and/or to provide evidence for whether an established intervention or strategy is ready to be tested in an efficacy study

3. Design and Development Research to develop new or improved interventions or strategies to achieve well-specified learning goals or objectives, including making refinements on the basis of small-scale testing

4. Efficacy Research to determine whether an intervention or strategy can improve outcomes under “ideal” conditions (e.g., with more implementation support, highly trained personnel, and/or more homogenous participants than is typical)

5. Effectiveness Research to estimate the impacts of an intervention or strategy when implemented under conditions of routine practice (i.e., conditions similar to what would occur if a study were not being conducted)

6. Scale-Up Research to estimate the impacts of an intervention or strategy under conditions of routine practice and across a broad spectrum of populations and settings, sufficiently diverse to broadly generalize findings

Checklist: Common Guidelines: Type 1 – Foundational Research

Posted on December 10, 2014

This checklist is a distillation of key points from the Common Guidelines for Education Research and Development regarding Foundational Research. The Guidelines, developed by the Institute of Education Sciences at the U.S. Department of Education and the National Science Foundation, explains those agencies’ shared expectations for education research and development. This checklist, created by EvaluATE, is intended to support use of the Guidelines, enabling users to quickly reference those that specifically relate to Foundational Research. As such, it provides an overview and orientation to the Guidelines. It does not replace the Guidelines nor does it expand or elaborate on that report’s content. The checklist’s content has been extracted (usually verbatim) from the full report. All checklist users are strongly encouraged to read the complete Guidelines.

The complete guide and checklists for the other five types of research outlined in the Guidelines are available here.

A graphic overview of the Common Guidelines is available here.

TYPE 1: FOUNDATIONAL RESEARCH to advance the frontiers of education and learning; develop and refine theory and methodology; and provide fundamental knowledge about teaching and/or learning.

Type: Checklist
Category: Resources
Author(s): EvaluATE

Checklist: Common Guidelines: Type 2 – Early Stage or Exploratory Research

Posted on December 10, 2014

This checklist is a distillation of key points from the Common Guidelines for Education Research and Development regarding Early-Stage or Exploratory Research. The Guidelines, developed by the Institute of Education Sciences at the U.S. Department of Education and the National Science Foundation, explains those agencies’ shared expectations for education research and development. This checklist, created by EvaluATE, is intended to support use of the Guidelines, enabling users to quickly reference those that specifically relate to Early-Stage or Exploratory Research. As such, it provides an overview and orientation to the Guidelines. It does not replace the Guidelines nor does it expand or elaborate on that report’s content. The checklist’s content has been extracted (usually verbatim) from the full report. All checklist users are strongly encouraged to read the complete Guidelines.

The complete guide and checklists for the other five types of research outlined in the Guidelines are available here.

A graphic overview of the Common Guidelines is available here.

TYPE 2: EARLY-STAGE OR EXPLORATORY RESEARCH to investigate approaches to education problems to establish the basis for design and development of new interventions or strategies and/or to provide evidence for whether an established intervention or strategy is ready to be tested in an efficacy study.

Type: Checklist
Category: Resources
Author(s): EvaluATE

Checklist: Common Guidelines: Type 3 – Design and Development Research

Posted on December 10, 2014

This checklist is a distillation of key points from the Common Guidelines for Education Research and Development regarding Design and Development Research. The Guidelines, developed by the Institute of Education Sciences at the U.S. Department of Education and the National Science Foundation, explains those agencies’ shared expectations for education research and development. This checklist, created by EvaluATE, is intended to support use of the Guidelines, enabling users to quickly reference those that specifically relate to Design and Development Research. As such, it provides an overview and orientation to the Guidelines. It does not replace the Guidelines nor does it expand or elaborate on that report’s content. The checklist’s content has been extracted (usually verbatim) from the full report. All checklist users are strongly encouraged to read the complete Guidelines.

The complete guide and checklists for the other five types of research outlined in the Guidelines are available here.

A graphic overview of the Common Guidelines is available here.

TYPE 3: DESIGN AND DEVELOPMENT RESEARCH to develop new or improved interventions or strategies to achieve well-specified learning goals or objectives, including making refinements on the basis of small-scale testing

Type: Checklist
Category: Resources
Author(s): EvaluATE

Checklist: Common Guidelines: Type 4 – Efficacy Research

Posted on December 10, 2014

This checklist is a distillation of key points from the Common Guidelines for Education Research and Development regarding Efficacy Research. The Guidelines, developed by the Institute of Education Sciences at the U.S. Department of Education and the National Science Foundation, explains those agencies’ shared expectations for education research and development. This checklist, created by EvaluATE, is intended to support use of the Guidelines, enabling users to quickly reference those that specifically relate to Efficacy Research. As such, it provides an overview and orientation to the Guidelines. It does not replace the Guidelines nor does it expand or elaborate on that report’s content. The checklist’s content has been extracted (usually verbatim) from the full report. All checklist users are strongly encouraged to read the complete Guidelines.

The complete guide and checklists for the other five types of research outlined in the Guidelines are available here.

A graphic overview of the Common Guidelines is available here.

TYPE 4: EFFICACY RESEARCH to determine whether an intervention or strategy can improve outcomes under “ideal” conditions (e.g., with more implementation support, highly trained personnel, and/or more homogenous participants than is typical).

Type: Checklist
Category: Resources
Author(s): EvaluATE

Checklist: Common Guidelines: Type 5 – Effectiveness Research

Posted on December 10, 2014

This checklist is a distillation of key points from the Common Guidelines for Education Research and Development regarding Effectiveness Research. The Guidelines, developed by the Institute of Education Sciences at the U.S. Department of Education and the National Science Foundation, explains those agencies’ shared expectations for education research and development. This checklist, created by EvaluATE, is intended to support use of the Guidelines, enabling users to quickly reference those that specifically relate to Effectiveness Research. As such, it provides an overview and orientation to the Guidelines. It does not replace the Guidelines nor does it expand or elaborate on that report’s content. The checklist’s content has been extracted (usually verbatim) from the full report. All checklist users are strongly encouraged to read the complete Guidelines.

The complete guide and checklists for the other five types of research outlined in the Guidelines are available here.

A graphic overview of the Common Guidelines is available here.

TYPE 5: EFFECTIVENESS RESEARCH to estimate the impacts of an intervention or strategy when implemented under conditions of routine practice (i.e., conditions similar to what would occur if a study were not being conducted).

Type: Checklist
Category: Resources
Author(s): EvaluATE