Jason Burkhardt

EvaluATE Blog Editor

Jason is currently a project manager at the Evaluation Center at Western Michigan University. He is also a doctoral student in the Interdisciplinary PhD in Evaluation program. He enjoys music, art, and the finer things in life.


Blog: Reflections on the 2014 ATE PI Conference

Posted on November 6, 2014 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Well, the 2014 ATE PI conference has come and gone. First, let us say thank you to AACC for again hosting a wonderful conference. It was great to meet and share ideas with such an amazing group of professionals. We truly look forward to this conference every year. This blog entry contains the EvaluATE team’s reflections on our personal highlights from this year’s conference and hopes for next year.

Jason

Robots! Guitars! Technology in action! My biggest highlight was getting to see the variety of work being done within the ATE program. In our day-to-day work, it can be easy to forget how important the ATE program is, but we are truly at the forefront of technological education in the United States. I also enjoyed getting to see people I know from within the program. I hope next year’s conference brings even more amazing tech!

Lori

For me, the highlight of the conference was when ATE Program Co-Lead David Campbell quoted EvaluATE’s Emma Perk to a room of 100+ people: “The most important purpose of evaluation is not to prove, but to improve.” She had shared this quotation from Daniel Stufflebeam in her portion of the Getting Started workshop the day before. (View Emma and Jason’s Getting Started slides.) At next year’s conference, I hope there will be more presentations in the research and evaluation conference track. Participants in the preconference workshop on evaluation appreciated hearing about real-world evaluations and practical tips from seasoned ATE evaluators. We need more of this at every ATE PI conference! (Check out the workshop slides by Candiya Mann, Amy Nisselle, and Bruce Nash.)

Emma

This year was my first time attending the ATE PI conference. The showcase sessions were the highlight of the conference for me. I really enjoyed interacting with the different PIs and staff from all the projects and centers. It was great to learn more about the ATE community and how we can expand what we offer as a resource center. EvaluATE’s showcase booth was situated between ATE Central and Mentor-Connect, so we were able to reinforce our great relationships with them and refer people to their useful resources. My hope for next year is to host an evaluation session or roundtable focused on identifying the needs of the ATE community.

Corey

Unfortunately, I was unable to be at the ATE conference this year, so I missed the opportunity to put faces to names. As the annual survey coordinator, I communicate with many of you over the course of the year, and the conference is a chance to meet some of you face-to-face. I enjoy being able to talk in person with individuals about the ATE annual survey, to hear concerns, listen to suggestions, and talk data. If you didn’t see the latest reports based on the 2014 survey—like our data snapshots on the representation of women and underrepresented minorities in ATE—check them out here: http://www.evalu-ate.org/annual_survey/


We look forward to seeing you all at the conference next year! For more highlights from this year’s conference, including pictures, please visit our social media pages.

Newsletter: Everyday Evaluation

Posted on October 1, 2014 in Newsletter

At EvaluATE, evaluation is a shared responsibility. We have a wonderful external evaluator, Dr. Lana Rucks, whom we meet with in person a few times a year and talk with by phone about every other month. Dr. Rucks is responsible for determining our center’s mid- and long-term impact on the individuals who engage with us and on the ATE projects they influence. We supplement her external evaluation with surveys of workshop and webinar participants to obtain their immediate feedback on our activities. In addition, we carefully track the extent to which we are reaching our intended audiences. But for our team, evaluation is not just about the formal activities related to data collection and analysis. It’s how we do our work on a daily basis. Here are some examples:

  • Everyone gives and gets constructive criticism. Every presentation, webinar, newsletter article, or other product we create gets reviewed by the whole team. This improves our final products, whether it means catching embarrassing typos, completely revamping a presentation to improve its relevance, or going back to the drawing board. We all have thick skins and understand that criticism is not personal; it’s essential to high-quality work.
  • We are willing to admit when something’s not working or when we’ve bitten off more than we can chew. We all realize it’s better to scrap an idea early and refocus rather than push it to completion with mediocre results.
  • We look backward when moving forward. For example, when we begin developing a new webinar, we review the feedback from the previous one to determine what our audiences perceived as its strengths and weaknesses. Perhaps the most painful yet valuable exercise is watching the recording of a prior webinar together, stopping to note what really worked and what didn’t—from the details of audio quality to the level of audience participation.
  • We engage our advisors. Getting an external perspective on our work is invaluable: our advisors ask us tough questions and prompt us to check our assumptions.
  • We use data every day. Whether determining which social media strategies are most effective or identifying which subgroups within our ATE constituency need more attention, we use the data we have in hand to inform decisions about our operations and priorities.
  • We use our mission as a compass to plot our path forward. We are faced with myriad opportunities in the work that we do as a resource center. We consider options in terms of their potential to advance our mission. That keeps us focused and ensures that resources are expended on mission-critical efforts.

Integrating these and other evaluative activities and perspectives into our daily work gives us better results, as is apparent in our formal evaluation findings. Importantly, we share a belief that excellence is never achieved—it is something we continually strive for. What we did yesterday may have been pretty good, but we believe we can do better tomorrow.

As you plan your evaluation for this year, consider things you can do with your team to critique and improve your work on an ongoing basis.

Newsletter: Critical Friend

Posted on October 1, 2014 in Newsletter


The term critical friend describes a stance an evaluator can take in their relationship with the program or project they evaluate. Costa and Kallick (1993) provide this seminal definition: “A trusted person who asks provocative questions, provides data to be examined through another lens, and offers critique of a person’s work as a friend” (p. 50).

The relationship between a project and an evaluator who is a critical friend is one in which the evaluator has the best interests of the project at heart and the project staff trust that this is the case. The evaluator may see their role as both a trusted advisor and a staunch critic, pushing the project to achieve its goals in the most effective way possible while maintaining independence. The evaluator helps the project staff view information in different ways, while remaining sensitive to the staff’s own views and priorities. The evaluator will call attention to negative or less effective aspects of a project, but will do so constructively. By pointing out potential pitfalls and flaws, the critical friend evaluator can help the project grow and improve.

To learn more…

Costa, A. L., & Kallick, B. (1993). Through the lens of a critical friend. Educational Leadership, 51(2), 49-51. http://bit.ly/crit-friend

Rallis, S. F., & Rossman, G. B. (2000). Dialogue for learning: Evaluator as critical friend. New Directions for Evaluation, 86, 81-92.

Blog: Welcome to the EvaluATE Blog!

Posted on September 26, 2014 in Blog


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

The EvaluATE blog is a way for you, the ATE community members, to share your knowledge and experience, particularly as it relates to evaluation. The blog’s content is entirely community-driven and features eight major topics:

• Evaluation Management
• Proposals
• Evaluation Design
• Data Collection, Analysis, and Interpretation
• Reporting
• Evaluation Use
• General Issues
• Discipline-Specific Issues

The ATE program has a reputation for having a strong community whose members collaborate and support each other for the greater good of improving advanced technological education. What are your ATE evaluation lessons learned? Can you help a new ATE PI or evaluator avoid common pitfalls? Do you know of a great evaluation instrument or strategy? Share your knowledge and help continue to build the ATE community by contributing to the EvaluATE blog. Submission guidelines are available on our website, and we look forward to your submissions!

In the meantime, we hope you’ll take the opportunity to engage with us through our other social media channels.

LinkedIn – We are replacing the ATE evaluator directory with our LinkedIn page. If you are not a member already, make sure to sign up soon! This is a great way for evaluators and PIs to connect.

Twitter – Don’t miss out on being the first to hear about new developments in ATE and STEM education evaluation! We use Twitter to share notifications about our events and other great opportunities, updates from NSF, and interesting information about STEM evaluation in general.

Facebook – Did you know that over 70 ATE projects and centers have Facebook pages? We have a lot of connections with the ATE and evaluation communities, so our page is a good place to check in on what’s going on in the ATE community.

Pinterest – Are you overwhelmed by the number of evaluation websites out there? Our Pinterest boards give you a shortcut to the best evaluation websites with ATE-relevant content. Check out our boards on Evaluation Organizations, Evaluation Journals, NSF-ATE evaluation, EvaluATE staff picks, and more.

Website – Of course, you are already here, but our all-new website features a streamlined interface, easier navigation to important content, enhanced search functionality, and a renovated resource library. Let us know what you think!

So come join us on all of our social media and content channels! We hope to see you soon!

The EvaluATE Team

Here is a small sample of upcoming blog posts:

• Figures at your fingertips: Making the case for formative evaluation
• Preparing for the ATE PI Conference
• Managing your evaluator
• Evaluating impact: How I moved from pipeline to interstate

Newsletter: Tools to Prepare a Data Management Plan for NSF

Posted on July 1, 2014 in Newsletter


NSF requires that ALL proposals include a data management plan (DMP); FastLane will not accept submissions without one. The DMP must detail “how you will conform to NSF policy on the dissemination and sharing of research results.” The term “research results” basically means any information collected or produced as a result of your project. Therefore, the DMP must detail what data you will collect and how you will collect, maintain, report, and disseminate those data, as well as other resources generated by your grant. While NSF does outline requirements for what should be included in a DMP (bit.ly/dmp-ehr), it does not tell you how to write one. Fortunately, there are a handful of resources that can help.

The University of Wisconsin Research Data Services Unit has a webpage that provides several links to resources (http://researchdata.wisc.edu/), and the University of Michigan features extensive guidance, including templates and worksheets (bit.ly/um-dmp). The University of Minnesota also offers several resources for DMP development (bit.ly/umn-dmp).

One other helpful resource is the DMPTool, available at DMPTool.org. You fill out your plan as you work through the tool, and you can save plans for later. The tool provides extensive guidance on DMP development, with instructions for each part of the plan, guidance on how to complete each section, and helpful links. ATE Central includes guidance, resources, and an example plan in its handbook, available at atecentral.net/handbook, and also provides archive services for resources produced by ATE projects and centers (which supports sustainability). A new requirement in the 2014 ATE program solicitation is that grantees “must provide copies of [their] resources to ATE Central for archiving purposes.”

If you can demonstrate that you followed the data management plan for a prior grant, and that you provided access to the information and resources your project or center generated, you can even use this information in the Results of Prior Support section of your next proposal.
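
To make this concrete, here is a rough sketch of how a DMP might be outlined for a typical ATE project, based on NSF’s general DMP guidance; the parenthetical examples are illustrative, not prescribed:

• Types of data and materials to be produced (e.g., survey responses, interview notes, curricula, software, reports)
• Standards to be used for data and metadata format and content
• Policies for access and sharing, including protection of privacy and confidentiality
• Policies for re-use, redistribution, and production of derivatives
• Plans for archiving data and preserving access (e.g., depositing resources with ATE Central)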

Newsletter: From ANCOVA to Z Scores

Posted on January 1, 2014 in Newsletter


The Evaluation Glossary App features more than 600 terms related to evaluation and assessment. Designed for both evaluators and those who work with evaluators, the app provides three ways to access the terms. The first is to browse alphabetically, like a dictionary. The second is to view terms by one of eight categories: 1) data analysis; 2) data collection; 3) ethics and guidelines; 4) evaluation design; 5) miscellaneous; 6) program planning; 7) reporting and utilization; and 8) types of evaluation. The categories are a great starting point for users who are less familiar with evaluation lingo. The final option is a basic search function, useful to anyone who needs a quick definition of an evaluation term. Each entry provides a citation for the definition’s source and cross-references related terms in the glossary.

App author: Kylie Hutchinson of Community Solutions. Free for Android and iOS; available wherever you get apps for your mobile device or from communitysolutions.ca/web/evaluation-glossary/.