Amy Gullickson

Senior Lecturer & Academic Coordinator, Centre for Program Evaluation, University of Melbourne

Amy Gullickson earned her PhD from Western Michigan University in 2010, where she did her research on ATE centers that had integrated evaluation into their daily practice. She now works as a Senior Lecturer and Academic Coordinator for the Centre for Program Evaluation at the University of Melbourne in Australia. She serves as Co-PI on the FAS4ATE project and enjoys being connected to the ATE community – even when it means webinars happen at 3 a.m. her time.


Blog: What’s Wrong with Problem Based Learning?

Posted on April 22, 2015 in Blog


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

The Formative Assessment Systems for ATE project (FAS4ATE) focuses on assessment practices that serve the ongoing evaluation needs of projects and centers. Determining these information needs and organizing data collection activities is a complex and demanding task, and we’ve used logic models as a way to map them out. Over the next five weeks, we offer a series of blog posts that provide examples and suggestions of how you can make formative assessment part of your ATE efforts. – Arlen Gullickson, PI, FAS4ATE

Week 1 – Why background research on your intended solution is important

In FAS4ATE, we’ve been working with logic models to identify points in the life of projects where evaluative thinking and practices can help staff get more of the outcomes they are trying to create with ATE funds. One of those places is between the needs the project is trying to meet and the solutions staff members have in mind to address them.


Problem Based Learning (PBL) curriculum and activities are a staple input of ATE projects, one that ATE folks seem to universally agree meets the needs of student technicians. So you can imagine the shock, concern, and disbelief at the FAS4ATE workshop last October when Professor John Hattie, one of our keynote speakers from the University of Melbourne, challenged our group with the statement, "PBL doesn't work, and in fact, it can be detrimental." How could this be true?

The logic model can help, if we look at the fit between the learning needs of students and the kind of inputs chosen. Take PBL as an example. If students need to develop surface knowledge (i.e., learn, recall, and understand facts and ideas) PBL can actually have no effect or a detrimental effect. If students have that surface knowledge, but need to develop deep knowledge (e.g., apply concepts, compare facts and ideas, explain causes, develop skills), then PBL can have a huge positive effect on student learning (Hattie, 2009).

These conclusions are part of Hattie's (2009) synthesis of more than 800 meta-analyses of studies on student academic achievement. Specific to PBL, he synthesized the findings from eight meta-analyses, which together included 285 studies and 38,090 students. Six of the eight meta-analyses were conducted in postsecondary institutions.

So, PBL may be a great fit for the sorts of things students need to know and be able to do to be successful technicians. However, Hattie’s findings show the sequencing of PBL in a curriculum is critical – without that surface knowledge, a PBL approach could move students farther away from the impact an ATE project is trying to achieve.

No matter your role, this connection between needs and inputs is worthy of your attention. In a particular project, what has the choice of input been based on? If it’s simply because it’s something the staff have used successfully before, then it’s time to:

    • Ask questions about whether that same approach will work in the circumstances this project presents;
    • Dig into the literature for meta-analysis or other research that can help inform your choices (ask a librarian for help);
    • Check out ideas, research, resources and information at ATE Central (www.atecentral.net); and/or
    • Team up on an ATE research project to investigate what kinds of inputs and activities work best in what circumstances.

Additional resources:
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London; New York: Routledge. See also http://visible-learning.org/ for more information about his books and research.
The Structure of Observed Learning Outcomes (SOLO) taxonomy is a way to assess whether the tasks you assign require surface or deep learning: http://www.johnbiggs.com.au/academic/solo-taxonomy/.