Archive: critical thinking

Blog: An Evaluative Approach to Proposal Development*

Posted on June 27, 2019 in Blog

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

A student came into my office to ask me a question. Soon after she launched into her query, I stopped her and said I wasn’t the right person to help because she was asking about a statistical method that I wasn’t up-to-date on. She said, “Oh, you’re a qualitative person?” And I answered, “Not really.” She left looking puzzled. The exchange left me pondering the vexing question, “What am I?” (Now imagine these words echoing off my office walls in a spooky voice for a couple of minutes.) After a few uncomfortable moments, I proudly concluded, “I am a critical thinker!”  

Yes, evaluators are trained specialists with an arsenal of tools, strategies, and approaches for data collection, analysis, and reporting. But critical thinking—evaluative thinking—is really what drives good evaluation. In fact, the very definition of critical thinking—“the mental process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and evaluating information to reach an answer or conclusion”2—describes the evaluation process to a T. Applying your critical, evaluative thinking skills in developing your funding proposal will go a long way toward ensuring your submission is competitive.

Make sure all the pieces of your proposal fit together like a snug puzzle. Your proposal needs both a clear statement of the need for your project and a description of the intended outcomes—make sure these match up. If you struggle with the outcome measurement aspect of your evaluation plan, go back to the rationale for your project. If you can observe a need or problem in your context, you should be able to observe the improvements as well. Show linkages between the need you intend to address, your activities and products, and expected outcomes.

Be logical. Develop a logic model to portray how your project will translate its resources into outcomes that address a need in your context. Sometimes simply putting things in a graphic format can reveal shortcomings in a project’s logical foundation (like when important outcomes can’t be traced back to planned activities). The narrative description of your project’s goals, objectives, deliverables, and activities should match the logic model.

Be skeptical. Project planning and logic model development typically happen from an optimistic point of view. (“If we build it, they will come.”) When creating your work plan, step back from time to time and ask yourself and your colleagues, What obstacles might we face? What could really mess things up? Where are the opportunities for failure? And perhaps most important, ask, Is this really the best solution to the need we’re trying to address? Identify your plan’s weaknesses and build in safeguards against those threats. I’m all for an optimistic outlook, but proposal reviewers won’t be wearing rose-colored glasses when they critique your proposal and compare it with others written by smart people with great ideas, just like you. Be your own worst critic and your proposal will be stronger for it.

Evaluative thinking doesn’t replace specialized training in evaluation. But even the best evaluator and most rigorous evaluation plan cannot compensate for a disheveled, poorly crafted project plan. Give your proposal a competitive edge by applying your critical thinking skills and infusing an evaluative perspective throughout your project description.

* This blog is a reprint of an article from an EvaluATE newsletter published in summer 2015.

2 dictionary.com

Blog: Thinking Critically about Critical Thinking Assessment

Posted on October 31, 2017 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Vera Beletzan
Senior Special Advisor Essential Skills
Humber College
Paula Gouveia
Dean, School of Liberal Arts and Sciences
Humber College

Humber College, as part of a learning outcomes assessment consortium funded by the Higher Education Quality Council of Ontario (HEQCO), has developed an assessment tool to measure student gains in critical thinking (CT) as expressed through written communication (WC).

In Phase 1 of this project, a cross-disciplinary team of faculty and staff researched and developed a tool to assess students’ CT skills through written coursework. The tool was tested for usability by a variety of faculty and in a variety of learning contexts. Based on this pilot, we revised the tool to focus on two CT dimensions, comprehension and integration of the writer’s ideas, encompassing six variables: interpretation, analysis, evaluation, inference, explanation, and self-regulation.

In Phase 2, our key questions were:

  1. What is the validity and reliability of the assessment tool?
  2. Where do students experience greater levels of CT skill achievement?
  3. Are students making gains in learning CT skills over time?
  4. What is the usability and scalability of the tool?

To answer the first question, we examined the inter-rater reliability of the tool and compared CTWC assessment scores with students’ final grades. We conducted a cross-sectional analysis, comparing diverse CT and WC learning experiences in three contexts: our mandatory semester I and II cross-college writing courses, where CTWC skills are taught explicitly and reinforced as course learning outcomes; vocationally oriented courses in police foundations, where the skills are implicitly embedded as deemed essential by industry; and a critical thinking course in our general arts and sciences programs, where CT is taught as content knowledge.
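For readers curious what an inter-rater reliability check looks like in practice, here is a minimal sketch using Cohen’s kappa, one common agreement statistic for two raters scoring the same work. The rubric scale and scores below are invented for illustration; this is not the statistic or data from the Humber study, whose detailed methods appear in the HEQCO report.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired scores"
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's score frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two raters scoring ten written assignments on a hypothetical 1-4 rubric
rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
print(round(cohens_kappa(rater_a, rater_b), 2))  # prints 0.71
```

A kappa in this range is usually read as substantial agreement; values near 0 mean the raters agree no more often than chance, which would signal that the rubric needs clearer descriptors or rater training.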

We also performed a longitudinal analysis by assessing CTWC gains in a cohort of students across two semesters in their mandatory writing courses.

Overall, our tests showed positive results for reliability and validity. Our cross-sectional analysis showed the greatest CT gains in courses where the skill is explicitly taught. Our longitudinal analysis showed only modest gains, indicating that a two-semester span is insufficient for significant improvement to occur.

In terms of usability, faculty agreed that the revised tool was straightforward and easy to apply. However, there was less agreement on the tool’s meaningfulness to students, indicating that further research needs to include student feedback.

Lessons learned:

  • Build faculty buy-in at the outset and recognize workload issues
  • Ensure project team members are qualified
  • For scalability, align project with other institutional priorities

Recommendations:

  • Teach CT explicitly and consistently, as a skill, and over time
  • Strategically position courses where CT is taught explicitly throughout a program for maximum reinforcement
  • Assess and provide feedback on students’ skills at regular intervals
  • Implement faculty training to build a common understanding of the importance of essential skills and their assessment
  • For the tool to be meaningful, students must understand which skills are being assessed and why

Our project will inform Humber’s new Essential Skills Strategy, which includes the development of an institutional learning outcomes framework and assessment process.

A detailed report, including our assessment tool, will be available through HEQCO in the near future. For further information, please contact the authors: vera.beletzan@humber.ca or paula.gouveia@humber.ca.
