Archive: online courses

Blog: Shift to Remote Online Work: Assets to Consider

Posted on July 22, 2020 in Blog

Principal Partner, Education Design, Inc.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m the principal partner of Education Design in Boston, focusing on STEM program evaluation. I first engaged in online instruction and design in 1994 with CU-SeeMe, a very early desktop videoconferencing app (without audio… that came in 1995!). While I’m certainly no expert in online learning, I’ve observed this newly accelerated shift toward virtual learning for several decades.

During 2020 we’ve seen nearly all of our personal and professional meetings converted to online interactions. In education this has been both challenging and illuminating. For decades, many in our field have planned and designed for the benefits online and digital learning might offer, often with predictive optimism. Clearly the future we anticipated is upon us.

Here, I want to identify some of the key assets and benefits of online and remote learning. I don’t intend to diminish the value of in-person human contact, but rather to help projects thrive in the current environment.

More Embrace than Rejection of Virtual

In nearly all our STEM learning projects, I’ve noticed far more embrace than rejection of virtual learning and socializing spaces.

In one project with partner colleges located in different states, online meetings and remote professional training were part of the original design. Funded in early 2020, the work has begun seamlessly, pandemic notwithstanding, owing to the colleges’ commitment to remote sharing and learning. These partners, leaders from a previous ATE project, will now become mentors for technical college partners, and that work will most likely be done remotely as well.

While forced to change approaches and learning modes, these partners haven’t just accepted remote interactions. Rather than focus on what is missing (site visits will not occur at this time), they’re actively seeking to understand the benefits and assets of connecting remotely.

“Your Zoom face is your presence”

Opportunities of the Online Context

  1. Videoconferencing presents some useful benefits: facial communication enables trust and human contact. Conversations flow more easily. Chat text boxes provide a platform for comments and freeform notes, and most platforms allow recording of sessions for later review. In larger meetings, group breakout functionality helps facilitate smaller sub-sessions.
  2. Online, sharing and retaining documents and artifacts becomes part of the conversation without depending on the in-person promise to “email it later.”
  3. There is an inherent scalability to online models, whether for instructional activities, such as complete courses or teaching examples, or for materials.
  4. It’s part of tomorrow’s landscape, pandemic or not. Online working, learning, and sharing have leapt forward out of necessity. It’s highly likely that when we return to a post-virus environment, many of the online shifts that have shown value and efficiency will remain in schools and the workforce, leading toward newer hybrid models. If you’re part of the development now, you’re better positioned for those changes.

Tip

As an evaluator, my single most helpful action has been to attend more meetings and events than originally planned, engaging with the team more, building the trust necessary to collect quality data. Your Zoom face is your presence.

Less Change than You’d Think

In most projects, some recalibration has been necessary, but you’d be surprised how few changes are required to continue your project work successfully in this new context; often a change of perspective is enough.

Blog: Surveying Learners in Open Online Technical Courses

Posted on July 22, 2015 in Blog

Assistant Professor, Engineering Education, Purdue University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I lead evaluation for nanoHUB-U, one of the educational opportunities provided through nanoHub.org. The concept of nanoHUB-U is to provide free access to interdisciplinary, highly technical topics of current significance in STEM research, particularly those related to nanotechnology. In fact, many of the recent nanoHUB-U course topics are so new that the information is not yet available in published textbooks. My job is to lead the effort to determine the merit of offering the courses and to provide usable information to improve future courses. Sounds great, right? So what’s the problem?

Open online technical courses are similar to a broader group of learning opportunities, frequently referred to as MOOCs (massive open online courses). However, technical courses are not intended for a massive audience. How many people on the globe really want to learn about nanophotonic modeling? One of the major challenges for evaluation in open contexts is that anyone in the world with an Internet connection can access the course, whether or not they intend to complete it, have the language proficiency to understand the instructor, or simply want to reference some of the materials. In short: we know little about who is coming into the course and why.

To reach evaluative conclusions, evaluators must first understand stakeholders’ reasons for offering the course and develop a deep understanding of the learners. Demographic questions must go well beyond the usual race/ethnicity and gender identity. For this blog, I’m focusing on the survey aspects of open online technical course evaluation.

Practical Tips:

  1. Design the survey collaboratively with the course instructors. Instructors are experts in the technical content and will know what type of background information is necessary to be successful in the course (e.g., Have you ever taken a differential equations course?)
  2. Design questions that target learners’ motivations, goals, and intentions for the course. Some examples include: How much time per week do you intend to spend on this course? How much of the course do you intend to complete? How concerned are you with grade outcomes? What do you hope to gain from this experience? Are you currently employed full-time or part-time, a full-time or part-time student, or unemployed?
  3. Embed the pre-survey in the first week’s course material. While not technically a true “pretest,” we have found this technique has resulted in a significantly higher response rate than the instructor emailing a link to the survey.
  4. Capture the outcomes of the group the course was designed for. The opinions of thousands may not be in alignment with who the stakeholders intended the course to serve. Design questions with a logic that targets the intended learner group by using deeper, open-ended questions (e.g., If this information has not been provided, where would you have learned this type of material?)
  5. Embed the post-survey in the last week’s course material. Again, our experience has been that this approach has generated a much higher response rate than emailing the course participants a link (even with multiple reminders). Most likely, those who take the post-survey are the learners who participated in most aspects of the course.
  6. Use the survey results to identify groups of learners within the course. It is very useful to compare learners’ stated intentions with their actual behavior in the course, as well as their pre- and post-survey responses. When interpreting the results, examine responses by group of learners rather than summing up overall course averages.
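To make tip 6 concrete, here is a minimal sketch of grouping learners by their stated intention before comparing behavior. All field names and values (e.g., "intended_completion", "modules_completed") are invented for illustration and are not part of the nanoHUB-U evaluation; adapt them to your own survey export and activity logs.

```python
# Sketch: segment learners by pre-survey intention, then summarize each
# group's actual behavior instead of pooling one course-wide average.
from collections import defaultdict
from statistics import mean

# Each record joins a pre-survey response to activity-log data (hypothetical)
responses = [
    {"intended_completion": "all",    "modules_completed": 8},
    {"intended_completion": "all",    "modules_completed": 7},
    {"intended_completion": "some",   "modules_completed": 3},
    {"intended_completion": "some",   "modules_completed": 5},
    {"intended_completion": "browse", "modules_completed": 1},
    {"intended_completion": "browse", "modules_completed": 0},
]

# Group completion counts by stated intention
groups = defaultdict(list)
for r in responses:
    groups[r["intended_completion"]].append(r["modules_completed"])

# Average modules completed, per intention group
summary = {intent: mean(done) for intent, done in groups.items()}
```

In this toy data, learners who intended to browse completed almost nothing, which is expected behavior for that group rather than a course failure; a single pooled average would hide that distinction.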

Surveys are one aspect of the evaluation design for nanoHUB-U. Evaluation in an open educational context requires much more contextualization, and traditional educational evaluation metrics should be adapted to provide the most useful, action-oriented information possible.