Blog: What do MOOC learners like and dislike?

Posted on March 7, 2018 by Gaurav Nanda, Kerrie Douglas in Blog
Kerrie Douglas
Assistant Professor, Purdue University
Gaurav Nanda
Postdoctoral Research Assistant, Purdue University

Massive open online course (MOOC) learners vary widely in background and learning objectives, and MOOC providers are still working to understand their learners and to offer content better suited to their needs. To understand the likes and dislikes of MOOC learners at a general level, we partnered with FutureLearn to study responses to post-course survey questions from over 800 courses on its popular social learning platform. We analyzed open-ended questions that asked learners about their most and least favorite parts of a course and their ideas for course improvement.

We used a Latent Dirichlet Allocation (LDA) topic model, a widely used statistical approach for exploratory analysis of large textual data sets, to identify the underlying topics in the responses to each question. An LDA model represents each topic as a collection of closely related words and each document as a mixture of topics with associated weights. We assessed the identified topics qualitatively for coherence and relevance. Some of the main themes that emerged were:
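The modeling step described above can be sketched with scikit-learn's LDA implementation (the study used MALLET; the toy corpus, topic count, and variable names below are illustrative stand-ins, not the study's data or settings):

```python
# Minimal LDA sketch: each document becomes a mixture of topics with weights.
# NOTE: the responses below are invented examples, not FutureLearn data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

responses = [
    "the lecture videos were clear and well structured",
    "videos by expert instructors helped me learn faster",
    "i enjoyed the discussion forums and peer interaction",
    "more quizzes and assignments would help me retain information",
    "discussion with peers and instructors was the best part",
    "the quizzes helped validate my understanding of the material",
]

# LDA operates on raw term counts (bag of words).
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(responses)

# Fit a small model; the number of topics is a modeling choice.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(counts)

# Each row is one document's topic-mixture weights; rows sum to 1.
print(doc_topics.shape)
print(doc_topics[0])
```

In practice the topic count is tuned by refitting with different values and judging which solution yields the most coherent, interpretable topics.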

Lecture videos: Many learners mentioned videos as their favorite part of the course. They said that videos helped them understand the content better and more quickly because the information was presented by expert instructors in a clear, structured manner. Some learners suggested using real-life examples in lectures and providing subtitles in multiple languages.

Social interaction: Many learners greatly enjoyed interacting with peers and instructors on discussion forums and wanted more avenues for interaction, such as live sessions and chat messengers. Some learners, however, did not enjoy online discussions, citing a lack of time or the limitations of the medium, which they said led to their ideas being misinterpreted.

Evaluations: Many learners wanted more quizzes and assignments, as these helped them consolidate and retain information and validate their understanding. While many learners were fine with peers evaluating their work, some would have preferred that their work be evaluated by teaching assistants.

Access to learning material: Many learners highly valued easy access to learning material for the course and for further study. Some learners suggested uploading access-restricted journal articles and providing more specific learning material.

Time commitment: Many learners wanted a reasonable estimate of the required time commitment at the beginning of the course, as they felt they spent more time than expected. They suggested that the syllabus clearly state the prerequisites and the list and depth of topics to be covered, so that they could budget their time accordingly.

Lesson Learned

Overall, we gained meaningful insights into the likes and dislikes of MOOC learners, which can help MOOC providers design their courses better. We found topic modeling to be an efficient method for exploratory analysis of large, previously unexamined textual data, such as discussion forums and open-ended responses. However, it is important to note that the topics generated by the model are not always interpretable, so qualitative assessment of the topics is necessary. MALLET is an easy-to-use toolkit for getting started with topic modeling.

This work was made possible by the National Science Foundation (NSF) (PRIME #1544259). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NSF.

 

Tagged: MOOC Feedback, social learning, topic modelling

EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.