Massive Open Online Courses, or MOOCs, have grown fast. In less than a decade they’ve gone from nonexistent to nearly ten thousand available courses. They have always been the subject of some controversy and have had their fair share of technical challenges, but the ever-increasing number of prestigious universities creating paths to actual degrees through MOOCs is a clear signal that the way the world learns is changing. There’s strong demand for more affordable, open, flexible education, and as online classroom technology improves, getting an online degree will increasingly be seen as a realistic alternative.
Where did they come from and where are they now?
Though the very first MOOCs started in 2008, the platforms that we know today (edX, Coursera, Udacity, FutureLearn, etc.) only came on the scene in 2012. Harvard and MIT were the driving forces behind edX, and Coursera came from two Stanford professors.
There was initial pessimism over high dropout rates, the ease of cheating, an uncertain business model, a lack of accreditation, and other perennial issues in online education, but it was outweighed by the number of students willing to give free online courses a shot. As of 2017, the MOOC platforms combined have:
- 81 million students
- Over 800 participating universities
- 9,400 courses
- 500 MOOC-based credentials
And it’s gone way beyond the Ivy League. The Chinese-language XuetangX is the third-largest platform by enrollment, and locally organized MOOCs are appearing from Thailand to Spain. Employers like Microsoft and IBM have started their own programs, and you can earn anything from course credit to a master’s degree. What’s behind all this?
In the U.S., university costs have risen 161% (adjusted for inflation) since 1987 and often require going into debt. That makes low-cost online education an attractive proposition for a generation of digital natives with uncertain employment prospects. Low-cost, high-volume MOOC-based degrees are popping up everywhere, especially in tech-heavy fields like computer science, analytics, and cybersecurity.
The vast majority of MOOCs don’t have any application process at all. You sign up for the course, pay for a certificate if you want to (it’s free just to audit), and if you do well in the course, you pass.
This means that people from all over the world, regardless of background, don’t have to jump through many formal hoops to get a credential. It also means that it’s easier to do things like explore career changes, learn recreationally, or just have access to some of the world’s smartest people.
Some universities are even using this as part of their admissions process: get good grades in their MicroMasters program, and you’ll have a much better chance of getting into their traditional program.
One reason that good universities are expensive and difficult to get into is the large imbalance between demand and supply. Stanford admits only about five percent of the tens of thousands of applicants it receives each year, and Harvard and MIT have similar numbers, yet their MOOC courses have reached millions. Stuffing more people into a finite amount of classroom space is difficult, but as the technology improves and the system is refined, there is no hard upper limit on the number of people who can have a productive experience in a MOOC.
MOOCs don’t require you to drop everything and start studying; you can be as part-time or full-time as you wish. Many courses are self-paced or have frequent start dates, allowing students to take breaks when they need to and customize their course load based on what they can handle. Providers also benefit from the platform’s flexibility in a different way: they can adjust their courses on the fly, improving them over multiple iterations or updating them as new ideas hit the market.
The bad stuff
Of course, like all technologies, MOOCs have a dark side. They’re impersonal, don’t foster bonds between students and teachers, and lend themselves better to multiple-choice and mathematical answers than to projects and papers. Dropout rates are high, and depending on the courses you take, your credentials may be more or less attractive to employers or future educators. It’s easy to learn and practice skills through MOOCs, but it’s harder to dive deep into difficult social issues and participate in a cohesive learning experience. That’s probably why the vast majority of credentials currently offered are oriented toward technical skills.
The Future of MOOCs
In 2012 MOOCs were a nice idea that mostly caught on with people who enjoyed learning things. In 2018 MOOCs can realistically help you get an education or change careers, though they’re still best for highly technical fields. In 2024 it would be surprising if MOOCs weren’t an even larger part of the educational landscape.
Will they kill the traditional university? Probably not. There’s still a noticeable benefit to in-person instruction and social learning that MOOCs haven’t been able to replicate yet. They are far better positioned to experiment with new technologies and processes, though: AI could be used to create personalized learning tracks, virtual reality could improve the social learning experience, blockchains could store educational credentials, et cetera. However they turn out, MOOCs will certainly be a much-needed injection of innovation into the relatively conservative education industry.
Image credit: MOOC Poster