Despite various drawbacks, remote courses (both for credit and for pure learning) are becoming the norm. Understand the options.
If I were obliged to choose industries that are susceptible to significant disruption in the next few years, I would have to point to education as the most obvious and most important. In a generation and a half, education has gone from being an expense that most families bore manageably or with some difficulty to an extraordinary cost that can plunge students and their parents into deep, long-lasting debt. Rather than being the path of upward mobility that it was for generations, education has become the principal barrier between the wealthy and the rest of us.
Education costs have risen far faster than inflation and can be accommodated mostly by parents who begin saving toward the expense the day their child is born. The current model cannot continue along its present trajectory. It is ripe for disruption, particularly in the programming field where developers are always partially self-taught, and demonstrated skill -- rather than coursework completion -- is the defining hiring criterion.
Several forward-looking universities have embraced this upcoming transition to greater self-education and begun making it possible to earn college credit and degrees via remote study. Other universities, such as MIT and Stanford, have adopted open courseware, making it possible for students to audit classes via videos of the class sessions or, in some cases, by watching in real time. And for some classes, university credit can be obtained for this remote participation (graded homework and exams are part of the experience, of course).
I believe this model of remote courses taken for credit and paid for at much-lower tuition rates will in the next decade emerge as the default way of getting a college education. The traditional four-year on-campus experience will be viewed as a singular luxury. Computer science is likely to lead the way in this transition because, more than most disciplines, it does not require face-to-face communication (as would a degree in music performance, for example).
Prior to joining Dr. Dobb's Journal, Andrew Binstock worked as a technology analyst, as well as a columnist for SD Times, a reviewer for InfoWorld, and the editor of UNIX Review. Before that, he was a senior manager at Price Waterhouse. He began his career in software ...