From tablets to Google Glass, education technology has a colorful history. Take a look back and a peek forward.
New-fangled educational devices didn't start with computers or the Internet.
Writing, arguably the first education "technology," debuted at the end of the Stone Age, in the late 4th millennium BC. Although we don't know exactly how cuneiform script was taught, we do know this pictographic system evolved over its many centuries of use, becoming simpler and more abstract until it was supplanted by alphabetic scripts in the early centuries AD.
Writing's long history reveals a defining human characteristic: the constant search for new ways to capture and distribute knowledge. This search has provoked cultural, political and religious controversies along the way. Take, for example, a recent survey of professors at top research universities who prefer old-fashioned teaching methods, such as whiteboards, to instructing via the Internet. In this sense, today's debates about what should or should not be taught, or about the impact of this or that technology on students, echo earlier fights that stretch back centuries.
Another unsettled debate swirling in academia today: the educational impact of remotely connected students. In the past few years, millions of students around the world have taken massive open online courses (MOOCs) offered by established post-secondary institutions and commercial startups. As MOOCs have become more popular, educators, parents and students have started asking legitimate questions about what is being gained and what is being lost when students never set foot on a campus or share a physical classroom with others.
Some believe technology -- specifically, data analytics -- will play an important role in answering these important questions. Projects such as the Predictive Analytics Reporting (PAR) Framework are collecting data across postsecondary institutions and studying educational outcomes, with the goal of revealing the best educational approaches.
Data capture and analysis also will be a key part of the Common Core State Standards initiative in the United States. The CCSS, which seeks to bring diverse state K-12 curricula into alignment, is expected to start formal assessment via computerized testing in the 2014–2015 school year.
Undoubtedly, some of these newest approaches will be transformational. Others will be discarded or evolve into something entirely new. Only time will tell.
Explore our slideshow to learn more about some of education's technological milestones through the ages, including the newest ones.
Johannes Gutenberg's printing press, a watershed technological advance in 1450, certainly helped accelerate literacy in Europe, and was a boon to the mission of the Catholic Church. But the press also enabled wider publication of books the Church deemed heretical during the Protestant Reformation. In 1559, Pope Paul IV ordered the first Index Librorum Prohibitorum, or Index of Prohibited Books. The index, issued 20 more times by different popes, was published for the last time in 1948.
The University of London was the first university to offer distance learning degrees, establishing its External Programme in 1858. This program, now known as the University of London International Programmes, collaborates with 12 colleges of the University of London and offers flexible and distance-learning degree programs worldwide.
The Scholastic Aptitude Test, introduced in 1926 by the College Board, was first scored automatically in 1936 by the IBM 805 test scoring machine, which used electrical current to detect marks made by special pencils. But it wasn't until the advent of optical scanners, able to detect marks made with the ubiquitous No. 2 pencil, that standardized college entrance exams became a rite of passage for millions of college-bound juniors and seniors in the United States.
Although an early form of videoconferencing was demonstrated by AT&T Corp. at the 1964 World's Fair, higher education didn't start using this technology -- dramatically improved by hardware and software advances, as well as Internet Protocol standardization -- until around 2003, when schools around the world began to integrate videoconferencing into their distance-learning programs.
The online virtual world Second Life, developed by Linden Lab and launched in 2003, might have failed in its goal of creating a general-purpose consumer platform, but it still holds a fascination for some educators. By one count, some 300 universities around the world teach courses or conduct research in SL. Adult English language instruction appears to be the leading educational application.
First released in April 2010, Apple's iPad has rapidly and substantially changed the way we use computers both inside and outside of schools. By some accounts, Apple's iPad sales to schools began outpacing PC sales in 2012. Not coincidentally, vendors of mobile device management (MDM) products are bringing their enterprise software tools to school administrators, who are eager for better ways to manage and secure these devices.
The first massive open online course (MOOC) is believed to have been the 2008 course "Connectivism and Connective Knowledge," created by George Siemens, then an associate director of research and development with the Learning Technologies Centre at the University of Manitoba, and Stephen Downes, an online learning and new media designer and commentator.
The attention focused on MOOCs intensified in 2011 and 2012, thanks to a handful of commercial offerings such as Coursera, Udacity, Advance Learning Interactive Systems Online (ALISON) and others sealing alliances with traditional academic institutions, accompanied by a surge of venture capital investment into the sector. Harvard and MIT also made a splash with edX, a not-for-profit MOOC provider that currently offers courses from 12 universities.
Massive open online courses (MOOCs) provide immense amounts of data -- data that can be parsed, compared, merged, modeled and analyzed, with the goal of improving educational outcomes.
Proponents say these systems will be able to detect struggling students sooner than traditional means, and trigger a teacher intervention or even make an automatic change in a lesson plan.
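To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of early-warning logic such systems describe. The metrics, weights and threshold below are hypothetical assumptions for illustration, not taken from any real MOOC platform or the PAR Framework:

```python
# Toy early-warning sketch: flag students whose recent activity
# suggests they may be struggling. All metrics and weights are
# arbitrary, hypothetical values chosen only for illustration.

def risk_score(logins_per_week, avg_quiz_score, assignments_missed):
    """Return a 0.0-1.0 estimate of risk; higher = more at risk."""
    score = 0.0
    if logins_per_week < 2:      # low engagement
        score += 0.4
    if avg_quiz_score < 60:      # weak performance
        score += 0.4
    score += 0.1 * assignments_missed
    return min(score, 1.0)

def flag_struggling(students, threshold=0.5):
    """Return names of students whose risk score meets the threshold,
    so an instructor could be alerted to intervene."""
    return [name for name, metrics in students.items()
            if risk_score(*metrics) >= threshold]

students = {
    # name: (logins_per_week, avg_quiz_score, assignments_missed)
    "alice": (5, 85, 0),
    "bob":   (1, 55, 3),
}
print(flag_struggling(students))  # prints ['bob']
```

A production system would replace these hand-set rules with models trained on historical outcome data, but the workflow -- compute engagement features, score risk, trigger an alert -- is the same in spirit.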
One reason predictive and prescriptive analytics have become hot topics? Declining college graduation rates in the United States. Almost half of the students who begin college at a two- or four-year institution fail to earn a degree within six years.
Google's Project Glass, a research and development program to develop an augmented reality head-mounted display (HMD), is already attracting the attention of educators, well ahead of the product's commercial release.
One highly cited article on the potential impact of Google Glass was written last fall by the staff of OnlineUniversities. Among other topics, the essay wonders how teachers -- who already deal with the downsides of other electronic gear in the classroom -- will keep students wearing such futuristic devices from becoming distracted.