11 Programming Languages That Lost Their Mojo
Programming languages come and go. Here are 11 that have given up the spotlight to more modern options.
If you've been in the IT world for more than about five years, then you've had a chance to see programming languages come and go. New languages leap into the market (Hello, Swift) and others slowly fade into the distance (MUMPS, we knew ye well).
While most of the world is programming in one of a handful of languages -- such as C++, Java, and C# -- many of us have experience in other languages, as well. In order to fully understand the benefits and drawbacks of today's development tools, it can be useful to look back at the languages that have come before.
OK, let's admit it, it can be fun, too.
Let me begin by saying that the 11 languages listed here are a fairly arbitrary selection of the possibilities. There are more than 100 contenders, but I was looking for languages that had seen at least a level of popular acceptance and wide use for one purpose or another. I'm also not saying that there's anything inherently wrong with any of these languages, or the people who loved and used them. (In fairness, I'll heavily imply that there's something wrong with the people who loved one of these languages, but we'll deal with that when we get there.)
[ Not dead yet? Read Fortran: 7 Reasons Why It’s Not Dead. ]
If you used one (or more) of these languages I'd like to hear from you in the comments. And if you're still using one (or more) of these languages I'd really like to hear from you. So let's start walking through the Land of Lost Languages (in alphabetical order) to see just who remembers what.
By the mid-1970s there were many different programming languages used for many different purposes. The US Department of Defense found this situation confusing and potentially dangerous, so they commissioned a new language -- one language to rule (and program) them all. The language was named after Ada Lovelace, the first computer programmer, and it really was a "do everything" language by design.
Wikipedia pretty much gets it right when it says, "Ada is a structured, statically typed, imperative, wide-spectrum, and object-oriented high-level computer programming language." The problem is, it was designed to do so much that it resulted in big, complex compilers for a great whopping huge language. The DoD mandated Ada's use in 1991 (though exceptions were frequently granted) and by 1997 the mandate had ended.
The end of the DoD mandate didn't really mean the end of Ada, though. Because it's very good at producing very reliable code (due to error-checking required in the compilers), you still find Ada in use for medical and some critical systems programming. It seems destined, though, to gradually fade away in favor of other, less ambitious languages.
When you start talking about ancient computer languages, it's tough to go back much farther than ALGOL. We're talking the trilobite of programming here. ALGOL ended up on the scene at roughly the same time as FORTRAN and COBOL, but its primary legacy to programming is through the languages it spawned, rather than as a heavyweight in its own right.
If you learned to program using Pascal, or succumbed to the do-everything allure of PL/1, then you have a sense of the features Algol brought to programming. Among the linguistic features we owe to ALGOL are code blocks (usually set off by "begin/end" instructions) and nested code.
The ALGOL that everyone used as the basis for other languages was ALGOL 60. That gives you a good idea of when major development came to an end. If you remember programming in ALGOL, then you've achieved "OG" status in the software world. If you don't, then enjoy the branches on ALGOL's extensive family tree.
If you want to start a religious war among a bunch of programmers, lob APL into the discussion and walk away. Remember in the introduction, when I said that I wouldn't be throwing shade at people who used any particular language? Welcome to the exception.
APL was designed to make it relatively simple to turn complex mathematics into a program. How math-oriented is the language? Let me put it this way: A three-dimensional array is a basic data unit.
In order to perform all of the mathematical gymnastics in a single line (or maybe two, if it's really complicated) APL calls upon a wide variety of symbols. The result is a block of code that is absolutely unintelligible to anyone not deeply into APL. The language's fans bristle at its description as a "write-only language," but it is in many ways one of the more difficult languages to understand.
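APL's glyphs don't reproduce well outside the language, but the array-at-once style can be sketched in Python. The snippet below is an illustrative analogue, not APL itself: it mirrors the APL expression `+/ v × v` (sum of squares), where `×` multiplies whole arrays elementwise and `+/` is a plus-reduction.

```python
# A rough Python analogue of the APL one-liner  +/ v × v
# (sum of squares). APL would do both steps in a single expression
# on the whole array at once.
v = [1, 2, 3, 4]
squares = [x * x for x in v]   # APL: v × v   (elementwise multiply)
total = sum(squares)           # APL: +/ ...  (plus-reduction)
print(total)  # → 30
```

In real APL, both operations apply directly to arrays of any rank, which is why a three-dimensional array really is a basic data unit.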
You can still find APL programmers, generally in physics or math departments where very complicated things are being done.
So far, all the languages we've been talking about have come out of the commercial or scientific development worlds. Forth is different. Forth was developed as a language to be used for embedded control programming: Programming that results in applications that are small, efficient, and very reliable in terms of both outcome and execution time.
Forth is a stack-oriented language. If you think that the Reverse-Polish Notation HP used in its classic calculators is still the best way to get results from a handheld calculator, then you're 90% of the way to knowing how to program in Forth. If you pick up an HP calculator and wonder where the "=" sign went, then you'll have a steeper learning curve to climb.
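The RPN idea at Forth's core is easy to demonstrate. Here's a minimal Python sketch of a stack-based evaluator (the function name `rpn_eval` is mine, and real Forth has a far richer word set), showing how operands get pushed and operators pop their arguments:

```python
# A tiny RPN evaluator illustrating the stack discipline Forth uses:
# numbers are pushed onto the stack, operators pop their operands
# and push the result back.
def rpn_eval(tokens):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # top of stack is the SECOND operand
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# "3 4 + 2 *" is RPN (and Forth) for (3 + 4) * 2
print(rpn_eval("3 4 + 2 *".split()))  # → 14.0
```

Note there's no "=" key and no parentheses anywhere -- exactly the mental model the HP calculators demanded.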
Today, you're most likely to encounter Forth in the embedded world, or within a boot loader project that's part of an operating system build. Outside those areas, Forth is well down on the list of active languages and more remembered than used.
When early artificial intelligence researchers needed a programming language, they turned to a List Processor -- LISP. Developed in the late 1950s, LISP grew due to the strength of one of its basic data structures, the linked list. Linked lists turned out to be very effective mechanisms for dealing with multiple data types in a single structure. In some ways, LISP foreshadowed concepts that came to be used in big data, but in very small, compact systems.
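The cons cell behind those linked lists is simple enough to sketch in a few lines of Python. This is an illustration of the concept (the helper names `cons` and `to_pylist` are mine), showing how one list structure happily mixes data types:

```python
# LISP's fundamental structure is the cons cell: a pair holding a
# value and the rest of the list. NIL marks the end of a list.
NIL = None

def cons(head, tail):
    return (head, tail)

def to_pylist(cell):
    """Walk the cons cells and collect the values."""
    out = []
    while cell is not NIL:
        head, cell = cell
        out.append(head)
    return out

# The LISP list (42 "answer" 3.14) -- three types in one structure
lst = cons(42, cons("answer", cons(3.14, NIL)))
print(to_pylist(lst))  # → [42, 'answer', 3.14]
```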
LISP spawned a number of successors, with Scheme the most prominent. After years of steady decline, LISP has seen a resurgence of late as researchers look for ways to add intelligent features to smaller systems. Outside the AI world, LISP hasn't prospered -- and if you ever saw the debugging output from an IBM mainframe implementation of early LISP variants, you'd understand why. Without indents, the long run of ")" characters at the end of the listing can be several pages of despair.
Have you used GNU LISP or another of its open source variants? Are you still using one of the few major commercial systems on the market? Let us know your LISP experience -- with or without lots of parentheses.
Have you ever programmed a turtle? If so, then you're probably familiar with Logo, a programming language designed to teach coding concepts. Created in the late 1960s, Logo is formally a dialect of LISP. It has a somewhat confusing history because so many people love turtles, but "true" Logo was the entry point into programming for at least one generation of software developers.
Seymour Papert was the primary father of Logo. He originally wanted his language to teach people how to write LISP programs, but he realized that many people responded to a physical representation of the program they had written. When they could watch a "turtle" travel around a screen, they understood the impact of what they wrote.
The confusion came about because there were many languages that used "turtles" and called themselves something like "Logo" while having nothing to do with the real Logo language. As a result, this logic-oriented language is known by reputation as much as by syntax. If you can find an implementation, though, it's still a great way to teach programming concepts to the young. After all, who doesn't love being able to tell a turtle what to do?
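The turtle idea itself is worth a sketch. Below is a display-free Python simulation of Logo turtle state (class and method names are mine; real Logo draws on screen) running the classic square, `REPEAT 4 [FORWARD 100 RIGHT 90]`:

```python
import math

# A minimal, display-free sketch of Logo turtle state: FORWARD moves
# the turtle along its current heading, RIGHT rotates it clockwise.
class Turtle:
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # start facing east

    def forward(self, dist):
        rad = math.radians(self.heading)
        self.x += dist * math.cos(rad)
        self.y += dist * math.sin(rad)

    def right(self, degrees):
        self.heading -= degrees

# Logo's classic square: REPEAT 4 [FORWARD 100 RIGHT 90]
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.right(90)
print(round(t.x), round(t.y))  # → 0 0  (back where it started)
```

Four sides, four turns, and the turtle is home again -- the kind of immediate geometric feedback that made the language stick with young programmers.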
Niklaus Wirth developed Pascal (more about that later) as a teaching language, but ultimately came to feel that it didn't have all the features a teaching language needed. Rather than endlessly extend and change the language, he developed a new set of languages, Modula and Modula-2.
The most significant and visible change brought by Modula and Modula-2 was the idea of the module, a block of code that can be made more or less visible to the rest of the program. (It's controlled by the programmer.) This "scope limitation" allowed for code that could do all sorts of interesting things on its own, then return results to the main program without spilling all the messy details.
While Modula-2 had features that made it a better candidate than Pascal for commercial programming, it never developed the following of its older sibling. It's not gone, but it was almost forgotten before it was truly born.
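That scope limitation can be roughly illustrated in Python with a closure (the names here are mine, invented for the example): the "module body" keeps its state private and exports only the procedures it chooses.

```python
# A rough analogue of a Modula-2 module: internal state stays hidden,
# and only the explicitly exported procedure is visible outside.
def make_counter_module():
    count = 0                    # hidden state -- invisible to callers

    def increment():             # the one "exported" procedure
        nonlocal count
        count += 1
        return count

    return {"increment": increment}   # only this escapes the module

counter = make_counter_module()
counter["increment"]()
print(counter["increment"]())  # → 2
```

The caller can use the result without ever touching, or even seeing, the messy details inside.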
Before Niklaus Wirth developed Modula and Modula-2, the Swiss computer scientist created Pascal. With Pascal, he brought into the world a language designed for teaching the concepts of structured programming. From the mid-1970s until the mid-1980s it did just that. Along the way, it also became the language that would play a major role in defining programming on the personal computer.
In the mid-1980s, if you were programming on an IBM PC there's a good chance you were either using Microsoft BASIC or Borland's Turbo Pascal. Turbo Pascal was a powerhouse, and different forms of Pascal were used for everything from building the software for the Apple Lisa to writing Donald Knuth's TeX typesetting system.
As structured programming declined in favor of object-oriented programming, Pascal's fortunes declined as well. You can still find Pascal diehards, but like the French philosopher for whom it was named, Pascal's glory days seem solidly in the past.
In the 1960s, if you were writing business code you picked up COBOL. If science and engineering were your fields, then ALGOL and FORTRAN were your tools. There wasn't a language that could "do it all." That is, until PL/I came along. (This is, by the way, read as "PL-One" -- the "I" is the Roman numeral "1".)
IBM developed a set of goals for the language that involved improving on FORTRAN's numeric capabilities, and then added enhancements to COBOL's string and business-process abilities. The result was a big structured language that could be used for anything from control systems, to accounting, to scientific analysis. This was wrapped up in a compiler that became an early benchmark in the possibilities of code optimization.
You can still get a PL/I compiler, but the language never caught on the way IBM hoped. If you've got your heart set on programming in a huge, complex, procedural language, then you can choose between PL/I and Ada. If you find yourself doing this on any regular basis, please let me know in the comments section below.
IBM's mid-range business systems, from the IBM 1401 through the current IBM Power i Platform, have been called the most successful commercial computing systems ever. The vast majority of applications written on those machines came through a language developed to run efficiently using punch cards: RPG.
The Report Program Generator was a contemporary of FORTRAN and COBOL, and yet it remained stubbornly rooted to one vendor and one platform. Among the reasons it was so often used was its close link to the database available on the platforms and the "loop" -- the ability to take a block of code and apply it sequentially to every record in that database.
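That implicit record loop is the heart of the RPG program cycle, and it's easy to sketch in Python. This is an illustrative analogue only (the record fields are invented for the example), showing the same block applied to every record to build report totals:

```python
# A rough analogue of RPG's program cycle: the runtime reads every
# record in the file and applies the same calculation block to each.
records = [
    {"customer": "ACME",   "amount": 120.0},
    {"customer": "Globex", "amount": 75.5},
    {"customer": "ACME",   "amount": 30.0},
]

totals = {}
for rec in records:   # the implicit per-record "loop"
    totals[rec["customer"]] = totals.get(rec["customer"], 0.0) + rec["amount"]

print(totals)  # → {'ACME': 150.0, 'Globex': 75.5}
```

In RPG the loop itself is supplied by the language; the programmer writes only the per-record calculations, which is exactly what made it so fast for report work.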
We're now up to RPG IV, and there are still folks out there writing RPG IV code every day. There was a time, though, when newspapers and industry publications were filled with ads seeking RPG programmers. That's no longer true. The same languages that have risen to the top on other platforms are present on Power i, and there are too many application-specific options to have a single, dominant language. I'd still love to know, though, if your professional life is in RPG. How modern do you think it has become?
Several of the languages on this list have been huge, do-everything languages of great complexity. Smalltalk is not among them. Smalltalk is the kind of language that results when computer scientists give themselves strict limits in which to work. Think of it as the haiku of programming languages.
Smalltalk was created by a number of computer science superstars working at Xerox PARC in the 1970s and early 1980s. It had only six reserved keywords and operated on a message-passing model. This was a form of object-oriented coding in which actions were messages passed to variables and other entities. It was the mechanism through which many learned the basics of object-oriented programming.
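Message passing can be loosely mimicked in Python via dynamic attribute lookup. The sketch below is an analogy, not Smalltalk (the `send` helper and `Counter` class are mine): a "message" -- a selector plus arguments -- is delivered to a receiver, which decides how to respond.

```python
# A rough analogue of Smalltalk message passing: computation happens
# by sending a message (selector + arguments) to a receiver object.
class Counter:
    def __init__(self):
        self.value = 0

    def increment_by(self, n):
        self.value += n
        return self.value

def send(receiver, selector, *args):
    # Look up the method named by the selector and deliver the message.
    return getattr(receiver, selector)(*args)

c = Counter()
send(c, "increment_by", 5)         # Smalltalk-ish: c incrementBy: 5
print(send(c, "increment_by", 2))  # → 7
```

In Smalltalk even control flow works this way: `ifTrue:` is a message sent to a Boolean object, which is part of what made the language such a clean teaching vehicle.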
When it came to creating commercial code, Smalltalk had a couple of major problems: It needed lots and lots of memory, and it didn't produce applications that were very fast. You'll still find it used as a teaching tool, but you have to look hard to find significant commercial use.
So that's my list of languages that, at the very least, aren't what they used to be. What do you think? Did I prematurely announce the end of a language? Are there other languages more deserving of a mention on this list? I look forward to your take on the languages that are well down the path to retirement.