Java is, by most measures, the most popular programming language in the world. Despite this, many persistent myths about the language endure, even among developers who use Java daily.
Since its debut 25 years ago, Java has been seen by some as something less than a "real" programming language. Whether because of Java's genesis as a language designed to program set-top boxes, or because Java's founders had the temerity to promise "write once, run anywhere" functionality for the language, there have been detractors from the beginning.
Yet Java has become popular both as a teaching language in universities and as a general-purpose programming language in many companies. Its popularity is why it's worth examining some of the myths surrounding Java. Whether a myth tends toward the positive or the negative, relying on magical thinking can leave you unprepared for situations in the real world.
Let's look at nine myths that can keep developers from making full use of Java.
Once you've finished reviewing these, I'd love to know what you think about Java as a programming language. Are there other myths you've seen or heard about the language? Do you use it as one of your main development languages? Are you in the camp that sees Java as unfit for true enterprise programming? Let me know what you think about Java -- and the myths -- in our comments section below.