Java is the COBOL of today.
It is a language often used in education (fortunately, nowadays Python is taking over there), but this just means that we have lots of “Java Programmers” with zero real-world experience. And the only thing worse than a bad programmer is a bad programmer who thinks (s)he is a good programmer.
Java wants to be object oriented, but in reality it is class oriented. Which is OK - if you accept it as such.
Object oriented programming was a great idea: keep the data structures and the logic close to each other, then use fancy things like inheritance to build more complex stuff. But the lack of multiple inheritance is a bit of an obstacle here. Although it stops inexperienced programmers from shooting their own feet off, it also makes it difficult to re-use code in an intuitive way. Other languages had solved this problem ages ago, but Java would not listen.
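To be fair, Java eventually offered a partial workaround: since Java 8, interfaces may carry default methods, so a class can inherit *behaviour* from several sources even though it can still extend only one class. A minimal sketch (class names are invented for illustration):

```java
// Java forbids extending two classes, but interfaces with default
// methods (Java 8+) let a class pick up behaviour from several sources.
interface Swimmer {
    default String swim() { return "swimming"; }
}

interface Flyer {
    default String fly() { return "flying"; }
}

// A class may implement any number of interfaces...
class Duck implements Swimmer, Flyer {
    // ...but it inherits behaviour only, not state: any shared fields
    // still have to live in a (single) base class or be duplicated.
}

class Demo {
    public static void main(String[] args) {
        Duck d = new Duck();
        System.out.println(d.swim() + ", " + d.fly());
    }
}
```

Note the limitation: default methods cannot carry fields, so this covers only a slice of what real multiple inheritance gives you.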
After object orientation, we were told to be hyped about Model-View-Controller: the Enterprise way of doing code (but the new buzzword was “beans”). Our beautiful objects were rudely cut into three parts, which always had trouble talking to each other and were nearly impossible to design in a way that reduced maintenance. The amount of repetition was both infuriating and therapeutic. It kept a lot of people busy with the increased complexity.
We were told it would be a case of write-once-run-anywhere. Except that it did not quite work out that way. Running it in a browser quickly turned out to be the stuff of nightmares: slow, ugly and under crazy restrictions. Users could abort programmes with the single most used button on the screen - the “Back” button - because we were trying to put an app in a browser where it was a bad fit. Even Adobe Flash was faster. Both ended up being security nightmares as well. Good riddance to them.
Write-once-run-anywhere meant that Java had to either re-invent everything or work to the lowest common denominator. Java could not make up its mind and chose to do both: the JVM became essentially an operating system in its own right (I’m simplifying here), while still exposing (for convenience) some of the underlying operating system primitives, which unwary programmers deftly used to ensure that the app could only run on one family of operating systems anyway. And if the programmers didn’t do it, then the build guys would make doubly sure by packaging the whole thing as a Windows .EXE file. Attempting to run the same code everywhere was hard, so nobody did - whilst upper management still believed in the concept. Or perhaps they didn’t care anyway.
As Java became popular (or took more mind share - not quite the same thing, especially since you now had to be an expert to do stuff the right way), we ended up with multiple versions of Java. Initially it was just the Java JDK with some sensible version numbers (though not always backward compatible). Then Sun went mad with the numbering: 1.5 became just “5”, 1.6 just “6”. Version number inflation was the name of the game. But the versions were neither 100% backward compatible nor 100% forward compatible - even if your app did not use any of the new features. So you had to direct your customers to install the “right” version of Java. Write-once-run-anywhere was now truly dead.
But all this is history, and only the old guys with beards reminisce about it. It is anecdotal.
To me, the worst thing Java did was to shape the way many developers were thinking.
They honestly thought programming was Java. They had never heard of Pascal, Ada, Prolog, Lisp, Haskell or OCaml. If they had read some history, they might have heard of fantastic beasts like C, C++, Perl, Fortran or even COBOL, but considered them scary and primitive (OK - COBOL was and remains awful). They never encountered assembly language or got anywhere near the actual hardware on which their code ran, so they had no appreciation of “Mechanical Sympathy”. For them, Java was the machine, and they simply could not understand how non-Java code could be faster, leaner or better.
Everything had to be a class. Factory methods were invented to patch over the places the abstractions could not cover. Many levels of abstraction were suddenly considered good - even when the added complexity brought no new functionality and made maintenance a nightmare. Inheritance (as restricted as it was) was nice, but it was a leaky abstraction, so you had to make sure any exception you threw was wrapped into another layer of abstraction, lest the signature of the method change and lead to chaos. Adding a single field still meant changing things across all layers of the stack - unless you drank the whole “inner platform” Kool-Aid and added complexity there instead. You had so many layers of abstraction that you needed diagrams to keep track of them.
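The exception-wrapping ritual looks roughly like this - all class names here are invented for illustration, but anyone who has worked on a layered Java codebase will recognise the shape. Each layer catches the checked exception from the layer below and wraps it in its own type, so that the lower layer never leaks into the method signature:

```java
import java.io.IOException;

// One exception type per layer, so signatures stay stable
// even when the layer below changes what it throws.
class DataAccessException extends Exception {
    DataAccessException(String msg, Throwable cause) { super(msg, cause); }
}

class ServiceException extends Exception {
    ServiceException(String msg, Throwable cause) { super(msg, cause); }
}

class Repository {
    String load() throws DataAccessException {
        try {
            // Stand-in for real I/O that can fail.
            throw new IOException("disk unavailable");
        } catch (IOException e) {
            // Wrap, so callers never see IOException in our signature.
            throw new DataAccessException("load failed", e);
        }
    }
}

class Service {
    String fetch() throws ServiceException {
        try {
            return new Repository().load();
        } catch (DataAccessException e) {
            // Wrap again - one layer, one exception type.
            throw new ServiceException("fetch failed", e);
        }
    }
}
```

The original error ends up buried two `getCause()` calls deep, and every new layer adds another wrapper class to maintain - which is exactly the busywork the paragraph above complains about.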
The code became an edifice in its own right - and far removed from the problem it was meant to solve. Even doing simple things required an impressive class hierarchy.
Basically: If your only tool was a hammer, you would end up treating everything like a nail. And these guys only knew Java. Or they believed in the existence of a “universal tool” - and believed it to be Java.
Guys: Don’t forget to take a step back sometimes. Lift your head off the keyboard (yes: I have noticed that many of you do not do 10-finger-typing. Barbarians). Try to look at things from above. Or at least a different angle. Question your own assumptions and opinions. You can always learn more.
… I feel better for that rant. Normal service may be resumed soon, but with a somewhat cynical and grumpy attitude.