I was in an all-hands call, which, apparently as usual for any company, was largely a waste of time and money. Anyway, as a prelude to hearing about some wonderful tooling/toolkits we should all standardize on, we see this chart:
The “chart” bothers me on so many levels. First, I’m not sure why Gartner feels that the average skill level of J2EE developers peaked in 2002. Why 2002? Is that when the smart, early adopters of J2EE started fleeing for the greener pastures of Python, PHP and Ruby? Or was the field swamped by “Sun Certified” J2EE folks who can memorize APIs but not actually solve problems? Secondly, by what measure are J2EE applications getting more complicated? Did the world get dramatically more complicated after 2002? If your architecture or platform (J2EE) is so complicated that you need to slather on more layers of “architecture” to fix it, isn’t there something fundamentally wrong in the first place? Complexity is not a virtue.
Lastly, as a developer thinking about skills that will be applicable to another job down the pike, why would I want to learn a proprietary, wizard-hand-held way of doing things when there are so many great open source solutions to the same problems that would be applicable at many more jobs down the road?
I agree, complexity is not a virtue, unless your contribution in the workplace is derived from the confusion you impart. However, simplicity has its perils as well (see the overly simplified chart above). I heard a few years ago that J2EE application development would grow increasingly complex, but I do not really understand what that means. If it is true, why is it specific to J2EE and not all application development? I can imagine someone purporting that all applications increase in complexity over time, but at the end of the day, specifically for J2EE, it is Java, right? Isn’t the complexity held in the hands of the architects and software engineers building these solutions?
I also think it is interesting that the average developer skill level drops at the same rate as it increased from 1999-2002. As far as I can tell, the chart is saying that there will be a huge gap between the people who are skilled (i.e. were part of the 2002 boom and continue to be talented developers) and the less skilled average folk. I mean, literally, the bottom half of the application development community falls below the average point on that line, while the highly skilled would hold level, or at worst slightly degrade as decay sets in and younger piranhas move in. Does that mean the highly skilled folks can look forward to getting paid more? On another note, if skill level drops, the North American development community has no incentive to partake; it becomes the perfect level of work to outsource to technically competent but developing nations (e.g. India, China, Vietnam, Brazil).
I do not see how the technologies at hand (IDEs, languages, frameworks, etc.) have any bearing on the general need for talented, highly skilled architects and developers. Maybe the point is that the developer world is being flooded with less skilled, less aspiring individuals?! Would anyone argue against the idea that a developer who started with a more complicated language is better positioned, and often smarter, when working in simpler languages? I would take a Java coder turned Ruby expert over a Ruby-only guru any day.