Employers game the system and misrepresent the key market indicators.
Employers claim there is a severe shortage of IT workers in the United States. Listen in on any klatch of CIOs, and the conversation inevitably turns to their difficulties finding talent. Microsoft's Bill Gates, Intel's Craig Barrett, and other captains of the tech industry argue that the situation has reached crisis proportions.
But moving beyond anecdotal impressions and vested interests, the employment and economic data paint another picture--one in which the IT labor market is clearing and none of the indicators demonstrates a systemic shortage. While exceptional talent or skills in emerging technologies will always, by definition, be in short supply, the most relevant market indicators--wages and employee risk--clearly show there's no broad-based scarcity of U.S. IT workers. In their zeal to enlist government help to expand the supply of tech workers through foreign guest worker programs, employers are misrepresenting IT labor market conditions.
A key indicator of tightness in any labor market is wages--more specifically, whether wages are rising much faster than the norm. IT worker wages grew by a modest 2.9% in constant-dollar terms from 2003 to 2005, according to Department of Labor data compiled by the Commission on Professionals in Science & Technology (CPST). That increase is indeed greater than the average 0.6% growth for all professional occupations, but the gains for IT workers were hardly robust and don't indicate any significant scarcity. More recently, we've seen some growth in wages for newly minted bachelor's degree computer scientists, according to the National Association of Colleges and Employers. Salaries for those entry-level jobs rose from $50,744 in 2006 to $53,051 in 2007, an increase of 4.5%. But those gains were almost completely gobbled up by inflation, which ran about 4.3% in 2007.
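For readers who want to check the arithmetic, a quick back-of-the-envelope calculation using the figures cited above shows how little of that 4.5% raise survives inflation (variable names are illustrative, and the 4.3% inflation figure is the approximate 2007 rate the column cites):

```python
# Entry-level computer science salaries (NACE figures cited above)
salary_2006 = 50_744.0
salary_2007 = 53_051.0
inflation_2007 = 0.043  # ~4.3% inflation in 2007, as cited above

# Nominal raise, then deflate by inflation to get the real gain
nominal_growth = salary_2007 / salary_2006 - 1
real_growth = (1 + nominal_growth) / (1 + inflation_2007) - 1

print(f"Nominal growth: {nominal_growth:.1%}")  # Nominal growth: 4.5%
print(f"Real growth:    {real_growth:.1%}")     # Real growth:    0.2%
```

In other words, the purchasing power of those entry-level salaries rose by roughly a quarter of a percentage point--consistent with the column's point that the gains were almost entirely eaten by inflation.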
Another factor in gauging the relative health of the IT job market is the level of risk employees face. As any investor will tell you, riskier investments must offer higher potential payoffs. It's no different with careers. While there are no formal measures of the risk and uncertainty of IT careers, both have clearly soared over the past few years. The train wreck of 2002-2004 in the IT labor market derailed the careers of many professionals; some tech pros haven't come back.
Modest wage gains don't balance the rise in employee risk. --Ron Hira, Rochester Institute of Technology, Economic Policy Institute
Meanwhile, employer norms have shifted radically. Long gone are the days when IBM never laid off a worker. Nowadays, companies don't think twice about shipping IT work overseas, bringing in lower-cost foreign workers to replace U.S. employees, or even asking American workers to train their replacements. Intel's Barrett writes op-ed pieces about the shortage of U.S. workers even as his own company carries out major layoffs, shedding 14% of its workforce over the past two years.
In addition, the risks of technological obsolescence and age discrimination are higher in IT than in other professions. How many physicians or pharmacists become obsolete at age 40? Put in this context, it's hard to believe that the very modest wage gains of the past few years balance the increases in IT employee risk.
The consequences of this new equilibrium play out most prominently in the career choices of those attending college. Enrollment of undergraduate computer science majors at major U.S. colleges and universities has plummeted by an astounding 40% over the past four years, according to a survey by the Computing Research Association. Many blame a lack of interest in the tech field among young people, or our failing K-12 education system. But the most likely explanation is that students, drawing on the array of information at their disposal, including advice from relatives in the field, have decided that IT isn't as attractive an option as it once was.