Last week was Gartner’s annual symposium and IT Expo. For those of you not in IT (information technology), this is a mutual admiration society of professionals in the field and their enterprise customers — the people who purchase technology research from Gartner. Although the IT market has been lousy, Gartner still retains its influence, and quite often it can make things happen simply by predicting them. (In fact, that’s one of the reasons its clients hire the firm to do research). In the past, the Gartner seal of approval has provided good butt cover for the CIO or CTO looking to buy some new gear or deploy expensive software.
In some cases, Gartner’s predictions are so obvious that one wonders who the audience is for them: perhaps there is a community of sleeping beauties out there who are unaware that over the next decade, Moore’s Law will continue to hold, bandwidth will become more cost-effective than computing, there will be a further consolidation of vendors, and most organizations will reap the benefits of applications that stretch across companies. Certainly *we* all knew that already. Who’s we? The people who make the information technology stuff; the people who buy the stuff; the people who write about the stuff. If we don’t know, we deserve to bring products to market that no one needs, pay too much for those software licenses, and draw the ire of our readers. The technology community is still very small (and uninteresting to the larger world outside) and probably doesn’t really need Gartner to make predictions at all. And the outside world is largely unaware of Gartner’s predictions.
However, there are some interesting social consequences that fall, either intended or unintended, out of those predictions. One is that we will finally reap the full benefits of technology through collaboration. Collaboration will make for more efficient supply chains, better customer service, greater transparency into earnings, and everything else its evangelists promise.
While the gains in productivity will undoubtedly increase profits (finally), the downside is that we won’t need all the workers we currently employ. Thus, alongside productivity improvements, Gartner sees a shrinking workforce. As in agriculture, technology will reach a point where system automation substantially lowers labor requirements. Gartner’s strategic planning assumption is that enterprises transformed by the Internet are 70 percent likely to have 10 percent fewer workers by 2005 and 60 percent likely to have 30 percent fewer workers by 2010. This has been predicted by futurists and fought by unions for the past fifty years, but it looks like it’s about to happen. IT workers will not be needed in the future at the levels they are needed today; they will have obsoleted themselves. I don’t think parents who are pushing their daughters toward computer science know this.
Following on that consequence comes the next one: there will be tremendous merger activity over the next few years, as customers collaborate, forcing their vendors to consolidate. The weakest companies in every space will have to sell out to their stronger competitors as the number of clients decreases and each potential client gets bigger.
A friend of mine has owned an online clearinghouse for mergers-and-acquisitions activity for the past few years. During the glory days of the Internet, when everyone dreamed of an IPO, M&A was looked down upon. However, because most entrepreneurs in the future will probably exit through a sale or merger rather than through an IPO, and because collaboration and productivity gains will force mergers, it’s now probably a good idea to invest in a subscription to The Tech DealMaker at www.webmergers.com and begin to understand how M&A works before you actually get there. (While WebMergers’ site is primarily devoted to research and information, it has just spun off a sister company to work the actual deal flow, which is expected to accelerate again shortly. This is an area in which I hope personally to become more involved.)
The last Gartner prediction that interests me is the one about bandwidth becoming less costly than computing. This means we can develop peer-to-peer networks and share unused cycles instead of buying supercomputers. So will we all be on grids, and will our unused computing cycles be employed to find cures for deadly diseases? You wish. More likely, we will be trading first-run movies and music in defiance of the current copyright laws.