A trick I use to predict the future is to identify ten-year-trends. These are really big trends that take place over a decade or more. At the moment, there are three ten-year-trends that interest me, but before I get into that, let me give some historical examples to clarify what I’m talking about.
In 1980, there were only an elite and nerdy few of us who had computers on our own desks. (I built my first computer in 1977, at age fourteen, but I am … different.) There were more people with some kind of remote terminal, but the majority of business people had no personal access to computers at all. Then, in 1981, IBM introduced the PC, and by 1991, ten years later, pretty much everybody in business had a computer on their desk. The ten-year-trend in IT during that decade was “put a computer in front of everybody’s face.”
During the 1990s, the ten-year-trend was “wire all of those computers into one big network.” At the beginning of the nineties, many computers weren’t networked at all, and networks that did exist were often small, connecting just a handful of computers in nearby offices. By 2000, the Internet was ubiquitous. IBM was running TV ads with nuns on camels in the middle of the desert sending e-mails to each other.
When I see lists of “this year’s top trends,” they often feel small and short-sighted to me. Any trend that only lasts a year is too insignificant to be important. It’s probably mostly over by the time you spot it. The really interesting trends, the ones that drive big change, last many years—a decade or more.
The three ten-year-trends that I see in IT today are: 1) Cloud/Outsourced Computing, 2) Server Virtualization, and 3) Flash Memory.
The big question behind cloud computing (or outsourced computing) is whether a company should build or expand its own data center, or whether it should access computing resources remotely, over the Internet. People are already using lots of cloud/outsourced computing today for things like blogging and web training, but I expect it to move up-market over time. This won’t happen overnight, but ten years from now I believe that many medium-sized businesses and a few large enterprises will have outsourced pretty much all of their computing infrastructure to the cloud. (The term “cloud computing” is confusing because so many definitions are floating around. I’ll give mine in an upcoming blog.)
For people who do build data centers (either because they ignore the cloud/outsourcing trend or because they provide cloud computing to others), server virtualization will radically change how those data centers are built. Server virtualization is at the center of the trend, but it pulls a boatload of other trends along in its wake, including blade farms, network virtualization, and (my favorite) storage virtualization. Data centers built in ten years will look radically different than they look today.
Flash memory is amazing stuff. It is the first technology in decades with the potential to create a pervasive new layer in the storage hierarchy. Many technologies over the years, like bubble memory, have been touted as “disk-drive killers,” and none of them ever panned out. But flash memory is emerging as an enormous force, and there is no question that storage systems will look radically different in ten years. I’m not predicting that flash memory will eliminate disks any time soon, certainly not in five years and probably not in ten, but I do think it will relegate disks to an increasingly tape-like role as flash absorbs more and more of the I/O-intensive loads that disks handle today.
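To make the “new layer in the storage hierarchy” idea concrete, here is a minimal sketch, in Python, of the kind of tiering I have in mind: a small, fast flash layer sitting in front of a large, slow disk layer, with hot blocks served from flash and cold blocks falling back to disk. The class name, sizes, and LRU policy are all my own illustrative assumptions, not any particular vendor’s design.

```python
# Illustrative sketch only: a two-tier store where a small "flash" layer
# absorbs repeated (hot) I/O and "disk" holds everything durably.
from collections import OrderedDict

class TieredStore:
    def __init__(self, flash_blocks):
        self.flash = OrderedDict()       # block id -> data, kept in LRU order
        self.flash_blocks = flash_blocks # capacity of the flash tier
        self.disk = {}                   # effectively unlimited, but slow
        self.flash_hits = 0
        self.disk_reads = 0

    def write(self, block, data):
        self.disk[block] = data          # write through to disk for durability
        self._cache(block, data)         # and keep a copy in flash

    def read(self, block):
        if block in self.flash:          # hot block: served from flash
            self.flash.move_to_end(block)
            self.flash_hits += 1
            return self.flash[block]
        self.disk_reads += 1             # cold block: fall back to disk
        data = self.disk[block]
        self._cache(block, data)         # promote it into flash
        return data

    def _cache(self, block, data):
        self.flash[block] = data
        self.flash.move_to_end(block)
        if len(self.flash) > self.flash_blocks:
            self.flash.popitem(last=False)  # evict the least-recently-used block
```

In this toy model, any workload with a hot working set smaller than the flash tier ends up doing almost all of its reads from flash, which is the sense in which flash “absorbs” the I/O-intensive load while disks quietly hold the bulk of the data.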
These are the three ten-year-trends that I see driving change in IT, but I’d love to hear from readers who think I’ve missed some, or who think the ones I’ve chosen won’t drive as much change as I expect.