
[Image: Bletchley Park]

It’s 70 years today since Colossus, the first programmable electronic computer, was used at Bletchley Park to crack messages sent between Hitler and his generals. It is thought these machines and their 2,400 valves shortened WW2 by up to two years! Still perhaps the most impressive impact of computer hardware in history. If you visit The National Museum of Computing, you can walk round all 5 tonnes of it. Colossus was kept Top Secret for 30 years because of the sensitivity of the work it did. Not surprising, then, that it’s only in the last 10-15 years or so that Colossus and Bletchley Park have started to get the recognition they deserve.

 

[Image: Colossus valves]

I suspect not much attention was paid to the aesthetic appearance of Colossus. But as computers got smaller and more powerful in the years that followed, valves were replaced by transistors and then microchips, to the point where the interesting shiny hardware bits were hidden totally from view – usually behind a cover known as a ‘bezel’, the first of which was invented by Dr Alfred von Bezel in the late 1960s*. Computer hardware has continued to become more powerful ever since, as Moore’s Law predicted.

 

Over recent years, innovation and differentiation in the computer systems industry have come mainly from software, with hardware becoming more and more commoditised. There are only a few disk drive, memory and CPU manufacturers in the world. Most computer systems companies (at least the successful ones) understand that the value they add lies in packaging these components in the optimum way, with unique software, through partners, to solve customer problems.

 

Much has been written on how ‘Software-Defined’ architectures change everything. As hardware components appear to become ever cheaper, many people are starting to question the value of integrated computer systems, or appliances – why not build or buy your own software and buy these commodity components direct from the manufacturers? Wouldn’t that save a lot of money? And add agility? The utopia is the cheapest ‘dumb’ hardware, with everything and anything made possible through software intelligence.

 

I recently bought a 2TB WD My Passport Ultra drive for £89. And it includes free ‘cloud back-up for my precious memories’. Not surprising, then, that people in organisations spending millions on data storage are looking seriously at what this could mean for the enterprise. VMware (with vSAN, part of vSphere), Microsoft (Storage Spaces, part of Windows Server) and several much smaller, VC-backed companies have started to talk about the concept of replacing storage appliances entirely with software capabilities, on the assumption that you buy your own hardware. NetApp also has a product in this space – Data ONTAP Edge – which runs in a virtual machine on a standard enterprise server.

 

However tempting this may be, the realities are somewhat different. These software-only solutions are probably a good bet for small businesses or remote sites, but for core data centre use they lack a number of critical capabilities: the ability to scale beyond a few tens of terabytes, the necessary performance, storage efficiency features (workable deduplication, compression and the like), support for multiple software stacks to avoid software vendor ‘lock-in’, and, most importantly, the ability to manage hardware availability efficiently. They also assume that IT departments have plenty of spare time to spend on integration, testing, scripting and so on. And that, rather than storing intelligently, it is acceptable to store ever more copies of data. Most CIOs I talk to want to spend less time building stuff, and more time delivering new services to the business. Less time and resource operating storage; more time exploiting data.
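To illustrate the kind of storage efficiency feature mentioned above: block-level deduplication boils down to content-addressed storage, where identical blocks are stored once and everything else just holds a reference to them. Here is a minimal, purely illustrative Python sketch of the idea (a toy model, not how any vendor actually implements it – real systems must deal with metadata, persistence, hash collisions and performance):

```python
import hashlib

class DedupStore:
    """Toy block-level deduplication: each unique block is stored once,
    keyed by its SHA-256 content hash; files are lists of block references."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}   # content hash -> block bytes (stored once)
        self.files = {}    # file name -> ordered list of block hashes

    def write(self, name, data):
        hashes = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            h = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(h, block)  # store only if not seen before
            hashes.append(h)
        self.files[name] = hashes

    def read(self, name):
        # Reassemble the file from its block references
        return b"".join(self.blocks[h] for h in self.files[name])

    def physical_bytes(self):
        # Space actually consumed, after deduplication
        return sum(len(b) for b in self.blocks.values())

store = DedupStore()
data = b"A" * 8192              # two identical 4 KB blocks
store.write("vm1.img", data)
store.write("vm2.img", data)    # a second copy of the same data
assert store.read("vm2.img") == data
print(store.physical_bytes())   # 16 KB logical, only 4096 bytes physical
```

Two full copies of the file cost only one block of physical storage – which is exactly why deduplication that actually works at scale matters so much more than the raw price per terabyte of the drives underneath.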

 

In summary, software is about as much use as a chocolate teapot without the right hardware to run it on. No doubt innovation and differentiation lie in software – NetApp spends 95% of our R&D efforts there. But disk drives have lots of moving parts and fail. Flash storage has no moving parts, but still fails, just in different ways. Companies like NetApp have 20+ years of experience building solutions to deal with this efficiently and at very large scale (a single NetApp customer now has more than 1,000,000 TB of storage to manage!). And we already outsource our hardware manufacturing: our unique Data ONTAP and SANtricity software is bought and sold mainly as optimised FAS, E-Series and EF-Series systems built, in the main, from ‘commodity’ components.

 

No doubt more choice is needed. A software-only solution will make sense for some people, and for those web-scale companies that can afford to build things their own way, but the vast majority will need packaged, price/performance-optimised storage appliances for many years to come. The focus should be the value storage software can bring you on top of this, whether you run it in your data centre or buy a service from the cloud. Uniquely, the best storage software can abstract pools of data from hardware and connect them across clouds to offer a whole new world of architecture possibilities for application owners. But you’re still going to need some hardware sitting in a data centre somewhere, probably with a shiny bezel or two, to run it on.

 

*This bit about Dr. Bezel is not true, but somehow the idea came up in conversation at the NetApp Coventry office Christmas lunch last year. I can’t remember why.
