Recently, Nigel Poulton asked me to be a guest on his Technical Deep Dive podcast, to which I graciously agreed (some of you may know this podcast by its old name, Infosmack). The premise of the podcast was for me to explain some of the technical points of Data ONTAP 8, including why it took so long for us to release it, why clustering matters, and for that matter why Data ONTAP matters. All in all I thought Nigel and co-host Rick Vanover did a fair job asking questions and assessing my responses - you'll have to listen to the podcast for all the Q's and A's. I must admit, however, I was thrown off by a couple of Nigel's questions, in particular 1) Why have you stayed so long at NetApp? and 2) How are decisions made there? It's not that I couldn't answer the questions, it's just that I couldn't recall anyone ever asking them before. If the job of a podcast moderator is to dig into previously undisturbed soil, then Nigel and Rick did a great job as moderators.
In that light, I thought I'd talk a little more about my NetApp experience in this blog post. I joined NetApp 6 years ago as a product marketing manager (PMM). The main purpose of a PMM is to work with the product managers (PMs) for the products you are responsible for, and figure out how to promote those products so that people eventually buy them. The PM mostly works with engineers on things like roadmaps and bug fixes, while the PMM mostly works as the outside voice to the buying community. I was lucky to work with some really good PMs in those days; they invited me to their engineering meetings and patiently answered my never-ending stream of questions. This gave me great insight into the inner workings of both the technology and the company itself.
I was also lucky when I was given one of my first assignments - to promote NetApp deduplication, or A-SIS as it was known at the time. Back in 2006, no one really thought much about NetApp deduplication, including a lot of people working at NetApp. As I watched the other PMMs go about their business, I found that each had his or her own style, and there seemed to be no recipe for a good PMM at NetApp. So I decided to create my own style, that of an educator, as a way to convince people that our deduplication was worthy of their attention - mostly by explaining how it works and thus demystifying it. My effort started with a couple of whitepapers and some webcasts, which eventually led to this blog and several published articles. I founded an industry consortium of deduplication vendors at the Storage Networking Industry Association (SNIA) and briefed countless prospective customers both at NetApp HQ and on the road.
The thing I appreciated most about NetApp at that time was that the people around me kept encouraging me to stretch and to try new ways to reach my audience - some of which we knew would work and some we knew wouldn't. The trouble was, we didn't know which things would fall into which category until we actually tried them. I am grateful for the fact that I could be innovative in my marketing approach and no one ever said "we don't do things that way here." I am not sure every company would have given me the freedom that NetApp did, and I think this freedom still exists here for every PMM at NetApp.
Once dedupe became an inarguable success, in 2008 I was asked to focus my attention on NetApp Storage Efficiency. Working with many supportive PMMs and PMs, we developed a list of seven criteria that were required for true storage efficiency: Snapshots, RAID-DP, SATA Drives, Thin Provisioning, Thin Replication, Virtual Cloning, and of course Deduplication. Once we had this list, I fell back into my role of educator and produced material to help both internal and external folks understand how these features worked and why they were so important in reducing the storage footprint. And again I was allowed to extend the traditional boundaries of marketing, which this time included publishing the book "Evolution of the Storage Brain" - I wonder how many companies would have allowed me to undertake something so strenuous that it took me away from many of my normal day-to-day tasks? I can testify that NetApp offered some amazing support to me once again.
Now, in my role of Senior Technologist, I have a new and exciting challenge. With Data ONTAP 8, NetApp is once again changing the way people think about storage. An agile data infrastructure that is Intelligent, Immortal, and Infinite is within our grasp. Based on nine technical criteria, NetApp is the only storage provider with a singular architecture that supports the most diverse set of workloads in IT history. The ultimate in storage virtualization, Data ONTAP 8 allows you to start small and grow to petabytes without disruption and without constantly requiring new skill sets. For this effort, I'll again take a non-traditional path in educating people on the value of agility, confident in the knowledge that NetApp will be supportive of this direction.
So... over the next few months I'll be appearing on this blog to educate you on each of the nine data agility points, and why they are so important in today's world of data-driven businesses. I'll also blog on many other topics, such as the important work currently being done at UC San Diego to develop an Enterprise Data Taxonomy and Data Growth Index.