Tim's Tales in NetApp Blogs


Attending Interop? Join NetApp, Verizon, Oracle, and Cloudera at the Verizon panel “The Enterprise Cloud Ecosystem”; more details at the end of the post.

 

At the core of any technology service is “abstraction.” Wikipedia describes an abstraction layer as “a way of hiding the implementation details of a particular set of functionality.” Abstraction occurs at all levels of technology; for example, virtualization is often described as “hardware abstraction,” where the hardware components are decoupled and presented in a logical, standardized form to the operating system and applications above.
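
To make the idea concrete, here is a minimal sketch in Python; the class and method names are hypothetical, not any particular product’s API. The caller programs against a single interface, while the implementation behind it, local disk or a cloud object store, can be swapped without touching the application code:

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """The abstraction layer: callers see one interface, never the details."""
    @abstractmethod
    def read(self, key: str) -> bytes: ...
    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

class LocalDiskBackend(StorageBackend):
    def read(self, key: str) -> bytes:
        with open(key, "rb") as f:     # implementation detail, hidden
            return f.read()
    def write(self, key: str, data: bytes) -> None:
        with open(key, "wb") as f:
            f.write(data)

class CloudBackend(StorageBackend):
    def __init__(self, client):
        self.client = client           # e.g. an object-store SDK client
    def read(self, key: str) -> bytes:
        return self.client.get(key)
    def write(self, key: str, data: bytes) -> None:
        self.client.put(key, data)

def save_report(store: StorageBackend, name: str, body: bytes) -> None:
    # Written once against the abstraction; swapping local disk for
    # cloud storage requires no change here.
    store.write(name, body)

save_report(LocalDiskBackend(), "report.bin", b"quarterly numbers")
```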

[Image: JetBlue LiveTV]

 

One example of an abstracted technology service occurs in the consumer world at 39,000 feet. JetBlue Airways’ LiveTV has enabled passengers to “Enjoy the Journey” by offering live television directly at the passenger’s seat. JetBlue, living up to its promise to provide superior service in every aspect of the customer’s air travel experience, has extended this to include high-speed Internet. When customers interact with and consume JetBlue’s LiveTV media and Internet experience, their abstracted view at 39,000 feet is the large touch-screen display at their seat. For the passenger, it is a single view onto a wide range of services: music, television, video-on-demand, and the Internet. That single, abstracted view is delivered and presented by hundreds of servers located in different geographic locations.

 

This combined order processing and content management system is powered by Verizon Terremark Enterprise Cloud services. I recommend reading more about the JetBlue and Verizon collaboration on the Verizon Terremark blog.

 

In the enterprise IT world, when it comes to cloud, organizations have a distinct advantage: choice. They are presented with a wealth of options, and cloud, being the convergence of many different layers of abstraction, is driving this availability of choice. The early 21st-century IT model is shaping up to be a hybrid of procurement and brokerage services. IT organizations can procure their own on-premises private cloud, consume private cloud through a hosted service provider, or use the vast range of cloud services (STaaS, IaaS, PaaS, SaaS) offered by service providers and the hyperscalers.

 

At NetApp, we are confident we have a best-of-breed storage platform in Data ONTAP, one that uniquely addresses the challenges cloud presents. We announced our partnership with Verizon to bring more choices and unique value to customers. As enterprises increasingly look to combine public and on-premises cloud resources to optimize IT efficiency and business flexibility, Data ONTAP provides a universal data management platform to facilitate data and workload movement across multicloud environments.

 

For more insight into how we are extending our storage operating system, Data ONTAP, directly into cloud infrastructures, Phil Brotherton, NetApp’s Vice President of the Cloud Solutions Group, has a blog post on how to be strategic when picking hybrid cloud partners. The post covers the technology enabling portable enterprise storage capabilities in the cloud, as well as the essential business models and partner ecosystems built through years of engineering and relationship development.

 

To learn more about the cloud, attend the Interop panel “The Enterprise Cloud Ecosystem,” with executives from NetApp, Verizon, Oracle, and Cloudera. The session will discuss how the enterprise cloud is being built, and the infrastructure and services required for the promise of the cloud to become a reality at enterprise scale. The panel takes place on Wednesday, April 2, 2014, from 4:00 to 4:50 p.m. in Mandalay Bay Meeting Room I, Mandalay Bay Convention Center, Las Vegas, Nevada.

Cloud is more a business-model shift than a technical shift. By business, I am not speaking of financials and the respective trickle-down savings and efficiencies, but of a seismic shift in how IT services are consumed, paid for, and sold. Part of this impact can already be seen on Wall Street, and not only through the usual suspects among server companies but, more interestingly, through the revenue patterns of any company selling software.



To illustrate this, I would like to reflect back on the proliferation of server virtualization in the 2000s. The technical impact was obvious and necessary: thanks to Moore’s Law we found ourselves with an oversupply of processing power, and no single application could be written to fully use it. Server virtualization was an excellent tool for driving up the utilization of capital assets such as servers, and with it came some incredible benefits. The level of abstraction from the hardware, and the ability to effect change in software without physically moving hardware, gave server administrators superhero abilities. Along with higher utilization rates, which made the bean counters happy, additional IT services became easier, faster, and cheaper. Disaster recovery was suddenly available to all applications, not just those running on multi-million-dollar machine ecosystems. Server deployment times dropped from weeks to hours. Most important of all, server virtualization did not significantly change the way infrastructure looked or was administered. It was similar to virtual tape libraries: even though no actual tapes were being written by the media server, the abstraction layer still provided the same look and feel of tape.

 

 

This is where virtualization and cloud diverge. Virtualization, while amazingly transformative and the foundation on which cloud is built, did not fundamentally change the way IT services are consumed, paid for, or sold.

 

Just as there was underutilization of processor capacity in servers, we are finding similar levels of underutilization in applications. Microsoft Office is a prime example. Don’t get me wrong, it’s an amazing product, but we should ask ourselves: how much of the product do we use versus what we pay for? Word is equipped with advanced publishing tools suited to best-selling authors, and Excel can compute complicated financial and mathematical problems that only a PhD in geophysics could dream up. This divide between what we get and pay for and what we actually use is called the consumption gap.

 

Looking at enterprise applications, we see a similar consumption gap. There has always been a fair amount of customization around enterprise applications covering logistics, CRM, or ERP; however, that customization came at a cost to the customer. To be fair, there are software-licensing tiers that cater to different business and budget needs, yet the customer, not the vendor, is always the one burdened with the risk of project success.

 

When deploying a new application in the traditional model, the customer needs to procure infrastructure to host it. Virtualization has done a lot to cushion the associated upfront and operational costs, but the software license for the application and its dependent auxiliary services remains an upfront, depreciated cost, with a service and support contract adding further cost over time.

 

What this creates is a model where the customer, not the vendor, is on the hook to realize the ROI of the application project. Whether the application is a success or not, the customer is left holding the bag. And let’s not forget the consumption gap between the features provided and the features actually used, which adds further strain on ROI.

 

Let’s take the example of Nuance, a company taking voice-to-text solutions to the next level. One of their solutions is aimed at doctors, who are freed from the burden of typing up patient notes: instead, they speak the notes into a small handheld device, which translates the speech to text in the cloud and places it into the respective patient record system. This is done through advanced speech recognition and self-learning artificial intelligence that adapts to each doctor’s unique pronunciation. Nuance recently made the shift from a licensed-software model to a subscription, pay-as-you-go model. Initially, as you can imagine, this translated into a significant hit to their quarterly revenues: no longer were they receiving a three-year upfront payment for a software license, but instead a smaller payment arriving each month. I personally believe this will benefit Nuance significantly in the long run, but a shift in business models puts Wall Street on edge, and the change is being reflected negatively in the short-term stock price.
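
To see why the switch stings in the short term even though total contract value is unchanged, here is a back-of-envelope sketch; the dollar amounts are invented for illustration and are not Nuance’s actual pricing:

```python
# Illustrative only: recognized revenue for one customer under a 3-year
# upfront license vs. an equivalent pay-as-you-go subscription.
license_upfront = 36_000      # hypothetical license fee, booked in Q1
subscription_monthly = 1_000  # hypothetical monthly subscription price

for quarter in range(1, 13):  # 12 quarters = 3 years
    license_rev = license_upfront if quarter == 1 else 0
    subscription_rev = subscription_monthly * 3
    print(f"Q{quarter:>2}: license ${license_rev:>6,} | subscription ${subscription_rev:>6,}")

# Both models book $36,000 over three years, but the license model
# front-loads everything into Q1 -- so the quarters right after the
# switch look like a revenue collapse even though nothing was lost.
```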

 

What Nuance is doing is where the industry will eventually go, irrespective of whether the software is hosted by a cloud provider or deployed internally on premises. Public or hosted cloud is not the default in this model; it is one of many options. What remains the same is that the underlying business model will have shifted, and with it the risk to the customer. Software as a Service enables customers and vendors to close the consumption gap while shifting the ROI risk away from the customer and toward a burden shared with the vendor.

 

Business models are difficult to change, and with that will come turbulence: there will be some significant winners, and for the companies that don’t evolve, some losers. This does not mean that license models are dead, but it does mean that some significant innovation will be needed wherever a consumption business model is not adopted.


Bicycles of the Cloud

Posted by twaldron Oct 2, 2013

One of the most interesting stories the late Steve Jobs told was about how the computer is the equivalent of a bicycle for the mind. The story starts with an article he read as a student, which charted the locomotion efficiency of various species on Earth, essentially measuring the energy consumed to move one kilometer. At the top of the list was the condor, with humans coming in about a third of the way down. However, someone had the insight to measure the efficiency of a human on a bicycle, which placed human locomotion at the top, well above the condor. Steve’s conclusion was that humans are tool builders, and the computer is the most remarkable tool ever invented: the equivalent of a bicycle for our minds.

 


 

The computer today, also known as compute, has changed from static and local toward virtual, distributed, and shared. Some compute happens locally on a client system, such as a laptop, tablet, or smartphone, while other compute happens on a dedicated server or on shared compute in the cloud. Often it is not clear to the user where the computing happens, and it bears no relevance to the task at hand, as long as it gets done.

 

The retail consumer compute experience has evolved much faster than traditional enterprise IT, and this shift poses a big challenge to enterprise IT organizations around the globe. People are becoming successful CIOs of their own lives, self-procuring on-demand video, file and photo sharing, and social collaboration services that span the globe, while maintaining a distinctly secure manner in how they consume and share their information. These services are ubiquitous, universally available, and seemingly free for the masses. This is not a reflection of trailing performance by enterprise IT, but rather an illustration of the additional challenges and burdens it faces while still needing to provide services on par with what is openly available to the general public.

 

Whether you are a CIO, IT manager, value-added reseller, system integrator, or service provider (essentially, IT delivery), your role toward your customers or users has shifted in two profound ways. The realization, or expectation, of the business value IT can and should deliver has grown, while at the same time the availability of IT and compute services has grown exponentially in an environment where you were once the sole provider. Do you leverage the abundance of choice and availability, or do you protect your single-supplier status?

 

The answer is more difficult and complex than it first appears. It is the type of question mainline businesses ask themselves every day: do I protect my core product, leaving my competitors an opportunity to innovate and possibly win, or do I embrace and innovate around change, ultimately cannibalizing my core product in the hope of a greater result?

 

The essence of these choices comes down to how you deliver value to your customers. Protecting your core product is a way of establishing or maintaining competitive advantage, but it results in false, short-lived value for your customer. That is the inherent risk of focusing on competitive advantage in lieu of overall customer value. The open, collaborative, non-protectionist approach invariably wins. We have witnessed this countless times at the IT infrastructure vendor level, and now we are witnessing it in how IT services are delivered and consumed.

 

[Image: the risks of competitive advantage]

 

When talking with people in the IT delivery space, the challenge seems to be less about whether to embrace external compute services through the cloud or other means, and more about how to do so in a way that best serves their respective customers or organization.

 

Organizations have varying appetites for IT availability, performance, security, compliance, and governance. These same factors influence an IT organization’s ability to adopt compute services such as external cloud services. Not only are these factors different for each organization, they also change as the organization matures and grows, or as the business climate changes.

 

The ideal solution to all of this is to allow the seamless consumption of cloud services irrespective of where, or by whom, the service is delivered. The single critical and unique factor for any organization is its data. To use a personal example: if I switch from one type of smartphone to another, my overall smartphone experience is not altered provided my data (contacts, mail, calendar, web links) remains intact. In other words, everything around the operating system, processor, and telecom provider is static, while my data is dynamic and essential to the phone’s efficacy. The data payloads and scale of an organization’s IT environment are vastly larger than a consumer smartphone’s.

 

Given that data is the single critical factor and the rest is static, how does one adopt a multi-sourced cloud IT model while ensuring stewardship of the unique IT success factors: availability, performance, security, compliance, and governance? Add to this the ability to adopt or change cloud models, which requires moving the data. All of this needs to happen without IT getting in the way of the business or increasing operational costs. Data is growing at a quicker pace than ever, and as data payloads get larger, the complexity and time it takes to move them grow too.

 

With the computer being the bicycle for the mind, we now find ourselves in an environment with an impressive array of bicycle options, in the form of cloud compute services. Using these various bicycles in an efficient, seamless, cost-effective manner will glean the most value from IT services, today and in the future.

An area of innovation and creativity that completely fascinates me is the Internet of Things (IoT). Like cloud, it has been talked about for almost a decade, but now its time has come. While it is still early days, it is clear that three technology transformations are giving it strong tailwinds:

 

  1. Ubiquitous Internet connectivity, allowing the ever-increasing number of WiFi devices to communicate effectively.
  2. Cloud platforms, and specifically cloud platforms for IoT, giving developers the ability to focus solely on their product, customers, and go-to-market rather than the backend infrastructure (Xively).
  3. Smartphones and the post-PC era, accelerating the declining price curve of compute and WiFi connectivity (Apple and Google).

 

The three companies mentioned above are my personal favorites for driving change; however, I would be remiss not to give a larger picture of all the amazing and innovative players in this space, so I have included an infographic created by Matt Turck (@mattturck) and Sutian Dong (@sutiandong).

 

[Infographic: the Internet of Things landscape, by Matt Turck and Sutian Dong]

 

“By 2020, more than three-quarters of the S&P 500 will be companies that we have not heard of yet.” – Yale Professor Richard Foster

 

This quote crystallizes the amount of innovation that will happen over the next half-decade (yes, half-decade). It is incredible to witness this change before our eyes, and it is important to stop, every so often, and take note of what is now possible.

 

A unifying aspect these technology transformations hold in common is their undying need for non-disruptive operations. To use a very old cliché, “a chain is only as strong as its weakest link.” To extend the cliché and analogy further: in this case we have a single horizontal chain, with an enormous number of vertical chains dangling from it. With the horizontal chain being the backend infrastructure, we start to see the overwhelming interdependence that exists, with an exponential number of endpoints affected by even a single outage.
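
One way to make the weakest-link point concrete: when a service depends on several components in series, their availabilities multiply, so the composite service is always weaker than any single link. A small sketch with assumed, illustrative figures:

```python
import math

# Hypothetical availabilities for a serially dependent consumer service:
# home network, phone/app, and the provider's cloud backend.
components = {"network": 0.999, "phone": 0.999, "cloud backend": 0.995}

composite = math.prod(components.values())
print(f"Composite availability: {composite:.4%}")  # ~99.30%

# The expected downtime of the whole service grows accordingly.
hours_per_year = 24 * 365
print(f"Expected downtime: {(1 - composite) * hours_per_year:.0f} hours/year")  # ~61
```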

 

As an early adopter of technology, and owing to my innate technical curiosity, I recently purchased the Philips hue personal wireless lighting starter kit. It has a large variety of features, but essentially it is a set of network-integrated light bulbs that can be controlled via my WiFi network, extending their connection through the Internet to a secure cloud dashboard. The networked bulbs contain Philips’ famous color LED technology, allowing one to create color moods in the various rooms of my home. Harnessing the power of being connected to the Internet, the system knows its geographic location, which makes it daylight-aware. Various settings can be configured, such as the lights automatically turning on only when it is dark outside and during certain times. Using your smartphone with geo-fencing, you can have the lights turn on, during dark hours, as you enter your home. Adding further to the geek factor, with IFTTT you can configure various triggers: for example, each morning one of the bulbs shines a color reflecting the day’s weather; blue if it is going to rain, red if it is going to be hot, or white if the weather will be the same as the day before.
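
As a rough illustration of what such a trigger looks like under the hood, here is a minimal sketch against the hue bridge’s local REST API. The bridge address, API username, and the weather lookup are placeholders, and the hue values are approximate color-wheel points, so treat this as a sketch rather than a recipe:

```python
import requests

BRIDGE_IP = "192.168.1.2"       # placeholder: your bridge's LAN address
API_USER = "your-api-username"  # placeholder: token issued by the bridge

# Approximate points on the hue color wheel (0-65535).
WEATHER_COLORS = {"rain": 46920, "hot": 0}  # blue-ish, red-ish

def forecast() -> str:
    """Placeholder for a weather lookup, e.g. what an IFTTT recipe supplies."""
    return "rain"

def set_morning_light(light_id: int = 1) -> None:
    state = {"on": True, "bri": 254}
    kind = forecast()
    if kind in WEATHER_COLORS:
        state["hue"] = WEATHER_COLORS[kind]
        state["sat"] = 254
    else:
        state["sat"] = 0  # plain white when the weather is unchanged
    requests.put(
        f"http://{BRIDGE_IP}/api/{API_USER}/lights/{light_id}/state",
        json=state,
        timeout=5,
    )

set_morning_light()
```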

 

I've attached various pictures of my Philips hue unboxing, as well as some screenshots of the smartphone app.

 

[Images: Philips hue unboxing photos and smartphone app screenshots]

 

While this technology sits squarely in the early-adopter segment, one thing is clear: all of the features and benefits are only possible if the backend infrastructure is working. Before, my only dependency was electricity; now I have three: a network, a phone, and the respective cloud infrastructure. Without all three active and working, I am in the dark. Thankfully, Philips has designed a simple reset: if you turn the power switch off and back on, the bulb returns to being a normal, boring light bulb until your computer browser, smartphone, tablet, or IFTTT recipe pushes an action.

 

This looks at only one application, but the uses of IoT are endless. Think about inexpensive IoT micro-sensors mixed into concrete. This would give a builder a more accurate measure of the structure and strength of the overall building. Imagine how this might accelerate the efficiency of building safety inspections, during initial construction and perhaps throughout the life of the building. This type of solution would almost certainly reverse the price-commoditization trend of concrete, a long-time member of ‘the commodities’. Imagine what other markets could change through such “un-commoditization”.

 

The level of interconnectivity and operational dependency we are witnessing in IoT is the same as what exists in the compute and cloud compute ecosystems of today. In the past, when compute was a single application server and client access was a PC or laptop, availability and uptime were best achieved by hardening the hardware at the server and infrastructure level. Today, compute is delivered through a collection of many applications and services that come together as the right information, at the right time, in the palm of my hand. From an IT management point of view, the availability of each and every component at a synchronous point in time is essential; without it, the service or application simply does not work.

 

What is special about NetApp’s latest storage software announcement, clustered Data ONTAP version 8.2, is how it brings non-disruptive operations to modular, multiprotocol, truly unified storage for open systems. It delivers availability equivalent to the high-end monolithic storage frames of the past, adds the rich enterprise data management feature set NetApp has built over the past 21 years, and provides the ability to modularly add and replace storage capacity and compute on the fly, with no disruption in data availability.

 

In a world where data and application interdependencies are ever increasing and the success of a given compute service is unknown, being able to scale modularly while providing non-disruptive operations across the board, for all applications, is transformative.


Amazon Web Services recently concluded a number of its regional AWS Summits, at which NetApp was a proud sponsor. In EMEA, NetApp was a Silver Sponsor at both the Berlin and London Summits. I had the privilege of attending both events and want to share some of the insights gained from the experience.

 

NetApp has an interesting angle in the cloud arena. Unlike some of our competitors, we are not trying to be a cloud service provider; instead we sell "with", "to", and "through" cloud service providers, treating them more as partners than as end customers. This model also extends through our value-added reseller community, allowing for a diverse, adaptable, and leveraged cloud offering. An example of this is the NetApp Private Storage for AWS solution and the bi-directional partnership between NetApp and AWS. Tom Shields gives an excellent overview of how NetApp sees its role in the cloud, with details of the Private Storage for AWS solution, in a video hosted by theCUBE at the San Francisco AWS Summit 2013.

 


 

Attending these summits with me was my manager, John Rollason, Director, Product, Solutions & Alliances Marketing EMEA. For his overall impressions of cloud after attending this event and other NetApp and NetApp-sponsored events, check out his latest blog post: Cloud Computing in 2013: Your strategy, not a Product.

 

The keynote by Werner Vogels, VP & CTO at Amazon.com, was eye-opening. Through his presentation I gained an insight into AWS that I had not had before, even though I always had a clear understanding of how AWS came into existence.

 

Amazon.com, a successful web retailer, sees a majority of its business fall in November and December, thanks to the holiday shopping season. It needed to build a compute infrastructure to handle the web retail traffic of this two-month peak, which left a tremendous amount of excess compute lying dormant for the remaining ten months of the year. A simple step to increase the value of these dormant assets was to ‘rent out’ the excess compute.

 

I’m not sure whether this was an unbelievable coincidence or the plan of a genius, but it created a cloud compute platform that would not only provide a tremendous push toward the democratization of compute but change the face of IT forever. The AWS model is the envy of cloud providers and IT organizations alike. It is by no means the final frontier, and there is plenty of room for new models, entrants, and opportunities; but for what they do, and whom they do it for, it is incredible. I feel their genius is expressed accurately in this quote:

 


“The most radical and transformative of inventions are those that empower others to unleash their creativity – to pursue their dreams.” – Jeff Bezos, Letter to Shareholders, 2012

 

One of the analogies Werner used to describe the leveraging power of AWS for developers was switching on the lights at home: we never stop to think about how much it will cost. The idea is that the cost of, or lack of, compute resources should never be an obstacle to exploring a creative idea or a business breakthrough. AWS keeps the hurdle of cost, time, and access low, allowing creativity and innovation to flow.

 

An important aspect of cloud computing is flexibility. AWS has proven flexible in its pricing, compute offerings, and accessibility; however, data mobility remains a challenge, and not only for AWS. Often the main method of transferring data into and out of hyperscale clouds is FedEx. “Do not underestimate the bandwidth of a FedEx box,” as Werner puts it. As an aside, there is an interesting post on this very topic by David Gingell on his blog, Tangential Thoughts. Essentially, the old-fashioned method of moving data still has a higher bandwidth than the Internet.
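
A quick back-of-envelope calculation shows why; the figures here are illustrative assumptions, not anyone’s measured numbers:

```python
# Effective bandwidth of shipping disks vs. pushing bits over a WAN link.
payload_tb = 10         # hypothetical: 10 TB of disks in the courier box
shipping_hours = 24     # overnight delivery

payload_bits = payload_tb * 1e12 * 8
fedex_bps = payload_bits / (shipping_hours * 3600)
print(f"Courier box: {fedex_bps / 1e6:,.0f} Mbit/s effective")  # ~926 Mbit/s

wan_mbps = 100          # hypothetical dedicated 100 Mbit/s link
wan_hours = payload_bits / (wan_mbps * 1e6) / 3600
print(f"100 Mbit/s link: {wan_hours:,.0f} hours (~{wan_hours / 24:.0f} days)")  # ~9 days
```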


 

It is with data mobility that NetApp Private Storage for AWS starts to make an impact. The solution pairs the world's #1 replication technology with the world's #1 storage OS to give developers a granular, fast, and flexible data management solution between the on-premises enterprise and AWS. Not only does this provide rapid data mobility in a hybrid-cloud setting, it also allows enterprise IT to play a more crucial role and add value.


These are exciting times in the world of IT and compute. We are seeing tremendous innovations that bring tangible benefits to vast communities of people, where access to the technology needed to realize our dreams and ideas is no longer a boundary. I invite you to take a closer look at some of the innovative solutions coming out of NetApp and at how we are enhancing, not trying to compete with, the incredible cloud solutions available to the creative universe.

 

You can read more in the AWS Summit 2013 - Navigating the Cloud series, and get a closer look at the NetApp Private Storage for Amazon Web Services information page.

 

Thank you for reading; I look forward to your comments below.

 
