

Welcome to NetApp 360

NetApp 360 is your go-to resource for everything NetApp: what's happening at the company, the impact of our innovation for customers and partners, what makes NetApp a model company, and of course why our employees love working here.

Recent Blog Posts


By Jessie McKay, IT Program Manager, NetApp

 

Let’s face it: The role of IT is changing, as organizations evolve from being builders and operators to brokers of services for the overall business. “If you work in IT, you are a service provider,” said NetApp CTO Jay Kidd, referring to one of his top 10 technology predictions that will shape the IT industry in 2014.

 

Kidd’s sentiment was further expressed by NetApp CEO Tom Georgens who earlier this year in an interview with InformationWeek said, “So there's a transition [among] IT professionals from being effectively owner-operators to figuring out the role of external services, whether it's Software as a Service, traditional enterprise services, or hyper-scale services.”

 

At NetApp IT, the operating model is changing from traditional plan-build-run to IT service management. For example, today IT provides nCloud, a pay-as-you-go, self-service cloud offering for non-critical business applications. For business-critical applications, an automated Infrastructure as a Service provisioning model is used for on-premises solutions, putting them on par with readily available cloud services.

 

With this new model, IT can deliver holistic compute environments that provide operating systems, user access, server monitoring, and application storage within a few hours.

 

In managing our transition to a service provider, we use a set of best practices based on a strong customer-focused approach called IT service management (ITSM) that enables us to:

  • Align IT services with the capabilities our business units need
  • Create a proactive plan that matches both business strategy and demand, ensuring that IT consistently provides what is truly needed and valued
  • Define, develop, and negotiate service-level agreements with business stakeholders to establish a mutual understanding of the service levels provided


 

We all know that IT operates best when it operates like a business, with a clear focus on customer needs and full transparency into its ability to meet those needs. IT service management will help us bridge the gap between customer expectations and IT deliverables, and our best practices will grow over the years, always putting the customer first.

 

Our journey toward ITSM is just beginning. Our goal is to become the service provider of choice for NetApp.

Cisco Partner Summit 2014 was held in Las Vegas in March, and if you weren’t there, you missed the message of just how important partners are to Cisco and how Cisco continues to commit to channel success. Between now and April 28, you can read the summary of major announcements and offers in Partner Central and access replays of live presentations.

 

Rick Snyder and Thomas Stanley Share Some Facts

During the FlexPod Premium Partner Auxiliary Session, Rick Snyder, Vice President of the Global & Strategic Partner Organization at Cisco Systems, was joined by Thomas Stanley, Senior Vice President of Global Partner Sales & Alliances at NetApp, along with Jim McHugh, Adam Fore, Satinder Sethi, Dan Neault, Brian Allison, and Janet Chang-Pryor to share where we are going and how the numbers stack up today. When it comes to converged infrastructure, FlexPod continues to gallop forward at an all-out pace.

 

Rapid Adoption

We learned that FlexPod is still #1 in year-to-date revenue for worldwide integrated infrastructure in 2013, rapidly scaling its customer base with 107 percent year-over-year growth. An install base of more than 3,200 customers and more than 100 public customer references means a FlexPod is sold somewhere in the world roughly every 4 hours. We also learned that FlexPod is a cost-efficiency leader: VCE costs customers $1.3M per unit, while FlexPod averages an affordable $327K ASP, a remarkable 75 percent savings that customers are getting excited about.

 

Proven ROI and Guaranteed Savings

In parallel, proven results from leading research firm Forrester show that FlexPod customers realize 120 percent ROI, and see a break-even in just 9 months, while FlexPod is still the only converged infrastructure solution that guarantees savings: NetApp storage systems guarantee customers will require 50 percent less storage capacity vs. competitive virtualized solutions.
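The ROI and break-even figures cited above come from Forrester's full study; the basic mechanics behind numbers like these can be sketched with a simple cash-flow calculation. The dollar amounts below are illustrative assumptions, not figures from the study:

```python
def roi_and_payback(initial_cost, monthly_benefit, months):
    """ROI over an analysis period, plus a simple break-even point.
    Inputs are hypothetical; the Forrester study uses a fuller model
    with ongoing and risk-adjusted costs."""
    total_benefit = monthly_benefit * months
    roi_pct = (total_benefit - initial_cost) / initial_cost * 100
    payback_months = initial_cost / monthly_benefit  # months to recoup the spend
    return roi_pct, payback_months

# Illustrative numbers: a $450K deployment returning $50K/month in benefits
# breaks even in 9 months; over an 18-month window that is 100% ROI.
roi, payback = roi_and_payback(450_000, 50_000, 18)
```

The real study's 120 percent ROI folds in ongoing costs and risk adjustments, but the shape of the calculation is the same.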

 

FlexPod Offers Choice

With FlexPod, customers aren’t locked into a certain hypervisor or software strategy either. We offer the broadest selection of validated hypervisor-based and bare-metal designs, including extensive integrated testing with VMware vSphere, Microsoft Hyper-V, Citrix XenServer, RedHat KVM, and Oracle VM. And our diverse ecosystem of leading technology partners, with joint validations and/or go-to-market programs include Microsoft, SAP, Oracle, VMware, Cisco, Citrix, Cloudera, Hortonworks and Red Hat.

 

FlexPod is Application Centric

But speaking of our technology partnerships, it is applications that are driving adoption of FlexPod. With pure-play core infrastructure purchasing showing little to no growth, it is critical to know who is going to be spending the money and why. Approximately 80 percent of buying decisions will be made around deploying either a Microsoft or an Oracle workload, with Microsoft applications driving more than 50 percent of that buying decision for the initial PO.

 

Understanding how and what applications are driving buying opens the door for new dialogues with your customers and prospects. For some additional information on reaching the newest decision-makers, check out our earlier blog where Pat Bodin suggests taking another look at the new IT buyer.

 

Sales Enablement Buzz – Guided Solution Sizing

There are several new pre-sales tools to help shorten the selling process. All of these resources are available at our www.cisconetapp.com partner portal. And don’t worry if you don’t have a secure log-in; simply follow the easy access request process and you will receive your credentials. Onboarding for this terrific resource occurs twice a month.

 

One of the tools that got the room buzzing was Guided Solution Sizing. Simply plugging in basic customer requirements and assumptions delivers a recommended FlexPod configuration that includes a specific UCS server, Nexus switch, and FAS controller. This FlexPod BOM can then be used to generate a budgetary quote for the customer. GSS supports Microsoft Exchange, SQL Server, and SharePoint, as well as VMware View, today; expect to see more workloads and applications introduced over the next several months to help partners spend less time winning more business.
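The tool's internals aren't public, but the guided-sizing idea — requirements in, recommended configuration out — can be illustrated with a toy rule-based sizer. All thresholds and the mapping to specific models here are made up for illustration, not the actual GSS logic:

```python
def size_flexpod(users, workload):
    """Toy rule-based sizer: map a user count and workload to a
    FlexPod bill of materials. Thresholds and model choices are
    hypothetical, not Guided Solution Sizing's real rules."""
    if workload not in ("exchange", "sql", "sharepoint", "view"):
        raise ValueError("unsupported workload")
    if users <= 500:
        return {"ucs": "UCS C220 x2", "nexus": "Nexus 5548 x2", "fas": "FAS2552"}
    elif users <= 2500:
        return {"ucs": "UCS B200 x4", "nexus": "Nexus 5596 x2", "fas": "FAS8020"}
    return {"ucs": "UCS B200 x8", "nexus": "Nexus 7000 x2", "fas": "FAS8060"}

# A mid-size Exchange deployment lands in the middle tier.
bom = size_flexpod(1200, "exchange")
```

The point of the real tool is the same: turn a few customer inputs into a concrete, quotable BOM without a lengthy manual sizing exercise.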

 

Paying for It All – Financing Increases Deal Size

As part of our continued commitment to make it easy to make a FlexPod decision, we have expanded our unified approach to financing by covering more geographical regions and simplifying the ability to add-on, upgrade and refresh. Payback periods are shorter, ROI is improved and better options for the SMB market make our financing terms more attractive for this community.

 

Partners further benefit with an additional 1% incentive on both Cisco and NetApp.

 

Bain & Company was brought in to help us understand how financing actually helps close business. Their findings illustrated that when financing is part of the deal, deal size increases by 20 to 35 percent on average because customers focus on a monthly expense: any incremental amount produces only a small increase in the monthly payment. The result? Customers are actually happier spending more because they are able to get exactly what they need, when they need it, without compromise.
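The "monthly expense" framing is easy to verify with the standard annuity payment formula. The interest rate and term below are illustrative assumptions, not actual financing terms:

```python
def monthly_payment(principal, annual_rate, months):
    """Standard loan (annuity) payment formula:
    payment = P * r / (1 - (1 + r)^-n), with r the monthly rate."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

# Hypothetical terms: 5% annual rate over 36 months.
base = monthly_payment(500_000, 0.05, 36)      # base configuration
upgraded = monthly_payment(600_000, 0.05, 36)  # a 20% larger deal
# The monthly payment scales linearly with principal, so a 20% larger
# deal adds roughly $3K/month — a modest-looking increment next to the
# total project value, which is exactly the psychology Bain observed.
```

This is why the incremental spend feels small to the customer: it shows up as a few thousand dollars a month, not a six-figure line item.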

 

Need one more reason to think about financing? When financing is brought into the deal early on, the win ratio increases anywhere from 3 percent to 12 percent while discounting is reduced. The numbers tell the story. Financing sweetens the sale.

 

In Summary

At Cisco Partner Summit, NetApp and Cisco were able to spend time sharing why FlexPod continues to reign supreme as one of the best business bets our two companies have made in the last three years. For partners, our ongoing commitment to FlexPod will result in more sales as we help you go to market with a solution that lets you reach more broadly across your customer and prospect base. Our pre-sales tools, validated reference architectures, and financing solutions make it tough for customers to say “no” to FlexPod.

 

If you want to understand more, read some of the exciting wins that have been realized this past year, and tell us about your own FlexPod success for a chance to be featured in a case study or a blog. Simply send an email with the subject line “FlexPod Win” to Michael.Dilio@netapp.com.

We are just three weeks away from Microsoft TechEd 2014, kicking off May 12th in Houston, Texas, and we at NetApp, Microsoft’s 2013 Server Platform Partner of the Year, couldn’t be more excited about what we have planned. From booth demos and giveaways to speaking sessions and our genius bar, there’s no shortage of activities. It might even be a little overwhelming for those of you attending, which is why we’ve pulled together a handy guide to all things NetApp at Microsoft TechEd.



Visit Us at Booth #600

Join us at the NetApp booth throughout the week for a variety of activities including mini-theater sessions, demos, genius bar meetings, giveaways and more. Find out more about how NetApp is able to deliver award-winning solutions for your Microsoft Windows Server, private cloud and applications infrastructure with nondisruptive operations, proven efficiency and seamless scalability.

 

Stop by the NetApp booth Monday or Tuesday to get a ticket to our party at St. Arnold’s Brewery, Texas’ oldest craft brewery, on Tuesday, May 13th, from 4:30 to 7:30pm. Transportation will be provided. Attendance is first come, first served, so get your tickets early!

 

Drop In for a One-on-One Technical Meeting at Our Genius Bar

We will have experts available to answer your tough questions on these topics and more:

  • How can I increase automation with Windows PowerShell®?

  • How can I shift VMware® ESX® VMs to Windows Server® Hyper-V® while minimizing downtime?

  • How can I integrate NetApp with Microsoft’s private cloud?

  • How can FlexPod® accelerate data center workload deployment?

  • How can I get the most out of my Microsoft business-critical applications such as SQL Server®, Exchange®, and SharePoint®?

 

Attend One of Our In-Booth Presentations and Enter to Win a $50 Gift Card After Each Session

  • NetApp Private Storage for the Public Cloud

  • NetApp Integration with Microsoft Private Cloud

  • NetApp Integration with Microsoft System Center

  • Leveraging Flash Technologies for Increased SQL Server performance

 

That’s not all! Stop by our booth and enter our daily raffle for a chance to win one of four Xbox One consoles. You do not need to be present to win.

 

Get Social with NetApp at Microsoft TechEd 2014 using #NetAppTechEd

  Follow us:


 


NetApp Breakout Session

Tuesday, May 13, 10:15am – 11:30am

Room: 381A

Maximize the Benefits of Microsoft Cloud: Security, Automation, Self-service and Provisioning (DCIM-B296)

Glenn Sizemore, Reference Architect, NetApp

 

Learn how you can make the most of your Microsoft Cloud environment from NetApp, Microsoft’s Server Platform and Private Cloud Partner of the Year. Deep dive into NetApp’s native integration with Windows Server 2012 R2, PowerShell, and System Center to provide monitoring, management, self-service, and automation. Learn how you can securely move data and extend your virtualization and on-premises cloud to the public cloud to gain greater scale and elasticity in the services you provide to customers.

 

Theater Presentation at the Digital Wall

Monday, May 12, 11:45am

Migrate your vSphere infrastructure to Hyper-V over the weekend. Seriously.

Glenn Sizemore, Reference Architect, NetApp

 

Windows Server 2012 R2 has certainly brought Hyper-V into the big leagues. So much so, in fact, that most organizations have either already completed, or are in the process of running, a Hyper-V proof of concept. Unfortunately, the prospect of migrating their current vSphere production environment is such a daunting task that it seems out of reach… until now. The Microsoft Automation Toolkit Powered by NetApp Project Shift is changing that reality in a major way. MAT4Shift enables an organization to rapidly migrate from vSphere to Hyper-V, and to do so in as little as 15 minutes regardless of the size of the VM. In this presentation, learn how it works and how you really could migrate to Hyper-V next weekend!

 

Of course, that’s not all we have planned for the week. Keep an eye here on NetApp 360 and follow #NetAppTechEd on Twitter to make sure you don’t miss a thing.

 

For more on NetApp and Microsoft solutions, visit www.netapp.com/microsoftsolutions and www.netapp.com/microsoftcloud.


We’ll see you in Houston!


By Brad Nisbet, Marketing Manager, Cloud Solutions, NetApp


As the pace of business increases, competitive advantage will increasingly hinge on the ability to access and strategically harness information.

 

The innovations and advances in today’s technology are happening so quickly and so consistently, it is almost like being on an airplane. During the flight, it feels like you are sitting still, even though you are moving more than 600 miles per hour.

 

Only a few years ago, the cloud was most closely associated with “storage.” Organizations that were suddenly able to gather, retain and generate massive amounts of data, with and on more devices and platforms than ever before, began to wrestle with the basic challenge of where to put it all. For many, the answer was a new and almost mythical place with a fairy tale name: “The Cloud.”

 

At least in the public consciousness, the concept of the cloud appeared so quickly within the dizzying explosion of data usage that many people’s perception of what it is and what it does remains, no pun intended, quite cloudy. Shaped by personal computing associations and terminology (“backup to iCloud”), the general comprehension of it can be paradoxically simplistic and vague.

 

It is easy to see why the complexities of cloud deployment can get overlooked. But ignoring these complexities can lead to significant risks. Many organizations that charge into cloud deployment haphazardly quickly find themselves mired in costly innovation-stiflers like vendor lock-in and disparate cloud architectures. Worse yet, they place their mission-critical data at risk or out of reach. Some lose it entirely.

 

With a basic understanding and some thoughtful planning, though, all this can be completely avoided. Because what the cloud is, is quite simple. But what it does, when used correctly and strategically, is actually quite profound. And it is fundamentally reshaping business intelligence, removing blocks to innovation, and transforming data’s role in success.

 

First, the simple.

 

When people discuss putting something in “the cloud,” all it means is that the information will reside outside an organization’s firewall, often in space rented on demand from large-scale providers like Google, Amazon, or Microsoft Azure. That’s it.

 

Having data in the cloud means it is globally accessible and flexible, with extreme ease of access and scalability. In the past, organizations would have to buy, build, house and maintain a data center. It was a labor-intensive, painstaking and costly endeavor. Today, a rental contract with a cloud service provider can be signed quickly, offering instant global reach and accessibility.

 

For smaller organizations and startups, it is often an easy decision that has leveled the playing field and offered unprecedented access to hungry, young innovators with software solutions. Many larger enterprise organizations are realizing the same advantages by migrating their data into the cloud – often, into a hybrid cloud architecture that combines on- and off-premises deployment.

 

This small but critical distinction – the ability to seamlessly house and access data between multiple cloud resources in a hybrid configuration – has had enormous consequences. It has transformed data into a new kind of global currency and immeasurably increased the importance of IT to business success today.

 

The ability to instantly access mission-critical data creates both opportunity and risk in a cloud deployment. How efficiently and effectively an organization is able to tap, parse and derive intelligence from data is now more critical than ever before. By logical extension, an organization’s ability to restrict access and protect that data is equally important.

 

Why?

 

Because data, now a fundamental driver of business success and competitive advantage, has become extremely valuable. It has moved far beyond “storage,” with all its associations of a dusty warehouse filled with old files. In the relative blink of an eye, technology innovation that allows for incredible insight to be derived from information has harnessed data into a valuable corporate asset that needs to be retrieved efficiently and guarded carefully.

 

Simply put, data is now money.

 

But ironically, given the speed and dexterity of technology today, the money that data most closely resembles is not computerized banking or split-second financial industry trading. Data is more like cold hard cash. It is a physical thing – heavy, cumbersome and hard to move. Large data transfers can take days, even weeks. You don’t want to move it very often and you don’t want to move it very far.

 

Now more than ever, data needs to be carefully managed across different “boundaries,” without losing control. Data is extremely valuable and personal, but it is also extremely cumbersome.

 

In order to best manage data’s value, some organizations are moving away from being the builders and maintainers of data centers themselves, becoming brokers of services that enhance business performance. As a result, there are now many new service providers and vendors competing to meet these needs.

 

For successful strategic cloud deployments, organizations must be able to quickly assess and assign value to the different types of data they generate and collect – and draw from that analysis to designate a location for that data.

 

They then must assess vendors to determine which should be approved, and for what kind of information. Vendors are evaluated across a variety of criteria, including the country they operate in, their corporate history, their security protocols and even their proximity. If they meet all requirements, they are approved for use by the company’s employees.

 

This is the essence of what is now called “data stewardship.” You can offload the infrastructure, the applications, and the services, but you can never offload or outsource control of, and responsibility for, important data.

 

It is ironic that all this important and game-changing management of information is all unfolding under the warm and fuzzy term “the cloud.” It’s true that what is essentially a simple concept – let’s take the data we have behind our firewall and move it to a place where it is more accessible, flexible and scalable – can get complicated quickly. Within these many levels of complexity, though, are incredible new strategic advantages waiting to be unleashed. Smart, conscientious cloud deployment can drive greater value to and from a company’s information, while tapping the incredible power of innovation to drive real impact.

 

The sooner organizations realize what the cloud is – and what it is not – the more effectively they will be able to use the power of information to their advantage. As the fog of the cloud continues to clear, its complications are simplified and its incredible benefits are revealed.

By Peter Corbett, Vice President & Chief Architect at NetApp

 

The concept of flexible computing in the cloud needs no introduction here. Many businesses are being built from the ground up on cloud infrastructure, both for production IT and internal IT operations. But the vast majority of large enterprises and organizations still run much of their IT operations internally. Few of these organizations haven’t given some thought to how they could leverage the cloud for at least a portion of their IT needs. But an IT shop looking at moving some applications to the cloud faces several challenges, among them several directly related to data storage and transfer.

 

Of course cloud providers offer storage to go along with the flexible compute capabilities they prominently feature.  They have developed pricing models that account for ingest, access and retention of data.  They offer different classes of service in terms of durability (the certainty that the data you deposit will not be lost), availability (the ability to access data at any time), performance (the latency and throughput at which data can be stored or retrieved), the storage access protocols and semantics supported, and other service level attributes.  Organizations with internal IT footprints face the same set of decisions about service levels, but these attributes historically have not been uniformly articulated and quantified across different storage system manufacturers. It is clear that the cloud providers are driven to optimize and refine their pay-as-you-go pricing models, and this has led to a more defined articulation of the service levels provided by their different classes of service.



There are really three distinct paradigms for using storage in the cloud. One is that the cloud storage is used by a cloud-resident application (or multiple applications that run on the same cloud-resident middleware and dataset). Another is that the cloud storage is used in a simple way as an extension to, or tier of, an on-premise data store, without any active agency in the cloud; in this case the storage is used directly via external interfaces. A third model, and what I think is the most interesting, is where the data stored in the cloud is an asynchronous replica of data stored on-premise (the opposite can also be true, and is also interesting), but where the replica is directly accessible and usable by cloud-based applications and agents. A variant of this hybrid model leverages cloud compute with co-located private storage, e.g., NetApp Private Storage.

 

In this model, we really get tremendous flexibility.  The on-premise data can be the primary store for mission and business critical applications.  The cloud store can be used for disaster recovery, backup, archive, analytics, test, and many other uses.  These have two characteristics that make them very suitable for the cloud: they require a lot of compute horsepower sometimes but not always, and they can work off a recent point-in-time image of the data which may be slightly behind the on-premise version.  For applications and use cases that have these two characteristics, the cloud can offer compelling benefits.

 

To really achieve full portability of data to and from the cloud, there are three areas that will be foci for innovation:

  1. A quantitative, normalized description of storage service levels that can be compared across the spectrum of cloud vendors and on-premise and co-located storage systems.
  2. A means of evaluating the entire cost of storage and the cost of data movement to select the placement that optimizes cost for the value delivered.
  3. An efficient mechanism for moving point-in-time images to and from the cloud, including both bulk and granular transfers. Here, efficient means moving less data with less impact on the on-premise systems and lower compute and storage costs in the cloud.  The more efficient the image transfer, the more economical it will be to leverage the capabilities of the cloud. 
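The second item — weighing total storage cost against data-movement cost to choose a placement — can be sketched as a simple cost comparison. Every rate below is a made-up placeholder, not any vendor's actual pricing:

```python
def total_cost(gb_stored, gb_moved_per_month, months, store_rate, egress_rate):
    """Total cost of one placement: storage rent plus data-movement
    charges, both in hypothetical $/GB-month terms."""
    return (gb_stored * store_rate + gb_moved_per_month * egress_rate) * months

def cheapest_placement(gb_stored, gb_moved, months, options):
    """Pick the placement with the lowest total cost for this workload."""
    return min(options, key=lambda o: total_cost(
        gb_stored, gb_moved, months, o["store"], o["egress"]))

# Placeholder rate cards for three placements.
options = [
    {"name": "on-premises", "store": 0.05, "egress": 0.00},
    {"name": "cloud-standard", "store": 0.03, "egress": 0.09},
    {"name": "cloud-archive", "store": 0.01, "egress": 0.20},
]
# A workload that moves a lot of data each month favors placements with
# low movement cost, even at a higher storage rate; with these numbers
# the on-premises option wins for 10 TB stored, 5 TB moved monthly.
best = cheapest_placement(10_000, 5_000, 12, options)
```

Even this toy version shows why a normalized description of service levels and prices matters: without comparable numbers across vendors and on-premise systems, the minimization cannot be done at all.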

 

The cloud is a transformation of IT that will continue to impact the way things are done on-premise.  On-premise data centers will not disappear, at least not any time soon.  They will adopt the same technologies as are used in the cloud, both to increase internal efficiencies to match the efficiencies of the cloud, and to enable better participation in hybrid on-cloud/cloud co-location and on-premise infrastructure deployments.  It’s going to be interesting to see how this all plays out, but there’s no doubt that the cloud will continue to play a large and growing role in IT in the coming years.  Data management is one area where the rapid evolution of the cloud in conjunction with a large continuing on-premise IT footprint presents some of the most interesting technical challenges we face in storage today.
