NetApp 360 Blog


Welcome to NetApp 360

NetApp 360 is your go-to resource for everything NetApp: what's happening at the company, the impact of our innovation for customers and partners, what makes NetApp a model company, and of course why our employees love working here.

Recent Blog Posts


As the pace of business increases, competitive advantage will increasingly hinge on the ability to access and strategically harness information.

 

The innovations and advances in today’s technology are happening so quickly and so consistently that it is almost like being on an airplane. During the flight, it feels like you are sitting still, even though you are moving at more than 600 miles per hour.

 

Only a few years ago, the cloud was most closely associated with “storage.” Organizations that were suddenly able to gather, retain and generate massive amounts of data, with and on more devices and platforms than ever before, began to wrestle with the basic challenge of where to put it all. For many, the answer was a new and almost mythical place with a fairy tale name: “The Cloud.”

 

At least in the public consciousness, the concept of the cloud appeared so quickly within the dizzying explosion of data usage that many people’s perception of what it is and what it does remains, no pun intended, quite cloudy. Shaped by personal computing associations and terminology (“backup to iCloud”), the general comprehension of it can be paradoxically simplistic and vague.

 

It is easy to see why the complexities of cloud deployment can get overlooked. But ignoring these complexities can lead to significant risks. Many organizations that charge into cloud deployment haphazardly quickly find themselves mired in costly innovation-stiflers like vendor lock-in and disparate cloud architectures. Worse yet, they place their mission-critical data at risk or out of reach. Some lose it entirely.

 

With a basic understanding and some thoughtful planning, though, all this can be completely avoided. Because what the cloud is, is quite simple. But what it does, when used correctly and strategically, is actually quite profound. And it is fundamentally reshaping business intelligence, removing blocks to innovation, and transforming data’s role in success.

 

First, the simple.

 

When people discuss putting something in “the cloud,” all it means is that the information will reside outside an organization’s firewall, often in space rented on demand from large-scale providers like Google, Amazon or Microsoft Azure. That’s it.

 

Having data in the cloud means it is globally accessible, flexible, and easy to scale. In the past, organizations would have to buy, build, house and maintain a data center – a labor-intensive, painstaking and costly endeavor. Today, a rental contract with a cloud service provider can be signed quickly, offering instant global reach and accessibility.

 

For smaller organizations and startups, it is often an easy decision that has leveled the playing field and offered unprecedented access to hungry, young innovators with software solutions. Many larger enterprise organizations are realizing the same advantages by migrating their data into the cloud – often, into a hybrid cloud architecture that combines on- and off-premises deployment.

 

This small but critical distinction – the ability to seamlessly house and access data between multiple cloud resources in a hybrid configuration – has had enormous consequences. It has transformed data into a new kind of global currency and immeasurably increased the importance of IT to business success today.

 

The ability to instantly access mission-critical data creates both opportunity and risk in a cloud deployment. How efficiently and effectively an organization is able to tap, parse and derive intelligence from data is now more critical than ever before. By logical extension, an organization’s ability to restrict access and protect that data is equally important.

 

Why?

 

Because data, now a fundamental driver of business success and competitive advantage, has become extremely valuable. It has moved far beyond “storage,” with all its associations of a dusty warehouse filled with old files. In the relative blink of an eye, technology innovation that allows incredible insight to be derived from information has turned data into a valuable corporate asset that needs to be retrieved efficiently and guarded carefully.

 

Simply put, data is now money.

 

But ironically, given the speed and dexterity of technology today, the money that data most closely resembles is not computerized banking or split-second financial industry trading. Data is more like cold hard cash. It is a physical thing – heavy, cumbersome and hard to move. Large data transfers can take days, even weeks. You don’t want to move it very often and you don’t want to move it very far.

 

Now more than ever, data needs to be carefully managed across different “boundaries,” without losing control. Data is extremely valuable and personal, but it is also extremely cumbersome.

 

In order to best manage data’s value, some organizations are moving away from being the builders and maintainers of data centers themselves, becoming brokers of services that enhance business performance. As a result, there are now many new service providers and vendors competing to meet these needs.

 

For successful strategic cloud deployments, organizations must be able to quickly assess and assign value to the different types of data they generate and collect – and draw from that analysis to designate a location for that data.

 

They then must assess vendors to determine which should be approved, and for what kind of information. Vendors are evaluated across a variety of criteria, including the country they operate in, their corporate history, their security protocols and even their proximity. If they meet all requirements, they are approved for use by the company’s employees.

 

This is the essence of what is now called “data stewardship.” You can offload the infrastructure, the applications, and the services, but you can never offload or outsource control of and responsibility for important data.

 

It is ironic that all this important, game-changing management of information is unfolding under the warm and fuzzy term “the cloud.” It’s true that what is essentially a simple concept – let’s take the data we have behind our firewall and move it to a place where it is more accessible, flexible and scalable – can get complicated quickly. Within these many levels of complexity, though, are incredible new strategic advantages waiting to be unleashed. Smart, conscientious cloud deployment can drive greater value to and from a company’s information, while tapping the incredible power of innovation to drive real impact.

 

The sooner organizations realize what the cloud is – and what it is not – the more effectively they will be able to use the power of information to their advantage. As the fog of the cloud continues to clear, its complications are simplified and its incredible benefits are revealed.

By Peter Corbett, Vice President & Chief Architect at NetApp

 

The concept of flexible computing in the cloud needs no introduction here. Many businesses are being built from the ground up on cloud infrastructure, both for production IT and internal IT operations. But the vast majority of large enterprises and organizations still run much of their IT operations internally, and I doubt that any of them has failed to give some thought to how it could leverage the cloud for at least a portion of its IT needs. An IT shop looking at moving some applications to the cloud, however, faces several challenges, among them several directly related to data storage and transfer.

 

Of course cloud providers offer storage to go along with the flexible compute capabilities they prominently feature.  They have developed pricing models that account for ingest, access and retention of data.  They offer different classes of service in terms of durability (the certainty that the data you deposit will not be lost), availability (the ability to access data at any time), performance (the latency and throughput at which data can be stored or retrieved), the storage access protocols and semantics supported, and other service level attributes.  Organizations with internal IT footprints face the same set of decisions about service levels, but these attributes historically have not been uniformly articulated and quantified across different storage system manufacturers. It is clear that the cloud providers are driven to optimize and refine their pay-as-you-go pricing models, and this has led to a more defined articulation of the service levels provided by their different classes of service.
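To make that kind of comparison concrete, here is a minimal sketch, in Python, of how the service-level attributes described above – durability, availability, latency, supported protocols and price – might be normalized into a single record so that cloud classes of service and on-premise systems can be lined up side by side. The class name, the offerings and every figure below are illustrative assumptions, not vendor data.

    # Hypothetical sketch: normalizing storage service-level attributes so that
    # cloud classes of service and on-premise systems can be compared directly.
    # All names and figures are illustrative, not vendor data.
    from dataclasses import dataclass

    @dataclass
    class StorageServiceLevel:
        name: str
        durability_nines: int      # e.g. 11 -> 99.999999999% annual durability
        availability_nines: int    # e.g. 4 -> 99.99% availability
        read_latency_ms: float     # typical time to first byte
        protocols: tuple           # access protocols and semantics supported
        price_per_gb_month: float  # retention cost; ingest/egress priced separately

    offerings = [
        StorageServiceLevel("cloud-object-standard", 11, 4, 80.0, ("object",), 0.030),
        StorageServiceLevel("cloud-object-archive", 11, 3, 14_400_000.0, ("object",), 0.010),
        StorageServiceLevel("on-premise-fas-tier", 9, 5, 1.0, ("NFS", "CIFS", "iSCSI", "FC"), 0.120),
    ]

    # Once every offering is described in the same vocabulary, selection is a filter:
    candidates = [o for o in offerings
                  if o.availability_nines >= 4 and o.read_latency_ms <= 100.0]
    for o in sorted(candidates, key=lambda o: o.price_per_gb_month):
        print(f"{o.name}: {o.price_per_gb_month:.3f} $/GB-month, "
              f"{o.read_latency_ms} ms latency, protocols={o.protocols}")

Describing an on-premise array in the same terms as a cloud class of service is exactly the kind of normalization that makes placement decisions, and the pricing trade-offs behind them, explicit.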



There are really three distinct paradigms for using storage in the cloud. One is that the cloud storage is used by a cloud-resident application (or by multiple applications that run on the same cloud-resident middleware and dataset). Another is that the cloud storage is used in a simple way as an extension to, or tier of, an on-premise data store, without any active agency in the cloud – in this case the storage is used directly via external interfaces. A third model – and, I think, the most interesting one – is where the data stored in the cloud is an asynchronous replica of data stored on-premise (the opposite can also be true, and is also interesting), but where the replica is directly accessible and usable by cloud-based applications and agents. A variant of this hybrid model leverages cloud compute with co-located private storage, e.g. NetApp Private Storage.

 

In this model, we get tremendous flexibility. The on-premise data can be the primary store for mission- and business-critical applications, while the cloud store can be used for disaster recovery, backup, archive, analytics, test, and many other purposes. These uses share two characteristics that make them very suitable for the cloud: they sometimes, but not always, require a lot of compute horsepower, and they can work off a recent point-in-time image of the data that may be slightly behind the on-premise version. For applications and use cases with these two characteristics, the cloud can offer compelling benefits.
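As a rough illustration of that hybrid model – plain Python, with made-up data structures and function names, not NetApp’s implementation – the sketch below shows the shape of an asynchronous replication loop: the primary dataset stays on-premise, only blocks that changed since the last cycle are pushed to the cloud replica, and cloud-side jobs such as DR tests or analytics read a replica that may trail the primary slightly.

    # Illustrative sketch of asynchronous, point-in-time replication from an
    # on-premise store to a cloud replica. The replica may lag the primary,
    # which is acceptable for DR, backup, archive, analytics and test workloads.
    import copy
    import time

    def take_snapshot(primary: dict) -> dict:
        """Capture a point-in-time image of the primary dataset."""
        return copy.deepcopy(primary)

    def replicate_incremental(snapshot: dict, cloud_replica: dict) -> int:
        """Send only blocks that changed since the last transfer; return how many."""
        changed = {k: v for k, v in snapshot.items() if cloud_replica.get(k) != v}
        cloud_replica.update(changed)
        return len(changed)

    primary = {"block-0": "alpha", "block-1": "bravo"}
    cloud_replica = {}

    for cycle in range(3):
        primary[f"block-{cycle + 2}"] = f"update-{cycle}"  # simulated new writes
        snapshot = take_snapshot(primary)
        sent = replicate_incremental(snapshot, cloud_replica)
        print(f"cycle {cycle}: replicated {sent} changed block(s); "
              f"replica now usable by cloud analytics or DR against a recent image")
        time.sleep(0.1)  # stand-in for the replication schedule

The point of the sketch is the division of labor: the transfer only has to carry what changed, and the cloud-side consumers never touch the primary.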

 

To really achieve full portability of data to and from the cloud, there are three areas that will be foci for innovation:

  1. A quantitative, normalized description of storage service levels that can be compared across the spectrum of cloud vendors and on-premise and co-located storage systems.
  2. A means of evaluating the entire cost of storage and the cost of data movement to select the placement that optimizes cost for the value delivered (a calculation sketched just after this list).
  3. An efficient mechanism for moving point-in-time images to and from the cloud, including both bulk and granular transfers. Here, efficient means moving less data with less impact on the on-premise systems and lower compute and storage costs in the cloud.  The more efficient the image transfer, the more economical it will be to leverage the capabilities of the cloud. 
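The second of these points lends itself to a small worked example. The following Python sketch, with hypothetical placements and made-up prices, adds retention cost to the cost of moving data in and out so that candidate placements can be ranked by total cost over a planning period.

    # Hypothetical sketch of evaluating the entire cost of a placement:
    # retention plus data movement. Placements and prices are examples, not quotes.
    from dataclasses import dataclass

    @dataclass
    class Placement:
        name: str
        price_per_gb_month: float  # cost to retain data
        ingest_per_gb: float       # cost to move data in
        egress_per_gb: float       # cost to move data out

    def total_cost(p: Placement, capacity_gb: float, months: int,
                   ingest_gb: float, egress_gb: float) -> float:
        """Retention cost over the period plus the cost of moving data in and out."""
        return (capacity_gb * p.price_per_gb_month * months
                + ingest_gb * p.ingest_per_gb
                + egress_gb * p.egress_per_gb)

    placements = [
        Placement("cloud-object", 0.03, 0.00, 0.09),
        Placement("cloud-archive", 0.01, 0.00, 0.20),
        Placement("on-premise", 0.12, 0.00, 0.00),
    ]

    # 10 TB kept for 12 months, ingested once, with 2 TB read back over the year.
    costs = {p.name: total_cost(p, 10_000, 12, 10_000, 2_000) for p in placements}
    for name, cost in sorted(costs.items(), key=lambda kv: kv[1]):
        print(f"{name}: ${cost:,.0f} over 12 months")

A real evaluation would fold in the value side as well – how quickly the data must be reachable and by whom – but even this simple arithmetic makes the trade-off between cheap retention and expensive movement visible.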

 

The cloud is a transformation of IT that will continue to affect the way things are done on-premise. On-premise data centers will not disappear, at least not any time soon. They will adopt the same technologies as are used in the cloud, both to increase internal efficiencies to match those of the cloud and to enable better participation in hybrid deployments that span the cloud, cloud co-location, and on-premise infrastructure. It’s going to be interesting to see how this all plays out, but there’s no doubt that the cloud will continue to play a large and growing role in IT in the coming years. Data management is one area where the rapid evolution of the cloud, in conjunction with a large continuing on-premise IT footprint, presents some of the most interesting technical challenges we face in storage today.

Every Monday we bring you top stories featuring NetApp that you may have missed from the previous week. Let us know what interests you by commenting below.

 

New Data Center Innovation Stack Could Redefine IT - eWeek

NetApp and eWeek collaborate to offer 10 key data points for IT managers to consider as they investigate the deployment of the new data center innovation stack.



How Great Companies Become Great -- Three Leaders Share Unique Practices - Forbes

What makes a company great? Tom Mendoza joins a panel of leaders to discuss what's behind NetApp culture.

 

Interview: NetApp and Policy-Based Data Management for the Enterprise - Inside Big Data

Inside Big Data catches up with Richard Treadway to discuss NetApp and policy-based data management for the enterprise.

 

Going For The Gold: 2014 Channel Champions - CRN

NetApp has moved up into the No. 2 position on this year's CRN Channel Champions awards in the Enterprise Network Storage category!

 

Hybrid Cloud Storage Buying Guide - Enterprise Storage Forum

The concept of hybrid clouds is catching hold! Enterprise Storage Forum spotlights NetApp as one of the companies offering hybrid cloud storage.

 

InformationWeek Elite 100 - InformationWeek

NetApp has been named to this year’s InformationWeek Elite 100, a selective, annual ranking of business technology innovators.

 

Best Places to Work 2014: NetApp Inc. - Crain's Chicago Business

Crain's Chicago ranked NetApp No. 8 on their 2014 “Best Places to Work” list!

 

NetApp's Connie Brenton | Women of Influence 2014 - Silicon Valley Business Journal

Silicon Valley Business Journal spotlights 100 women in the 2014 Women of Influence class who are remaking Silicon Valley, including NetApp’s Connie Brenton.

By Matt Brown, Senior Program Manager, NetApp on NetApp

 

NetApp IT knows the intrinsic value of bringing together IT professionals from different companies to share strategies, challenges, and experiences. As conversations unfold, we discover the passion IT professionals have for their jobs and their desire to talk with credible peers. This is a terrific way to compare and benchmark with other IT shops, spark new ideas, and drive innovation and efficiency.

 


In my 25 years as an IT professional, I have found that many IT departments struggle with similar problems and challenges. The only difference is the scale, complexity, and maturity of each organization. We are all trying to answer the same questions:

 

  • How do we as an IT organization enable the business to scale and grow?
  • How do we align to the business and to our corporate strategy?
  • How do we manage costs and increase agility?
  • How do we deliver the best possible services to the business?

 

Our NetApp on NetApp program is supported by a group of IT subject-matter experts who have volunteered to help other IT organizations struggling with similar challenges. We share our real-world experiences operating a global enterprise using industry-leading technology, including NetApp storage solutions, and our business case for implementation. Often, the conversations start with a request from our sales organization to help a NetApp customer with a problem.

 

And yet, we are not sales or marketing. We are a group of real-world IT practitioners who enjoy sharing our passion, knowledge, and strategies with other IT professionals.

 

We invite you to read other NetApp on NetApp blogs on our experiences in engineering and in running a Fortune 500 company, while maximizing the value of NetApp technology.

By Frank Pleshe, Technical Marketing Engineer, and Philip Trautman, FAS Product Marketing, NetApp

 

Part 1 of a multi-part series on storage networking at NetApp

 

Considering how integral networking is to the success of network storage, it’s surprising how little time some of us in the storage field spend thinking about it.

 

But, owing to the sub-millisecond latencies now achievable with flash, network performance is more critical than ever to overall storage performance. At the same time, the IT landscape is evolving at a rapid clip – creating a need for greater network flexibility.

 

With new data types and mountains of data to contend with, we at NetApp have to constantly evolve our thinking – and our products – to deliver the network connectivity you need now and in the future.

 

Bringing Network Innovation to Storage

NetApp has been a long-time leader in storage networking innovation and standards efforts*.

 

Physical Layer. NetApp was among the earliest adopters of both Gigabit Ethernet and 10 Gigabit Ethernet technologies, and was also the first with 8Gb/sec FC and one of the first with 16Gb/sec FC.

 

NetApp was also the first storage vendor to introduce flexible onboard ports and adapter cards that support both block- and file-based storage. With Unified Target Adapter 2 (UTA2) technology, the same port can operate either as 10GbE or as 16Gb/sec FC with a simple optics change.

 

Given our long-standing commitment to unified storage on our FAS platform, maybe this isn’t so surprising. Doing both SAN and NAS equally well requires a higher level of flexibility.

 

NAS Protocols. NetApp’s NAS contributions are well known. Senior Vice President Brian Pawlowski was a major contributor to NFSv3, and Senior Technical Director Mike Eisler and others made significant contributions to NFSv4, NFSv4.1, and pNFS. NetApp was the first to bring CIFS to NAS on an equal footing with NFS, and the co-inventor of NDMP (Network Data Management Protocol) along with Legato Systems.

 

SAN Protocols. On the SAN side, NetApp was the first to ship iSCSI and the first to deliver native FCoE. We’ve continued to push hard in the face of sometimes-significant headwinds to make FCoE a success.

 

What Is Past Is Prologue

Clustered Data ONTAP and the scale-out architecture of the FAS8000 are the culmination of all of NetApp’s storage networking efforts to date. The FAS8000 supports hundreds of network connections and tremendous network bandwidth. In coming posts in this series we’ll explore the networking capabilities of the FAS architecture and clustered Data ONTAP in more depth to help you understand how NetApp helps you make the most of every network resource. Topics will include:

 

  • Using a low-latency 10GbE non-blocking switch infrastructure as a cluster interconnect
  • Understanding clustered Data ONTAP networking concepts (ports, LIFs, IFGRPs, VLANs, failover groups, etc.)
  • Load balancing

 

Stay tuned for more.

 

*See a complete list of NetApp innovations and the standards organizations that NetApp participates in.
