
NetApp Government Gurus


In a recent GovDataDownload discussion, Greg Gardner, NetApp’s chief architect for government, defense, and intelligence solutions for the U.S. Public Sector, discusses cloud, cyber security, and storage-related challenges for the Department of Defense (DoD) and the intelligence community.

 

Gardner shared that the core issue for the DoD and intelligence community is balancing the options offered by cloud and Big Data against the limited budgets available to them. He added that cloud offers cost savings, but it is not necessarily a panacea, given that critical and sensitive data needs to be secured and closely managed.

 

Gardner says that keeping information secure in the cloud will continue to be a challenge for CIOs, particularly when you consider that many agencies are running three petabytes of data on their networks every month.

 

In addition to securing that Big Data, CIOs must determine how to analyze it and make it usable.

 

Hear more from Gardner in his GovDataDownload podcast here.


Like the Feds, State and Local governments see the benefit of cloud computing, but they haven’t made as much progress as their Federal counterparts. The tortoise is trying to catch the hare – and it’s not often the Feds get to play the role of the hare.

 

Well, move over Feds, cloud is a huge strategic goal for State and Locals.

 

The National Association of State Chief Information Officers (NASCIO) named cloud computing the top tech priority for 2014. State CIOs see the advantages in terms of saving money through shared services, improved security, more robust infrastructure, and operational efficiencies generally – tons of upside.

 

But our public-sector friends are seldom early adopters, and they continue to struggle with transitioning to the cloud. Like turning a cargo ship, it takes a while.

 

State CIOs have identified the bureaucratic landscape as a big hurdle. They have to navigate within the regulatory and contractual framework, and that isn’t easy. The procurement process and pathways tend to drive the boat – or cargo ship – at the state level.

 

Believe it or not, I have had CIOs tell me that they are choosing – or being directed to choose – their cloud solution based on the lowest acquisition cost and ease of procurement, not on best value and ability to deliver.


 

I understand the need for prudence, since IT spending by State and Local governments is expected to increase by only 1.5 percent this year. But that underscores the need for creative new alternatives, like a comprehensive cloud solution that offers operational efficiency, reduced time to market, and elasticity – all likely leading to significant cost savings over time. The benefits of cloud cannot truly be measured in the commodity terms of dollars per bit, byte, or CPU; they will ultimately be measured in how cloud completely redefines and creates the new “norm” in the way the public sector conducts its IT business.

 

State and Local CIOs also grapple with the idea of letting go of their data to store it in the cloud. A little skepticism is good. Data is their most important asset, so they need to know that it’s safe.

 

There also needs to be a detailed exit strategy – it can’t be a door that locks behind you. I am reminded of the telecom boom in the early 2000s following government deregulation of that industry. In a matter of months, there were hundreds, even thousands, of start-up service providers in every city and town. Fast forward a couple of years, and very few survived once the market was no longer shiny and new and had to be run like a real business. Many organizations did not have an exit strategy and, as a result, either suffered painful transitions or paid through the nose for easier ones.

 

Cloud computing can be a tremendous asset for State and Local governments, but they need a guiding hand.

 

It’s not about public cloud vs. private cloud vs. hybrid cloud. It is not one size fits all. A successful cloud strategy will likely involve all of the above, as well as on-premises infrastructure. Perhaps more than any other technology shift in our lifetime, cloud will require internal and external collaboration and partnering at new levels, with stringent SLAs and low egos!

 

Making sure State and Local IT leaders understand that there is an ecosystem of partners to help them sort out their options and craft their cloud strategy is critical to ensuring that they get the most out of the cloud. And maybe save a little money.

 

For more insight, follow the conversation about cloud computing at NetApp or MeriTalk’s Cloud Computing Exchange http://meritalk.com/ccx and the Cloud Computing Caucus Advisory Group http://cloudcomputingcaucus.org/

 

Shawn Rodriguez, Director, State and Local Government and Education, NetApp

How does NetApp support state and local governments? What is the biggest trend in this space? How will technology change state government over the next decade?


Mark Weber, President of NetApp U.S. Public Sector and StateScoop50 nominee, answers these questions in a podcast with StateScoop. Listen here: http://statescoop.com/meet-statescoop-50-nominees-netapps-mark-weber/


Tasked with delivering new information technology solutions and learning opportunities on tight budgets, U.S. public institutions are put in a difficult position. State governments are looking for flexible, efficient, and secure data management solutions.


State and local leaders are focused on improving government performance through technology and efficiency. As state budgets stabilize, Mark sees these major trends shaping this space:

 

  1. Shared Services
  2. eGovernment
  3. Security
  4. Cloud

 

Check out the podcast to hear more about how these trends are impacting state government and how NetApp is able to confidently drive change within state government IT.


Mark is nominated for the StateScoop50 Industry Leadership Award. Don’t forget to cast your vote for Mark Weber before April 18th.

From the moment we step out of bed in the morning until the moment we go to bed at night, we often get distracted by action items, steps, and cycles that need to take place. Much of our day is spent checking off our “to do” lists, but we are much more than our “to do” lists.


We are defined by how we give back to those from whom we expect nothing in return. At NetApp, I am continually impressed by how the company and its employees live beyond their “to do” lists and strive to effect positive change through philanthropic efforts.


During this time every year, NetApp focuses its philanthropic efforts on cancer awareness, prevention, and treatment. Passionate employees choose to support St. Baldrick’s and many other non-profit organizations that fight cancer, including the American Cancer Society, Gastric Cancer Fund, and CancerCarePoint.

 

The St. Baldrick’s Foundation is a volunteer-driven charity committed to funding the most promising research to find cures for childhood cancers and give survivors long and healthy lives.


Did you know that worldwide, a child is diagnosed with cancer every three minutes? Astonishing facts like this shock me and inspire me to help. That’s where St. Baldrick’s events come in.


St. Baldrick’s empowers volunteers to host head-shaving events. Volunteer “shavees” sign up to have their heads shaved and raise money from friends and family, as you would for a walk-a-thon. The funds raised are donated to St. Baldrick’s, which directs every single dollar possible to carefully selected research grants.


I am pleased to announce that once again, passionate employee volunteers are making the 5th annual DC Metro St. Baldrick’s event possible. Our primary focus is to have the greatest impact on childhood cancer worldwide while having fun!



Every year there are moments in which the meaning of St. Baldrick’s slaps me in the face. I want to share a few that I’m most proud of:

  • The number of childhood cancer research grants made possible by the generous donations of NetApp employee volunteers, their friends, and their families – over $5 million donated since 2007.
  • The number of people who have shared their personal battles with cancer. It is unfortunate, but there are a lot of them.
  • The countless families our volunteer efforts have positively impacted through the years.
  • The women who volunteer to face the razor and shave their heads.
  • Watching volunteers set up the event, participate, and break everything down, only to drive over an hour back home. No matter the distance, people come together for this worthy cause; it is something special.


I challenge you to step back from your “to do” list and do something that you’re passionate about. #NetAppCulture

 

Adam Mellor, Sales Representative, NetApp U.S. Public Sector

The cloud still looks like the Wild West to many Feds – unsettled and lawless.

 

Keep your head down. Watch your back.

 

Concerns about data security – primarily data loss and data breaches – represent their major worries. Giving up control of data is a big step for agencies and not one Feds take lightly. They want to ensure that their data will be secure in the cloud and portable in case they want to saddle up and switch cloud service providers later.

 

Agencies also are wary of the cloud environment because they don’t have full visibility once they let go of the data. Since they can’t see under the covers, they don’t know the hardware their cloud service provider uses or the underlying security in place to protect data and networks.

 

It’s a leap of faith.

 

Public and hybrid clouds are the main culprits. But there are sheriffs in town: reliable technology solutions that give agencies the tools they need to keep track of data – and keep the peace.

 

Secure multi-tenancy goes a long way toward addressing the major concerns agencies have expressed. Since multiple tenants can use space in the same public cloud – lots of people in one teepee – secure multi-tenancy is a commonsense solution. It provides cloud users with clear, unimpeded visibility of their slice of the cloud. It also gives cloud users an opportunity to ensure that their cloud has a trusted hardware stack – with all the elements in the stack working as they should.
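To make the idea concrete, here is a minimal Python sketch of the visibility model secure multi-tenancy provides. It illustrates the concept only – it is not NetApp’s implementation or any product API – and assumes a shared pool where every request carries a tenant identity:

    # Conceptual sketch only: a shared pool where each request is
    # scoped to a tenant identity, so one tenant can never see
    # another tenant's slice of the cloud.
    class SharedStoragePool:
        def __init__(self):
            self._volumes = {}  # (tenant_id, volume_name) -> list of blocks

        def create_volume(self, tenant_id, name):
            self._volumes[(tenant_id, name)] = []

        def write(self, tenant_id, name, block):
            # Access is keyed by tenant identity; the wrong tenant_id
            # simply cannot address another tenant's volume.
            self._volumes[(tenant_id, name)].append(block)

        def list_volumes(self, tenant_id):
            # Visibility is limited to the caller's own slice.
            return [n for (t, n) in self._volumes if t == tenant_id]

    pool = SharedStoragePool()
    pool.create_volume("agency-a", "case-files")
    pool.create_volume("agency-b", "payroll")
    print(pool.list_volumes("agency-a"))  # ['case-files'] only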

 

Migrating to the cloud is a big step for many agencies, but it shouldn’t be a step into the unknown.

 

Agencies don’t want to feel like they’re heading into the Wild West when they journey to the cloud. They want cloud security solutions that make them feel at home.

 

If you want to stay involved, follow the conversation about cloud computing at MeriTalk’s Cloud Computing Exchange http://meritalk.com/ccx and the Cloud Computing Caucus Advisory Group http://cloudcomputingcaucus.org/

 

Lee Vorthman, Chief Technology Officer, Federal and Civilian Agencies, NetApp U.S. Public Sector


GovDataDownload recently presented a podcast with Travis Howerton, Chief Technology Officer at the National Nuclear Security Administration (NNSA).

 

At NNSA, Travis serves as the RightPath Program Manager, leading the Department of Energy (DOE/NNSA) IT modernization efforts. In this interview he discusses his recent Fed 100 recognition and how experience in the private sector helps him in his current role. Travis also explains the innovative cloud storage brokerage model at NNSA and the security considerations federal IT executives need to weigh when moving to the cloud.

 

Listen to the interview with him here.


Getting Rational

Posted by mwallace Mar 9, 2014

The benefits of Application Rationalization are well-known. Eliminating redundancy and removing outdated software saves money and storage space.

 

There are other benefits as well, and the current migration to the cloud that so many Federal agencies are engaged in serves as a reminder of those benefits and other emerging issues that require careful consideration.

 

My colleague recently wrote about how quickly Federal agencies are migrating to the cloud and how important it is that they consider the challenge of data portability.

 

Since hybrid clouds are likely to be deployed broadly throughout the Federal space as more and more agencies migrate to the cloud, it is even more important that they think about the future. Hybrid clouds make sense in many instances. Many applications can run safely and securely in an off-premises hybrid cloud. Other applications make sense to keep on-premises.

 

Application Rationalization is important in terms of helping organizations get a handle on their inventory of applications – streamlining and consolidating – and it must be part of the cloud strategy.

 

Agencies also must ensure data continues to flow freely, no matter where that information and software reside. The cloud environment and a potential glut of applications must never be an impediment to access or to the flow of data.

 

So, taking streamlining and consolidation a step further, it’s important to ensure data portability. Too often, when an agency sets up its architecture, it focuses only on the efficient delivery of information today. Making sure the architecture works on both sides of the cloud will allow data and software to flow between different clouds. That is crucial.

 

It’s also important to consider future needs and how the cloud can drive greater levels of service later.

 

As agencies begin to adopt hybrid clouds, they must think about how they will extract data in a highly virtualized environment. Cloud service providers are not the same, and having the ability to shift from hypervisor to hypervisor is necessary to move easily from cloud to cloud and overcome those differences.

 

The future is coming fast. Agencies must make sure they’re ready.

 

We’ll be talking more about Application Rationalization at the upcoming Data Center Exchange Brainstorm on March 13 at the Newseum in Washington, D.C. For more information, visit http://meritalk.com/dcx-brainstorm-2014-program.php

 

Mike Wallace, Director, Systems Integrators and Cloud, U.S. Public Sector, NetApp

Jeff Baxter, Consulting Systems Engineer for NetApp’s U.S. Public Sector, recently shared his thoughts on why data storage is an important issue for government agencies and talked with GovDataDownload about public, private, and hybrid cloud options. He said hybrid cloud continues to be an attractive option for agencies that want to compute in the public cloud but keep their data private. While the hybrid cloud model is relatively untapped in the public sector today, Baxter said it offers the most potential going forward for agencies seeking to strike a balance between scalability and security. The hybrid cloud model allows agencies to maintain control of their sensitive data while fully maximizing the cost-saving benefits of cloud computing.

 

He did caution against a rush to the cloud, however, because a rushed cloud deployment often creates an inefficient and fragmented IT infrastructure. Baxter went on to say that agencies should look for technologies that can help their IT organizations transform from builders and operators of infrastructure to providers of responsive IT services to the organization. To do that, they need to find an innovative solutions provider that can support flash-accelerated, cloud-integrated storage solutions for the broadest range of shared and dedicated infrastructure environments.

 

Baxter mentioned that newly available clustered enterprise storage platforms raise the bar for performance and value in shared infrastructures. He advised that agencies should look for a cloud vendor that can provide virtualization and management of storage platforms to enhance their ROI. Another important tip that Baxter shared is that agencies should make it a priority to work with vendors that offer device- and platform-agnostic solutions, because with the rapid acceleration of change in technology in today’s environment, choosing a proprietary solution can be costly and work against you in the long run.

 

Listen to the full interview with him here.


Moving Day

Posted by smary Feb 24, 2014

Moving data is daunting.

 

Like the family getting ready for the dreaded cross-country move with all their belongings in tow, agencies struggle with data portability and data storage. And it doesn’t get easier. Like families that acquire more stuff as they expand and kids grow, agencies continue to accumulate data.

 

But agencies have choices.

 

Cloud computing has advanced quickly. That means cloud options to run agency applications and store data are increasing. FedRAMP’s work is partly responsible for the growth in options. Its exhaustive efforts to thoroughly vet cloud service providers ensure that agencies have secure options.

 

Packing up and moving to a new cloud environment is fast becoming a viable option. Such a move can make fiscal sense or make data and applications more accessible. Storing data in multiple clouds – public, private or hybrid – also is an option that’s gaining momentum.

 

With these big changes on the horizon, agencies will face important decisions. Is it best to leave data where it is, or pack up and move to one of the many cloud provider environments? And what’s the best way to move large volumes of data?

 

Data portability is a challenge, but it’s one that agencies can overcome – provided they are careful not to take a wrong turn. Data is one of an agency’s most important assets, so agencies have to begin thinking today about how they can move and store it tomorrow.

 

A storage operating environment that can be abstracted and virtualized from the hardware below it helps make data more portable. Not only does that approach enable portability, it allows agencies to store data in multiple clouds, which supports data interoperability and sharing.
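As a rough illustration of why that abstraction matters, consider the Python sketch below. The interface and backends are hypothetical stand-ins, not any vendor’s API; the point is that once applications talk to an abstract storage layer rather than specific hardware, data can be migrated between backends without the applications changing:

    from abc import ABC, abstractmethod

    class StorageBackend(ABC):
        """Abstract storage layer: callers never see the hardware below."""
        @abstractmethod
        def put(self, key, data): ...
        @abstractmethod
        def get(self, key): ...
        @abstractmethod
        def keys(self): ...

    class InMemoryBackend(StorageBackend):
        """Stand-in for any concrete backend (disk array, cloud bucket)."""
        def __init__(self):
            self._data = {}
        def put(self, key, data):
            self._data[key] = data
        def get(self, key):
            return self._data[key]
        def keys(self):
            return list(self._data)

    def migrate(source, target):
        # Because both sides honor the same interface, moving data
        # between clouds is a generic copy, not a re-architecture.
        for key in source.keys():
            target.put(key, source.get(key))

    on_prem, public_cloud = InMemoryBackend(), InMemoryBackend()
    on_prem.put("records/2014.dat", b"agency data")
    migrate(on_prem, public_cloud)
    print(public_cloud.keys())  # ['records/2014.dat']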

 

Agencies will have more and more cloud options, and they will struggle with which cloud providers to use and how to move their data between them. But data portability shouldn’t prevent them from making the first move.

Moving day doesn’t have to be painful.

 

For more dialogue on Cloud Computing, check out the Cloud Computing Caucus Advisory Group: https://cloudcomputingcaucus.org/

 

Mary Jean Schmitt, Federal Business Development Manager, NetApp

As the number one storage provider to the U.S. federal government, NetApp bears a special responsibility to continually drive innovative storage solutions that can improve the performance and flexibility of U.S. Public Sector information systems.


Today, NetApp is proud to announce the next generation of unified scale-out storage systems, with new storage virtualization capabilities and the ability to run the latest release of clustered Data ONTAP, which helps public sector organizations maintain non-disruptive operations, store data efficiently, and operate across private and public cloud environments in the “unbound cloud” era.


The new FAS8000 series of enterprise scale-out storage systems provides improved performance (up to 2x), better flash acceleration with up to 3x more flash, and superior I/O flexibility. The FAS8000 also supports the new FlexArray Virtualization Software, which allows multi-vendor storage systems to be unified and simplified into one pool of software-defined storage.


FlexArray software allows any FAS8000 system to support both NetApp storage and storage from multiple other vendors. For example, a federal agency consolidating multiple data centers can now opportunistically re-use storage and bring it under central coordination. This allows agencies to stretch budgets further and avoid under-use of existing storage infrastructures, while still taking advantage of the latest NetApp storage technologies.


The FAS8000 is optimized to run the latest version of clustered Data ONTAP. Clustered Data ONTAP helps federal agencies scale out as needed to meet their mission and eliminates the disruption normally associated with planned downtime.


The National Ignition Facility at Lawrence Livermore National Laboratory is home to the world’s largest laser – over 100 times more energetic than any previous laser system. The facility has also unlocked the power of clustered Data ONTAP, eliminating up to 60 hours of planned downtime annually with non-disruptive operations. NetApp Flash Pool has enabled it to meet the IOPS and throughput requirements of analyzing scientific data while keeping the storage and power footprint manageable. Flash Pool has reduced peak latency by up to 97%, eliminating the need to pause experiments.


A complete overview of the National Ignition Facility success story is available here, and you can watch NIF CIO Tim Frazier discuss how the facility drives energy research and scientific discovery in the video here.


Clustered Data ONTAP also provides the ideal platform for federal or state and local clouds, whether hosted privately, at a cloud service provider, with hyperscale providers, or in a hybrid cloud traversing them all. By providing a “universal data platform,” clustered Data ONTAP on the FAS8000 makes it easier for agencies to move to an era of unbound clouds, where data moves seamlessly across scale-out storage platforms and into or out of private and public clouds.


The FAS8000 represents NetApp’s latest entry in two decades of continued innovation, delivering improved operations, efficiency, and scalability to our public sector customers. This improved ability to deploy a scale-out-enabled storage platform with flash acceleration and cloud integration will allow federal agencies, state and local government, and higher education institutions to respond quickly to changing mission needs, reduce overprovisioning of people, time, and money, and move seamlessly to the era of unbound clouds.


More information on the FAS8000, FlexArray, and clustered Data ONTAP can be found here.

 

Jeff Baxter, Manager, Consulting Systems Engineering, NetApp U.S. Public Sector

We live in a world where it is hard to separate traditional space from cyber space. They have truly become one as we increasingly rely on our extensive digital infrastructure to tie our lives together. But have you ever thought about the difference between how we protect our streets and cities versus how we protect their cyber counterparts?

 

Our townships, cities, counties, states, and even our nation have elaborate and interconnected layers to protect us from fires, fools, felons and floods.  Does any such thing exist for that cyber counterpart?  The answer is largely…NO!  Our leadership has recognized that this gap exists, but how do we protect the cyber part of our lives, and still adhere to the constitutional bedrock that makes our nation great?  How do we extend the protection, that I’m sure we all agree is fundamental, from traditional “3-space” to this new cyber domain?

 

I propose the model is one we built our nation around…a militia model.  We must work together to secure our own infrastructures and help our neighbors secure theirs.  It would be blatantly unconstitutional to force a standing post of police, fire, or military “agents” on our digital terrain, but we would likely invite their help when the cyber equivalent of fires, fools, felons, or floods threaten our expanded world.

 

This militia-oriented approach is the essence of our free yet collaborative society, and I think we can use its rich history as an effective model, applying what we learned in building, and then protecting, our great nation’s physical infrastructure to this new dimension of our nation.

 

What do you think?

 

Dave Denson, Big Data Solutions Architect, NetApp U.S. Public Sector

 


With 2014 upon us, it is the ideal time to assess key trends when it comes to data center consolidation, cloud computing and big data for government.

 

From data center consolidation efforts, to the rise of virtualization helping government support mission goals more cost-effectively, to the near ubiquity of big data, we are entering a new era of government IT.

 

These are some insights from a recent TechSource podcast with Eric Oberhofer, Director, Data Center Practice, Iron Bow, who discusses key trends in the areas of data center consolidation, big data and cloud computing.

 

Be sure to listen to the full interview here.

 

Also, check out Eric Oberhofer’s recent blog on Government Gurus – Moving to the Cloud? Take Baby Steps not Leaps – and a guest blog post on TechSource from Lee Vorthman, CTO, Federal Civilian Agencies, NetApp U.S. Public Sector – Four Top Federal IT Trends in 2014.

System downtime – planned and unplanned – is a fact of life in today’s IT-driven environment. But this downtime, and the associated risk of lost data and increased costs, can be devastating to an organization, especially when mission-critical operations rely on an infrastructure that is always on. How can organizations reconcile these two business realities?

 

NetApp commissioned Market Connections to learn the frequency of infrastructure refreshes and scheduled maintenance, the impact of unplanned system downtime, and organizations’ confidence in their infrastructure.

 

Software-defined storage (SDS) is doing for storage what server virtualization did for servers – enabling infrastructure to adapt at the pace of business. SDS allows for seamless expansion of capacity and performance, and it can rebalance critical workloads without downtime.
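A toy example may help show what “rebalance without downtime” means in practice. The sketch below is an illustrative assumption, not how any particular product places data: volumes are assigned by a software layer, so adding a node and re-running placement shifts load while the volumes themselves stay online:

    def rebalance(volumes, nodes):
        """Greedy placement: assign each volume (name -> size in GB)
        to the currently least-loaded node."""
        load = {node: 0 for node in nodes}
        placement = {}
        for name, size in sorted(volumes.items(), key=lambda v: -v[1]):
            target = min(load, key=load.get)
            placement[name] = target
            load[target] += size
        return placement

    vols = {"db1": 800, "scratch": 600, "home": 400, "logs": 200}
    print(rebalance(vols, ["node1", "node2"]))
    # Capacity expansion: add a node and rebalance; in an SDS layer
    # the resulting data moves happen transparently, volumes online.
    print(rebalance(vols, ["node1", "node2", "node3"]))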

 

Learn more about SDS in the full research report here.


See this post featured on GovDataDownload.

 

The role of government agencies is to protect and serve citizens. Doing so in the face of our present-day data explosion makes that job more challenging than ever. This data can come from many different sources (friend and foe), is often unstructured (or, more precisely, non-relational), and increasingly needs to be dealt with in near real-time. Furthermore, in this tough economy, you need to be properly equipped to accomplish your mission as cost-effectively as possible: tax revenue is the blood, sweat, and tears of citizens and should never be squandered.


Today, data is coming in such large quantities, at such speeds, and in such variety that old processes and tools no longer pass muster. Continuing to do the same thing is not an option. Big data analytics helps you meet these pressing challenges, often where traditional analytics tools fall short. Having tools to manage and gain insights from big data gives you the potential to be more responsive to the needs of your citizens, both by improving existing capabilities and by potentially enabling totally new ones.


The “rotary engine,” if you will, of big data consists of high-speed analytics, data bandwidth, and large content repositories. All three are necessary. Without access to large content repositories for trending information and context, without the bandwidth to deliver that data to the right place at the right time, or without the scale-out tools to process the information, near real-time analytics is impossible.

 

There is a wealth of tools available to you within this emerging NewSQL and NoSQL analytics market. This market is a growing ecosystem of start-ups and pre-IPO companies, as well as the usual suspects, like Microsoft, Teradata, Oracle, and SAP. As such, you need to choose your technology partners carefully.

 

These are truly “interesting times” (in the Chinese-curse sense). On the one hand, they are very exciting and dynamic, as many new start-ups are emerging with new ideas, ambitious plans for IPOs, and rapid growth. On the other hand, these times are also very unnerving: these new applications pose new potential security risks and educational challenges (different skillsets required), and, as in past times of technological transformation, you need to be sure the company you partner with will still be around in 5-10 years.

 

What happens if you make a big bet on one of these new technologies, only to see it go out of business? Do you have a fallback plan? Will your storage infrastructure help you or hurt you if you need to quickly roll back to an earlier state?

 

NetApp Public Sector’s focus is to provide a reliable, low-risk, enterprise-ready data infrastructure that allows workflows to be created with all of these analytics tools, old and new. As the #1 provider of storage to the Federal government, NetApp has proven itself in this area time and time again over the past 14 years, in some of the most demanding and hostile environments. We’re not going to abandon you now!

 

Whether it’s fraud prevention, cyber security, weather prediction, tactical operations, intelligence, surveillance, and reconnaissance operations, or space exploration, our best-of-breed data storage solutions continue to prove their worth.


The latest tool to help you meet these challenges is FlexPod Select, a collaborative effort of NetApp and Cisco Systems. FlexPod Select is a 4th-generation integrated data center infrastructure. At the heart of this solution is NetApp’s clustered Data ONTAP operating system, which is hypervisor-agnostic and supports the same non-disruptive operations, storage efficiency, and simple management you have depended upon in the past. Alongside it is our E-Series 5500 platform, with proven capacity density and industry-leading sequential I/O performance. Together, the two provide a robust data infrastructure that is truly software-defined, in the sense that its use is determined entirely at runtime, by how the resources are carved up by the orchestration, management, and workflow automation tools.


With FlexPod Select, NetApp and Cisco, along with their partner communities, have handled the engineering and integration for you, so you can focus on your mission and your application workflows. The difference between FlexPod Select and previous generations of FlexPod is that it is equipped to handle these challenging big data applications, including support for Hadoop, OpenStack and other Open Source applications.


Your mission is always changing. Our vision has not changed, even in the face of big data: it is to work with our partners to provide you always-on, best-of-breed data management solutions that help you manage your data from cradle to grave, scale with you as your mission grows, and at the same time drive operating costs out of the infrastructure. NetApp is unwavering in that commitment to you.

 

Dale Wickizer, CTO, NetApp U.S. Public Sector

 

 


Cloud has been top of mind for agencies for the past four to five years now as they work toward maximizing capacity utilization, improving IT flexibility and responsiveness, and minimizing cost. 

 

As agencies embrace cloud computing, the enthusiasm is checked by the need to address demands for maintaining control, availability, compliance, privacy and security around data and IT systems. There are multiple paths to the cloud, which leaves agencies asking: public or private?

 

But why not make the best of both worlds and leverage public, private, community, AND hybrid platforms?

 

Although computing requirements may vary, an agency’s storage needs are always growing. By using self-service, on-demand, elastic, pay-per-use services, IT managers can still keep their data under their complete control, relieving the stress associated with security, storage, and government requirements. In essence, agencies are able to use the cloud with private storage. In a hybrid IT environment, the data still resides within the organization’s firewalls; what is accessed via cloud services is the actual computing itself, as well as the applications with which to take on the computing chores.

 

Moving to the cloud means adopting multiple cloud resources that span private and public, but agencies don’t want a collection of discrete resources – they want the ability to manage the entire environment from a single purview. And that is what private storage with public compute enables them to do.

 

Researching different cloud platforms can provide your agency with the security it needs, while also achieving the power, flexibility, and cost savings the federal government requires.

So how do you determine which platform fits your agency best? It’s all about the workload.

 

The more things change, the more they stay the same. Workloads are key to cloud computing. Since there are many scenarios for cloud adoption, users need to take a flexible approach, considering the value and tradeoffs workload by workload.
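As a hypothetical sketch of that workload-by-workload reasoning (the attributes and rules here are illustrative assumptions, not an official methodology), consider:

    def recommend_platform(workload):
        # Sensitive data argues for private storage; elastic, bursty
        # demand argues for public compute. Both together is the
        # hybrid pattern described above.
        if workload["sensitive_data"]:
            return "hybrid" if workload["elastic_demand"] else "private"
        return "public" if workload["elastic_demand"] else "private"

    workloads = [
        {"name": "citizen portal", "sensitive_data": False, "elastic_demand": True},
        {"name": "tax records", "sensitive_data": True, "elastic_demand": False},
        {"name": "seasonal analytics", "sensitive_data": True, "elastic_demand": True},
    ]
    for w in workloads:
        print(w["name"], "->", recommend_platform(w))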

 

Broaden your horizons: learning about the cloud options available today can help agencies bring together public and private clouds, get the best of both worlds, and ultimately achieve IT success.

 

Let’s continue the cloud discussion. For more on this topic, join me at the MeriTalk Cloud Computing Brainstorm on Thursday, January 16, 2014, at the Newseum in Washington, D.C. We will talk more about leveraging public and private platforms in the cloud. Register today: www.meritalk.com/cloudbrainstorm.

 

Davis Johnson, Senior Leader, Service Providers & Cloud Partners, NetApp U.S. Public Sector

