Transforming the Engineering Data Center

Posted by kidd in Jay's Blog on Jun 10, 2008 8:21:25 AM

We announced several new products today at NetApp that I am especially excited about: a new midrange storage system, the FAS3100, and a pair of new caching technologies, the Storage Acceleration Appliance (an NFS cache appliance built on our FlexCache software) and the Performance Acceleration Module (which expands the memory cache inside the NetApp storage controllers).

 

Many of our customers use NetApp storage systems to hold the data for engineering, HPC, scientific, and other "technical" applications. This includes software development, seismic analysis, genomics, semiconductor design, computer animation, and many more. These are the applications that drive the revenue of these companies (the top line), as opposed to the applications that drive efficiency of operations (the bottom line). Anything that can be done to make these revenue-producing applications run faster has a meaningful impact on these companies' revenue growth.

 

Most of these applications need a lot of compute, but most are also constrained by the speed of their storage. Compute speeds have grown much faster than disk drive IOPS over the past several years, so anything that can be done at reasonable cost to speed up the delivery of data to the application is a good thing.

 

The announcements NetApp made today do just this. The Storage Acceleration Appliance is an easy-to-manage caching appliance that lets many more application servers access the same set of files by spreading copies of them across more storage controllers and more drives. These appliances can also be deployed around the world to deliver high performance to distributed workgroups; NetApp uses them in our own software engineering groups. Since they are caches, the data on them does not need to be managed: all backup and other management is done on the 'source' system that feeds the caches.
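
To make the caching idea concrete, here is a minimal, purely illustrative read-through cache sketched in Python. It is not the FlexCache implementation; the class name, the origin_read callable, and the capacity are my own assumptions. The point it shows is why the cached copies never need backup: anything the cache holds or evicts can always be re-fetched from the authoritative source system.

```python
from collections import OrderedDict

class ReadThroughCache:
    """Toy read-through cache: serves repeat reads locally and refetches
    anything it has evicted from the authoritative 'origin' on demand."""

    def __init__(self, origin_read, capacity=4):
        self.origin_read = origin_read   # callable: path -> bytes (the source system)
        self.capacity = capacity         # max number of cached files
        self.store = OrderedDict()       # path -> bytes, kept in LRU order

    def read(self, path):
        if path in self.store:
            self.store.move_to_end(path)      # cache hit: mark as most recently used
            return self.store[path]
        data = self.origin_read(path)         # cache miss: go back to the source system
        self.store[path] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)    # evict LRU entry; safe, the origin still has it
        return data

# Usage: several such caches, one per site, could front the same origin.
origin = {"/proj/chip/netlist.v": b"module top; endmodule"}
cache = ReadThroughCache(lambda p: origin[p])
print(cache.read("/proj/chip/netlist.v"))   # first read goes to the origin
print(cache.read("/proj/chip/netlist.v"))   # second read is served from the cache
```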

 

The Performance Acceleration Module expands the size of the DRAM cache inside the NetApp storage controllers in a smart way. Since DRAM is several orders of magnitude faster than spinning disk, an application request for data that is in cache is serviced with much lower latency, and more cache means more data blocks are served from memory. You can also choose to cache just metadata, or both data and metadata. In a test of an average NFS workload, it doubled IOPS at constant latency. Nice. Plus, it can be added to many of the existing NetApp systems installed in the field, and we have a software tool to verify in advance that a bigger cache will help. Even nicer.
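
Again, as a rough sketch only (the class, block-type labels, and policy names below are my own illustrative assumptions, not the module's actual interface), the choice between caching metadata only and caching both data and metadata might look something like this:

```python
from collections import OrderedDict

class ControllerCache:
    """Toy in-memory block cache with a policy switch: cache only metadata
    blocks, or cache both data and metadata blocks (illustrative only)."""

    def __init__(self, capacity_blocks, policy="data+metadata"):
        self.capacity = capacity_blocks
        self.policy = policy             # "metadata" or "data+metadata"
        self.blocks = OrderedDict()      # block_id -> payload, kept in LRU order
        self.hits = self.misses = 0

    def admit(self, block_id, payload, block_type):
        # Under the metadata-only policy, data blocks bypass the cache entirely.
        if self.policy == "metadata" and block_type != "metadata":
            return
        self.blocks[block_id] = payload
        self.blocks.move_to_end(block_id)
        while len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)   # evict the least recently used block

    def lookup(self, block_id):
        if block_id in self.blocks:
            self.hits += 1
            self.blocks.move_to_end(block_id)
            return self.blocks[block_id]      # served from memory: low latency
        self.misses += 1
        return None                           # caller must read the block from disk
```

In this toy model, raising capacity_blocks simply raises the hit rate, which is the whole idea behind expanding the controller's cache.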

 

I love working with the customers who run these types of applications. They are working on the frontier of knowledge and are building the future. They are also constantly pushing the envelope of what their storage and compute systems must do in the quest for faster product development, faster data analysis, or better science. By doing a better job for them, we do a better job for all of our customers.

 

Go further, faster.   
