
NetApp Insight 2019 - Keystone, Platform Updates, Cloud and More!

A summary of the announcements made at NetApp Insight 2019 around new platforms, cloud solutions, the new Keystone offering and more.

As NetApp Insight 2019 came to a close, I couldn't help but look back at all the announcements made during the week and be amazed at the sheer amount of innovation and R&D brought to bear this year. The Mandalay Bay was a familiar venue for returning attendees, and the event was very well attended this year.

The tone felt somewhat different from past years, though. An increasing number of the session topics catered to software developers and DevOps engineers, whereas in the past the topics were squarely aimed at storage and infrastructure engineers.

The highlight of the week was obviously the keynote session on Tuesday morning hosted by George Kurian, NetApp CEO and president. This year's keynote was condensed to a single 90-minute address where Kurian discussed NetApp's past commitments and associated achievements; the changing landscape of our industry; and the need for greater integration, speed of delivery and standardization. Sprinkled throughout the presentation, Kurian mentioned some of the key announcements this year, which we will review below. Lastly, Kurian brought four customers on stage to share their experiences: Eric Sedore, Syracuse University's CIO; Scott Freeman from Centura Health; Lalit Patil, CTO for HANA enterprise cloud at SAP; and the closer, Kate Swanborg, technology communications executive at DreamWorks. The external validation from those presenters delivered a powerful message to the audience.

If you didn't get the chance to attend NetApp Insight, I highly recommend you watch the recording of the keynote session.

What is Keystone?

Keystone was by far the single largest and most impactful announcement at NetApp Insight this year. Kurian talked a bit about Keystone during the keynote session, but did not have enough time on stage to do it justice. Keystone is a reflection of how NetApp has been transforming their business to broaden their customer base and address new markets. The intent is to deliver a better overall experience to their customers by eliminating a lot of the key reasons why customers end up purchasing a competitive solution. In my opinion, Keystone can be broken into three tenets: 

  • Flexibility
  • Customer confidence
  • Simplicity

The increased flexibility brought through Keystone is by far the most important aspect of the program and will manifest itself in a few different ways, the first of which is flexible credits. Customers purchasing new A-Series systems from NetApp who maintain SupportEdge Premium or Expert on their system for a period of six years will be entitled at month 39 to a credit (dollar amount based on the platform) to replace their controllers; an obvious and effective jab at Pure Storage's Evergreen program.

The most interesting aspect of the flexibility Keystone will provide is around consumption models. NetApp has been very creative with how they will offer consumption of their data management platform going forward. They are effectively enabling their customers to pay for the consumed equipment in two ways: CapEx or OpEx. 

CapEx purchases are simply what customers have been doing for decades, meaning an upfront or financed capital purchase of equipment that the customer installs and manages in their own data center.

The OpEx model is where the creativity really comes into play. You can already consume NetApp capacity in all three major cloud providers through Cloud Volumes Service (CVS) and Azure NetApp Files (ANF), which provide an excellent CIFS/NFS and multi-protocol storage platform for customers developing or migrating applications to the cloud. Keystone builds on that offering by creating an on-premises version of the service: customers consume storage services from an appliance that NetApp installs and operates in exactly the same way CVS/ANF is managed in the cloud. Furthermore, there is an identical offering where customers manage their own storage but pay based on the space consumed.

Keystone consumption models

NetApp is making the customer confidence aspect of Keystone very appealing for non-NetApp customers, removing the initial leap of faith required to make the move to a new platform. They have built on the efficiency guarantee that was introduced many years ago to provide customers with guarantees around performance, availability, satisfaction and, of course, efficiency. NetApp has simplified the process for customers to include those guarantees in their transactions and has streamlined the associated terms and conditions. This is meant to eliminate any uncertainty in customers' minds and build confidence in their decision to invest in an infrastructure built on NetApp products and services.

The simplicity aspect of Keystone will not be as apparent to customers, as much of it is back-end optimization to provide a better experience to the end customer. The most visible changes will be simplified quotes (for the newly announced systems), quicker quoting turnaround and better support options for experienced customers. The enhanced support experience comes via a new support level called "Expert" that lets software-related cases skip level-one support entirely, a welcome change for long-standing customers.

Keystone really shows how NetApp has evolved over the last two years. They have taken the level of innovation they have long been renowned for in product engineering and applied it to other aspects of their business. In my opinion, this effort was devised based on years of customer feedback about the lack of flexibility in NetApp's original NetApp On Demand (NOD) program and the ever-increasing demand we are seeing for managed services on premises. I do believe this will give NetApp a tool no one else in the market has today and will once again leave other OEMs playing catch-up.

What's new with NetApp platforms? 


ONTAP

New mid-range (A400/FAS8300) and high-end (FAS8700) platforms running ONTAP provide a solid increase in specifications and associated performance. The addition of those systems makes for a crowded portfolio that will probably see some rationalization in the near future.

All-flash object storage 

This is one of the more interesting platform announcements this year. NetApp has been doing very well with their object storage platform over the past few years. Adoption has been growing, and StorageGRID's features and performance make it one of the best (if not THE best) object platforms on the market right now. The introduction of an all-flash version of the StorageGRID appliance reflects the changing face of workloads leveraging object storage. In my opinion, the timing of this announcement could hardly be better.

We are getting an increasing number of questions about high-performance object storage during our object workshops, often from customers asking about using high-performance object stores to hold images for AI algorithms and the like. Flash-backed object storage should provide a much lower TTFB (time to first byte) as well as better metadata access times. Given the massive increase in object storage use cases, customer POCs and purchases, I believe this to be extremely timely for NetApp.
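Since StorageGRID exposes an S3-compatible API, TTFB is something you can measure for yourself with any standard S3 client. Below is a minimal Python/boto3 sketch; the endpoint URL, bucket, object key and credentials are placeholders I've invented for the example and would need to be swapped for your own grid's values.

```python
import time
import boto3

# Hypothetical StorageGRID gateway endpoint and credentials -- replace with your own.
s3 = boto3.client(
    "s3",
    endpoint_url="https://storagegrid.example.com:10443",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

def time_to_first_byte(bucket: str, key: str) -> float:
    """Elapsed seconds between issuing a GET and reading the first byte of the object."""
    start = time.perf_counter()
    response = s3.get_object(Bucket=bucket, Key=key)
    response["Body"].read(1)  # pull only the first byte off the wire
    return time.perf_counter() - start

print(f"TTFB: {time_to_first_byte('training-images', 'sample.jpg') * 1000:.1f} ms")
```

Running this in a loop against the same object on flash-backed versus spinning-disk nodes is a quick way to see the latency difference for yourself.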

All-SAN Array

This announcement was probably one of the most welcomed of the week. NetApp has decided to make a SAN-only version of their AFF platform and introduce symmetric active/active pathing for continuous availability. NetApp has been improving cluster failover for years, and in the latest releases of ONTAP we have experienced very reliable failover times of under 10 seconds. Even though NetApp's ONTAP SAN business has been growing steadily and quickly for the last two years, they have struggled with customers' tier-1 SAN workloads because of that failover time. I believe this announcement will make it possible for NetApp to compete for some of the most demanding workloads that have long been out of reach for them and to accelerate their momentum with SAN workloads.

QLC adoption in 2020

The introduction of QLC (quad-level cell) media isn't really anything to write home about. It will not enable new features or higher performance (QLC drives are actually slower than the TLC drives in use today), but it will let NetApp reduce the overall cost of manufacturing their platforms. A few other OEMs are already using QLC; other, more conservative OEMs will take longer but will get there eventually. QLC memory does have a shorter lifespan than TLC (triple-level cell), but the industry has shown that TLC endures well beyond the typical lifespan of the devices in any data center, so QLC should still comfortably outlast the hardware it ships in. This should also enable even larger drive sizes over time.

MaxDATA for NetApp HCI

If you are not familiar with MaxDATA, I don't blame you. NetApp hasn't been very vocal about it, given that it arrived ahead of the wide availability of DCPM (Data Center Persistent Memory). MaxDATA is software installed on a host (bare metal/Linux or VMware) that leverages the installed DCPM to create a filesystem spanning the host's DCPM and an external block device. MaxDATA localizes highly used data on the host's DCPM to reduce reliance on the shared external storage for IOPS and performance. This means systems running MaxDATA can deliver many times more IOPS than any array ever could, with response times measured in tens of microseconds. In my opinion, by bringing MaxDATA to their HCI systems, NetApp is providing another tangible differentiator for their HCI offering.
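To make the idea more concrete, here is a deliberately simplified Python sketch of the general tiering concept: frequently accessed blocks are served from a small, fast local tier, while cold blocks fall back to shared external storage. This is not MaxDATA's actual implementation; the block model, capacity and LRU eviction policy are assumptions made purely for illustration.

```python
from collections import OrderedDict

class TieredReader:
    """Toy two-tier read path: fast local tier in front of shared external storage."""

    def __init__(self, backend_read, local_capacity_blocks=1024):
        self.backend_read = backend_read       # callable: block_id -> bytes (shared array)
        self.capacity = local_capacity_blocks  # stands in for the host's persistent memory tier
        self.local_tier = OrderedDict()        # block_id -> bytes, kept in LRU order

    def read(self, block_id):
        if block_id in self.local_tier:
            self.local_tier.move_to_end(block_id)   # hot block: served locally
            return self.local_tier[block_id]
        data = self.backend_read(block_id)          # cold block: fetched from external storage
        self.local_tier[block_id] = data
        if len(self.local_tier) > self.capacity:
            self.local_tier.popitem(last=False)     # evict the least recently used block
        return data
```

The real product obviously handles persistence, write paths and data integrity in ways this toy ignores, but the sketch captures why reads of hot data can stay off the shared array entirely.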

HCI and SolidFire external key management & FIPS

NetApp HCI and SolidFire platforms have historically relied on an on-board key management system for encryption. This year's announcement will enable those two platforms to leverage external key managers and achieve FIPS certification.

NetApp's cloud updates

Cloud Secure for Cloud Insights

Cloud Secure is a new component of Cloud Insights that leverages machine learning to identify insider threats and potential external breaches, and to help with cloud compliance and industry regulations such as PCI, HIPAA, GDPR, CCPA and more.

Azure NetApp Files FedRAMP

As the title suggests, this announcement concerns Microsoft's ability to deliver the first-party Azure NetApp Files service on the federal government version of Microsoft Azure.

FlexPod and HCI solutions

NetApp announced in October a slew of solutions based on FlexPod and NetApp HCI. Those solutions are meant to address specific, widely applicable use cases such as:

  • ONTAP AI for automotive and healthcare
  • FlexPod with SAP HANA
  • FlexPod with NKS
  • Citrix VDI for NetApp HCI
  • End-to-end DevOps CI/CD on NetApp HCI