How WWT’s ATC provides value for customers, partners and WWT employees
Those of you who read WWT's blogs have heard of our Advanced Technology Center (ATC). It's an amazingly effective resource for our engineers, our partners and, most importantly, our customers.
We spend a significant amount of time explaining the value of the ATC and the many ways to make use of it. Here’s an abbreviated list:
- Proofs of concept
- Proofs of functionality
- Upgrade testing
- Lab as a Service
- Integration testing
- Resiliency testing
While those do a good job of explaining what we can do with the ATC, they don’t provide a lot of context about how this shakes out in the real world. So to help explain, here’s a recent real-world example of how a WWT customer leveraged the ATC to decrease risk and significantly accelerate time-to-solution.
The call for help
It’s after hours, my wife and kids are at soccer practice and I’m spending a quiet night at home with a glass of wine and a tax return. I feel some answers to the obvious questions are in order:
- Yes, I still do my own taxes. No, I don’t have any good reasons why.
- Yes, I find wine and taxes go well together. Best not to review the yearly take completely sober.
- It's a Sonoma Valley-based Cabernet I got from one of those wine curation and delivery services. Just OK, lesson learned.
Anyway, Cisco Webex Teams (a great messaging platform) lights up my phone with a notification. It's from one of WWT's regional solution architects. He and his account team are in the final phases of a solution sale and have just received the dreaded, "Hey, we got a last-minute competitive quote for a less expensive alternative" call from the customer. In this case, it's an unsolicited bid from a relatively unknown vendor with which the local sales team has little experience. Here is the actual (redacted) Cisco Webex Teams conversation…
I follow up with a call to the solution architect and explain to him the strengths and weaknesses of the competitive solution. I also cover the results of our RAS (reliability, availability, serviceability) and workload performance testing that was recently performed in the ATC. We do this kind of testing on all products before we make any customer recommendations. For this particular product, our testing showed that it passes our RAS tests very well, but for some specific workloads (which happen to be the ones this customer plans to run) it has some performance issues.
The solution architect agrees this is important information for the customer to know and understand before making a purchasing decision and asks if I can get on a video call with the customer right away. My 8:20pm decision: keep doing taxes, or get on a call to discuss technology and performance results with a customer.
Like there was any real decision to make – I’m getting on the call with the customer.
Springing into action
Within five minutes we’re all on a video call, spanning several time zones and climates. (Modern collaboration technology is so cool. Iowa weather in March – not so much.)
I start off going over my background – the customer is unimpressed.
I continue by covering what we do with the ATC – the customer is still unimpressed.
I push on by explaining the product’s architecture and performance limitations – the customer is interested but still not impressed.
I then say, “Hey, would you like to see how it performs under a couple workload scenarios?” – the customer is now very interested.
We have a 90-second chat about the specific storage performance test profile and the system configuration being tested (all-flash in this case), and then we have a look at the output.
First, I show that, by design, we simply vary the read/write ratio while maintaining an 8KiB I/O size. It's intended to be a very simple test.
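The post doesn't name the ATC's test harness, but a profile like the one described is easy to express in a tool such as fio. Here's a hypothetical job-file sketch – the device path, queue depth and runtime are all assumed values, not the ATC's actual configuration:

```ini
; Hypothetical fio job approximating the profile described above:
; fixed 8KiB I/Os, offered load capped at 10,000 IOPS, and the
; read/write ratio varied between runs (70/30 shown here).
[global]
ioengine=libaio
direct=1            ; bypass the page cache
bs=8k               ; 8KiB I/O size
iodepth=32          ; outstanding I/Os (assumed value)
rate_iops=10000     ; cap the offered load at 10k IOPS
time_based
runtime=300         ; five-minute steady-state run (assumed)

[mixed-rw]
rw=randrw           ; mixed random reads and writes
rwmixread=70        ; 70% reads / 30% writes for this run
filename=/dev/sdX   ; target device (placeholder)
```

Holding the I/O size and the target IOPS constant while sweeping only the read/write mix keeps runs comparable across systems – exactly the "very simple test" intent described above.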
Next, I show changes in CPU and queue depth. Hmmm… things appear to be queuing.
Next up – IOPS. The test profile was only trying to drive up to 10,000 IOPS – a trivial target for the number of SSDs in this system. But it can't maintain even that low level of IOPS. Something's borked.
And finally – how does this all affect latency, where the proverbial rubber meets the road? As it turns out… not in a good way.
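The link between the queuing and the latency above follows from Little's Law: under sustained load, outstanding I/Os ≈ IOPS × mean latency. A minimal back-of-the-envelope sketch, using illustrative numbers rather than the ATC's actual measurements:

```python
# Little's Law for a storage system under sustained load:
#   queue depth ~= IOPS x mean latency
# Rearranged, it gives the latency implied by what we observed:
# queue depth climbing while IOPS sag below the 10,000 target.

def mean_latency_ms(queue_depth: float, iops: float) -> float:
    """Mean I/O latency (ms) implied by a sustained queue depth and IOPS."""
    return queue_depth / iops * 1000.0  # seconds -> milliseconds

# Healthy all-flash behavior: modest queue, full 10,000 IOPS delivered.
print(f"healthy:    {mean_latency_ms(32, 10_000):.1f} ms")   # ~3.2 ms

# The symptom above: I/Os pile up while throughput falls short of target.
print(f"struggling: {mean_latency_ms(128, 6_000):.1f} ms")   # ~21.3 ms
```

At ~21 ms of average latency, a shelf of SSDs is behaving like spinning disk – which is why a growing queue at falling IOPS is such a bad sign.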
The summary: at best, the architecture of the vendor's system makes a bunch of SSDs perform about as poorly as 10K SAS drives. At worst, it's unusable for any kind of transactional application, which happened to be exactly what this customer was running.
The customer is now very grateful, interested and fairly impressed.
The ATC allows our sales teams to know that a given technology will be successful. It has already been tested and validated. We don't recommend technologies that don't make the cut in the ATC. This leads to better solutions, better long-term customer relationships and much higher levels of confidence and trust.
While one could interpret this story as a loss for the vendor in question, it’s actually quite the opposite. Two very positive things came out of this for them:
One, they didn't sell a product into an environment where it wouldn't be successful. The last thing vendors want is a large customer support and satisfaction problem, especially one that's expensive to remediate. Even worse, if the issue gets published online or shared within the technology community, they can end up with a vendor or product perception problem. In reality, the vendor and product in question are actually quite good when applied to the right use cases. This simply wasn't their sweet spot. No reason to risk an entire reputation because of one misplaced solution. The ATC can help sort this out before it's too late.
Two, we went back to the vendor and explained the issue, which led to calls with project management and eventually engineering. They're working on software fixes to improve performance, which will again be tested in the ATC. In the end, everyone ends up with a better overall product. I should note that this is a relationship we have with many of our partners. Our best partners regularly leverage our ATC facilities and engineers (a small army of them) for early hardware and software testing.
Clearly the customer dodged what could have been a very risky purchase by leveraging WWT and the ATC, but it's important to understand that this isn't a failing of this particular customer, nor is it an isolated incident. The alternatives aren't very appealing:
The customer could get their own test equipment, hire their own test engineers, work through the hardware and software evaluation agreements, wait for equipment to arrive, get training on how to properly use the product in question, work with support on implementation issues, run consistent test scenarios (this is *so much* harder than it sounds), document their results, tear it all down, re-package everything and ship it all back. Oh – and this process will take 3-6 months to do properly from start to finish for each product in question. Then they can make a well-educated purchasing decision.
The customer could skip all of this and rely on what they’re told by the vendors, by the analysts or by other solution providers who don’t have a testing and validation resource as powerful as the ATC. Maybe the purchase works, maybe it doesn’t, but either way it’s a lot of personal and organizational risk to assume.
Or… simply pick up the phone, call WWT and leverage the ATC for what it does so very well: reduce risks and accelerate decisions.