Advanced solutions in one place

Those of you who read WWT's blogs have heard of our Advanced Technology Center (ATC). It's an amazingly effective resource for our engineers, our partners and, most importantly, our customers.

One row of many, in one of multiple data centers, in one of several buildings

We spend a significant amount of time explaining the value of the ATC and the many ways to make use of it. Here's an abbreviated list of its key capabilities:

  • Demos
  • Proofs of concept
  • Proofs of functionality
  • Upgrade testing
  • Lab as a Service (LaaS)
  • Integration testing
  • Resiliency testing
  • Product comparisons

While this list does a good job of explaining what customers can do with the ATC at a high level, it doesn't provide much context about what these capabilities mean in terms of real-world outcomes. So to highlight the ATC's true value, let's explore a recent example of how one of our customers leveraged it to decrease risk and significantly accelerate their time-to-solution.

A customer's call for help

Online wine services: your mileage may vary

It's after hours; my wife and kids are at soccer practice, and I'm spending a quiet night at home with a glass of wine and a tax return.

I feel some answers to your obvious questions are in order:

  • Yes, I still do my own taxes. No, I don't have a good reason why.
  • Yes, I find wine and taxes go well together. Best not to review the yearly take completely sober.
  • It's a Sonoma Valley-based Cabernet I got from one of those wine curation and delivery services. Just OK, lesson learned.

Anyway, Cisco Webex Teams (a great messaging platform) lights up my phone with a notification. It's from one of WWT's regional solution architects. He and his account team are in the final phases of a solution sale and have just received the dreaded "Hey, we got a last-minute competitive quote for a less expensive alternative" call from the customer. In this case, it's an unsolicited bid from a relatively unknown vendor with whom the local sales team has little experience.

Here's the actual (redacted) Teams conversation:

Webex Teams chat thread, redacted for obvious reasons

I give the solution architect a follow-up call and explain the strengths and weaknesses of the competitive solution. I also cover the results of our RAS (reliability, availability, serviceability) testing and workload performance testing, both recently performed in the ATC. We do this kind of testing on all products before we make any customer recommendations. For this particular product, our testing showed the competitive solution passed our RAS tests very well, but for some specific workloads (which happened to be the ones this customer planned to run) it had performance issues.

The solution architect agrees this is important information for the customer to understand before making a purchasing decision, and he asks if I can get on a video call with the customer right away. At 8:20 p.m., a decision looms: keep doing taxes or get on a call to discuss technology and performance results with a customer?

Like there's any real decision to make — I'm getting on the call with the customer.

Springing into action

Within five minutes we're all on a video call, spanning several time zones and climates. (Modern collaboration technology is so cool. Iowa weather in March, not so much.)

I start off going over my background. The customer is unimpressed.

I continue by covering what we do with the ATC. Customer is still unimpressed.

I push on, explaining the product's architecture and performance limitations. The customer is interested but still not impressed.

I then say, "Hey, would you like to see how it performs under a couple workload scenarios?" The customer is now very interested.

We have a 90-second chat about the specific storage performance test profile and the system configuration being tested (all-flash in this case), and then we have a look at the output.

First, I show that, by design, we simply change the read/write ratio while maintaining a fixed 8KiB I/O size. It's intended to be a very simple test.

Transfer Size
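
For the curious, here's a minimal sketch of that kind of profile, assuming fio as the load generator. Only the 8KiB transfer size and the swept read/write ratio come from the test described above; the device path, queue depth and run length below are placeholders, not the actual test parameters.

```python
import subprocess

DEVICE = "/dev/nvme0n1"  # placeholder device under test

for read_pct in (100, 70, 50, 30, 0):  # sweep the read/write mix, one run per ratio
    subprocess.run([
        "fio",
        f"--name=rw{read_pct}",
        "--rw=randrw",               # random mixed read/write workload
        f"--rwmixread={read_pct}",   # percentage of I/Os that are reads
        "--bs=8k",                   # fixed 8KiB transfer size, per the test design
        "--iodepth=16",              # placeholder queue depth
        "--ioengine=libaio",
        "--direct=1",                # bypass the page cache
        "--runtime=300",
        "--time_based",
        f"--filename={DEVICE}",
    ], check=True)
```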

Next, I show changes in CPU and queue depth. Hmmm… things appear to be queuing.

CPU and Queue Depth
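
Why does a climbing queue depth matter? Little's Law ties the numbers together: outstanding I/Os ≈ IOPS × average latency. So if the queue keeps growing while IOPS holds flat (or drops), latency has to be rising. A quick sketch with illustrative numbers, not the actual test measurements:

```python
def implied_latency_ms(iops: float, queue_depth: float) -> float:
    """Little's Law: average latency implied by sustained IOPS and queue depth."""
    return queue_depth / iops * 1000.0

# A healthy all-flash system: 10,000 IOPS at queue depth 4 -> 0.4 ms average.
print(implied_latency_ms(10_000, 4))   # 0.4

# Same IOPS target with requests stacking up at queue depth 64 -> 6.4 ms average.
print(implied_latency_ms(10_000, 64))  # 6.4
```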

Next up: IOPS. The test profile was only trying to drive up to 10,000 IOPS — nothing for the number of SSDs in this system. But it can't maintain even that low level of IOPS. Something's borked.

IOPS
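
To put "nothing" in perspective, here's a back-of-the-envelope calculation. The per-drive figure and drive count are conservative assumptions for illustration, not the tested system's actual specs:

```python
PER_SSD_IOPS = 20_000  # conservative random 8KiB IOPS for a single SSD (assumption)
DRIVE_COUNT = 12       # placeholder drive count, not the tested system's
TARGET_IOPS = 10_000   # what the test profile tried to drive

raw_capability = PER_SSD_IOPS * DRIVE_COUNT
print(f"raw media capability: {raw_capability:,} IOPS")                         # 240,000 IOPS
print(f"target as a share of capability: {TARGET_IOPS / raw_capability:.1%}")   # 4.2%
```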

And finally, how does this all affect latency, where the proverbial rubber meets the road? As it turns out… not in a good way.

Latency and lots of it

At best, the architecture of the vendor's system makes a bunch of SSDs perform about as poorly as 10K SAS drives. At worst, it's unusable for any kind of transactional application, which happened to be what this customer was running.
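
For a rough sense of what "10K SAS territory" means in numbers, here's a quick calculation. The rotational math follows from the spindle speed; the seek time is a typical figure for 10K SAS drives, not a measurement from this test:

```python
RPM = 10_000
avg_rotational_ms = 0.5 * (60_000 / RPM)  # half a revolution on average: 3.0 ms
avg_seek_ms = 4.0                         # typical 10K SAS average seek (assumption)

print(f"10K SAS mechanical latency: ~{avg_rotational_ms + avg_seek_ms:.1f} ms")
# -> ~7.0 ms, versus well under 1 ms for a healthy all-flash system
```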

The customer is now very grateful, interested and fairly impressed.

Providing value

For WWT

The ATC allows our sales teams to know that a given technology will be successful, because it has already been tested and validated. We don't recommend technologies that don't make the cut in the ATC. This leads to better solutions, better long-term customer relationships and much higher levels of confidence and trust.

For partners

While one could interpret this story as a loss for the vendor in question, it's actually quite the opposite. Two very positive things came out of this for them:

  1. They didn't sell a product into an environment where it would not have been successful. The last thing vendors want is large customer support and satisfaction problems, especially if they end up being expensive to remediate. Even worse, if the issue gets published online and/or shared within the technology community, the vendor can end up with a lasting perception problem. In reality, the vendor and product in question are quite good when applied to the right use cases. This simply wasn't their sweet spot. There's no reason to risk a whole reputation on one misplaced solution. The ATC can help sort this out before it's too late.
  2. We went back to the vendor and explained the issue to them, which led to calls with project management and eventually engineering. They're working on software fixes to improve performance, which we will again test in the ATC. In the end, everyone will end up with a better overall product. I should note that this is the type of relationship WWT has with many of our OEM partners. Our best partners regularly leverage our ATC facilities and engineers (a small army of them, roughly 3,000) for early hardware and software testing.

For customers

Clearly the customer dodged what could have been a very risky purchase by leveraging the ATC, but it's important to understand that this wasn't the customer's fault, nor is it an isolated incident among our customers in general.

The alternatives aren't very appealing:

First, the customer could purchase their own testing equipment, hire their own test engineers, work through the hardware and software evaluation agreements, wait for equipment to arrive, get training on how to properly use the product in question, work with support on implementation issues, run consistent test scenarios (this is *so much* harder than it sounds), document their results, tear it all down, re-package everything and ship it all back. Oh, and this process will take three to six months to do properly from start to finish for each product in question. Then the customer can make a well-educated purchasing decision.

Alternatively, the customer could simply rely on what they're told by the vendors, by the analysts or by other solution providers who don't have a testing and validation resource as powerful as the ATC. Maybe the purchase works, maybe it doesn't. Either way, it's a lot of organizational risk to assume.

Is running a test lab one of your company's core competencies?

We recommend simply picking up the phone and calling WWT to leverage the ATC for what it does so very well: reduce risk and accelerate your technology decisions.