by Mark Haranas

World Wide Technology is a $20 billion IT powerhouse helping customers create and implement artificial intelligence solutions across the globe.

"I believe AI is going to be the most transformative technology that's impacted mankind in our history," said WWT CEO Jim Kavanaugh, who's led the 10,500-employee strong company for over 30 years.

The AI customer and sales opportunities for WWT and its partners to transform customers in 2024 are a game changer, Kavanaugh tells CRN. "Everybody's talking about AI, but everybody is also trying to figure it out," he said. "It's happening so fast and so many new technologies and new capabilities are launching that customers are overwhelmed."

"I see a ton of opportunity for World Wide that we're engaged with customers today in helping them try to understand where they want to go relative to how they want to build their AI infrastructure. We help them as we have built out purpose-built AI platforms in our labs running on Nvidia, AMD, Intel," he said. "We're looking at, 'What is best for that customer? What kind of back-end storage capability is best for them? What type of networking capability should be built-in? … What makes the most sense for that client?"

Kavanaugh said many customers are specifically looking for generative AI (GenAI) solutions but don't know exactly how to start.


WWT's $500 Million AI Bet

To meet this urgent customer demand and sales opportunity head-on, WWT last month unveiled plans to invest more than $500 million over the next three years in building out its AI services as well as a new AI lab.

The $500 million investment will boost AI training and WWT's Advanced Technology Center, which is designed to help customers rapidly build and implement new technologies such as GenAI solutions. WWT will also build a new AI lab to serve as a proving ground that helps companies understand the technologies and network infrastructure needed to truly leverage AI in their operations and tailor it to specific business needs.

In an interview with CRN, Kavanaugh discusses the "huge" customer opportunities in AI this year, Microsoft's AI vision, and whether AI-based solutions will run mostly in the cloud or on-premises.

"Microsoft and [CEO] Satya Nadella have done a pretty amazing job getting out in front of the world of AI," said Kavanaugh.


On the AI cloud front, what do you think of Google and Microsoft's strategy?

Microsoft and [CEO] Satya Nadella have done a pretty amazing job getting out in front of the world of AI.

Look at what they have done with their partnership with OpenAI and ChatGPT. Look at their ability to monetize some of that OpenAI investment, and the massive investments they've made in their own software and Copilot. In terms of proliferation and monetization, they're doing a great job.

When I look at Google, they have so many capabilities and their own large language models. But Google is primarily focused on the business-to-consumer side, and not necessarily as focused or as deep and broad in expertise as Microsoft is on the business-to-business side.

How these platforms play out is going to be very, very interesting to watch.

You've got some obviously very significant players with deep pockets that see this as a long-term game and long-term investment. It's going to be a very iterative opportunity. It's not all fully baked yet. Everybody is learning on the fly.


Are you seeing demand globally for AI?

It's definitely global. It's very much a global opportunity.

This is not something that is just a domestic opportunity. Every customer that we deal with is talking about it, whether they're in the U.S. or in Europe, Asia or the Middle East.


Do you think AI will be mostly cloud-based or on-premises?

Personally, I believe that it's going to be a hybrid structure.

For smaller and midsize organizations, it will most likely be done in the cloud.

Some of the larger enterprises are looking at more of a hybrid approach of some purpose-built AI platforms internally, and then some that may be connected to the cloud.

Then you get into helping clients try to figure out, 'What large language models should they be using? Or what smaller language models should they be building, modifying and training that are more specific to their datasets and to their specific environment?'

From a World Wide standpoint, this is part of the opportunity that we have: to help educate customers in regards to, 'What's real? What's here today? How fast is it changing? What makes most sense, both on the AI infrastructure side and the business outcome side for them?'

