The Challenges of Edge Computing Deployments and How to Address Them
Computing is all around us, in so-called edge computing deployments that are hiding in plain sight. What isn't all around us is the support required to protect and manage all these environments. Consider a large restaurant chain that, like many of its competitors, now has in-store kiosks for customers to enter orders along with an app to enable remote ordering. Supporting those kiosks and apps requires a server in each location, which presents several issues.
First, where does one install that server? Possibly in the manager's office? That's better than sitting next to the fryolator, but managers understandably aren't happy about having a piece of IT equipment nearby while they're trying to get work done. IT equipment is often noisy and takes up space, which is already in short supply.
What's more, an office in a restaurant with staff coming and going all day isn't a proper environment for a sensitive piece of IT equipment like a server, nor for the accompanying uninterruptible power supply (UPS) required to keep the equipment running when the power goes out. Restaurant cooks, waitstaff and managers generally don't know how to maintain the equipment, check the status of the UPS or troubleshoot other IT issues, nor do they have the time to.
This is the reality today for many distributed companies as they're required to provide edge computing deployments. Common examples include:
- Fortune 500 firms with Internet of Things (IoT) initiatives that involve devices and sensors generating vast amounts of data.
- State and local government and educational institutions implementing smart city technology, wired classrooms and remote learning initiatives.
- Cellular service providers rolling out 5G networks that are many times faster than previous generations and able to transport far more data.
- Retailers with smart technology in stores such as kiosks and smart mirrors.
All of these applications require localized processing power, driving the need for edge computing deployments. IoT, ordering kiosks, smart city applications and more require fast, low-latency connections to compute resources. These resources must be nearby, driving the explosion of edge data centers.
The edge data center market is forecast to grow by $5.9 billion from 2020 to 2024, at a compound annual growth rate of almost 14 percent. Estimates for the growth in the number of IoT-connected devices are equally aggressive, with prognosticators putting the number at 21.5 billion by 2025 and 125 billion by 2030.
As the restaurant example shows, delivering effective edge solutions to hundreds or thousands of locations comes with a number of challenges.
Space is a significant one in many instances. Restaurants, retail stores, K-12 schools and other locations often do not have dedicated space for IT equipment. So IT gear ends up in areas that aren't suitable for it: closets, offices, sheds or even open areas like hallways or corners.
Integration of all the components is another issue. Edge deployments require not just a server, but storage and networking equipment. Hyperconverged infrastructure (HCI), which combines server, storage and networking into a single solution, helps with the physical challenges, but someone still has to ensure all the components work together as intended, which requires experienced IT staff.
Given that the edge deployment may be processing and storing sensitive corporate data, including financial and personal data, security is as much a concern as it is in a centralized data center. That means preventing unauthorized physical access to the equipment as well as protecting it from cyber attacks. An organization also needs a way to remotely monitor the equipment so issues can be identified, alerts sent and remediations made.
All IT equipment requires appropriate supporting infrastructure, including proper power and cooling. This often includes a UPS to keep equipment running in case of power disruptions, and possibly additional cooling. Racks and enclosures are also needed to keep equipment safe from environmental and security risks.
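To see why UPS sizing matters at the edge, a rough back-of-the-envelope runtime estimate can be sketched in a few lines. The figures below (battery capacity, load, inverter efficiency) are hypothetical, and real UPS runtime curves are nonlinear, so treat this only as a first approximation:

```python
def estimated_runtime_minutes(battery_wh: float, load_w: float,
                              efficiency: float = 0.9) -> float:
    """Rough UPS runtime estimate: usable battery energy divided by load.

    battery_wh -- rated battery capacity in watt-hours (hypothetical figure)
    load_w     -- steady-state IT load in watts
    efficiency -- inverter/conversion efficiency (assumed ~90%)
    """
    usable_wh = battery_wh * efficiency
    return usable_wh / load_w * 60  # convert hours to minutes


# Example: a 1,000 Wh battery backing a 300 W edge server
print(estimated_runtime_minutes(1000, 300))  # roughly 180 minutes
```

In practice, vendors publish runtime-versus-load tables for each UPS model, which account for battery chemistry and discharge curves; a calculation like this is only useful for narrowing the initial shortlist.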
Finally, there's staging and delivery to consider. It takes experienced IT personnel to connect all the required components and load software, to physically install the equipment in a cabinet or enclosure and to properly pack and ship it. Once on-site, you again need experienced staff to install the equipment, validate remote connectivity and possibly remediate any issues.
Successfully deploying and operating edge data centers requires addressing each of these challenges.
In the HCI world alone, you'll find plenty of solution options for the compute, storage and networking components of your edge deployment. Vendors including Cisco, Dell EMC, HPE and Nutanix have appliances that combine all three components. Software vendors including Nutanix and VMware have reference architectures that detail how their software works with a range of hardware platforms.
Deciding what's best for a set of applications requires thorough testing of the exact hardware/software combination required. This means performing proof of concept tests with realistic application workloads.
An edge data center is likely to be located in a space that was not originally intended for IT equipment, so it needs to be protected from whatever may come its way, from spilled coffee to accidental bumps and kicks. That protection comes from a proper enclosure, one suited to its environment. A micro data center (MDC) approach is likely a good option here. An MDC is an enclosure that includes the integrated power and cooling essential to proper data center operation.
MDCs come in different models to meet varying requirements. For retail or office locations, there are models that look like office furniture to blend in with the environment. Commercial and industrial locations, including restaurants and plant floors, may require models built to withstand dust, moisture and other environmental hazards. In some instances, a fully ruggedized enclosure may be necessary, such as for outdoor installations.
MDCs often have integrated physical security capabilities to keep unauthorized individuals from gaining physical access to the IT equipment and tampering with it.
An edge data center also needs to be monitored for environmental conditions such as temperature, moisture and movement. Cameras and sensors are required to monitor for these conditions and report back to a centralized location where facilities staff can take action.
Similarly, the IT hardware and software, along with UPSs, power sources and cooling, all need to be monitored for proper functionality. Centralized on-premises and cloud-based services are available that use artificial intelligence to perform predictive maintenance, alerting operators to developing issues before they cause a failure. This also prevents unnecessary scheduled dispatch of technicians, saving time and money.
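The environmental monitoring described above often boils down to comparing sensor readings against safe operating ranges and raising alerts on violations. The sketch below illustrates that pattern; the metric names and thresholds are illustrative assumptions (the temperature band loosely follows the ASHRAE-recommended 18–27°C inlet range), not the behavior of any particular monitoring product:

```python
# Hypothetical threshold-based alerting sketch. Real monitoring platforms add
# polling, trend analysis, and notification delivery on top of checks like this.

SAFE_RANGES = {
    "temperature_c": (18.0, 27.0),  # assumed band, per ASHRAE recommendations
    "humidity_pct": (20.0, 80.0),   # illustrative relative-humidity band
}

def check_readings(readings: dict) -> list:
    """Return alert strings for any readings outside their safe range."""
    alerts = []
    for metric, value in readings.items():
        low, high = SAFE_RANGES.get(metric, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alerts.append(f"{metric}={value} outside safe range [{low}, {high}]")
    return alerts


# A reading of 35°C would trigger a temperature alert; 50% humidity would not.
print(check_readings({"temperature_c": 35.0, "humidity_pct": 50.0}))
```

A centralized service would collect readings like these from many sites, correlate them over time, and dispatch technicians only when a genuine issue is developing.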
For most organizations, leveraging services provided by third parties is going to be crucial to deploying and operating edge data centers. That's because few organizations have the in-house staff, expertise and infrastructure required to handle all the functions, including:
- Assessing infrastructure options and conducting thorough proof-of-concept (POC) tests.
- Ordering, staging and assembling infrastructure.
- Shipping finished edge infrastructure to remote sites.
- Installing infrastructure at each site.
- Conducting assurance testing.
- Dealing with ongoing monitoring and management of edge facilities.
Some of those functions, including the POC testing, staging and assembly, require spare data center equipment, space and tooling in addition to available IT staff and expertise.
Several years ago, we recognized most customers didn't have spare data center space or IT cycles to dedicate to such product evaluation and testing. Customers were coming to us for advice on the best combinations of technology for their various applications.
In response, we built the Advanced Technology Center (ATC), which is made up of complete data centers (currently three of them with a fourth on the way) used only for testing various hardware/software combinations. In the ATC, we can determine which HCI combination is going to be the best bet for your specific application by testing them with actual production workloads. There's even a facility colocated with an Equinix site to test hybrid cloud/edge solutions.
WWT also formed partnerships with providers including APC by Schneider Electric, Cisco and Dell, which have validated various HCI combinations to work with a range of APC enclosures, including micro data center offerings.
We can also leverage our Integration Centers to perform physical and logical integration of cabinets, cables, servers and networking, all packaged up for safe shipping and deployment. Once the equipment is on-site, we'll perform the final installation and conduct final validation testing, including the validation of visibility with cloud-based remote monitoring and management.
The restaurant example from the beginning of this article isn't fictitious; it's a real-world project that WWT is actively working on.
- WWT started with assessments and workshops to help the company understand its issues and architect a solution.
- WWT then tested the proposed solution in our ATC.
- WWT is now working with the customer on the implementation, including the delivery of complete racks and lifecycle management, with no IT staff required per restaurant.
With more than 2,000 restaurants to manage, this customer is a great example of how partnering with WWT can lead to success.