by Earl Dodd

I've been around long enough to remember when high-performance computing (HPC) was something only scientists, researchers, and certain government agencies cared about. While some of us were enjoying the relative peace and stability of the late 1980s, or (gasp) were not even born yet, some of the world's most influential organizations were developing technology that has fueled decades of scientific progress. Now, as our ability to collect data grows, so does the need for computational horsepower to analyze it.

This post explores why HPC has become a best practice for digital transformation initiatives. Companies and governments need to transform, or they will be left behind in this digital world. Data is critical to any such initiative. Digital transformation requires that you find this data and put it to use. Intelligent capture, digestion, and data processing technologies built on HPC and AI systems underpin the ability to advance processes from mere digitization to real transformation. I call this "fusion computing."

An HPC + AI + Big Data architecture is the foundation of the fusion computing services framework. This framework represents a convergence of the HPC and data-driven AI communities, which increasingly run similar data- and compute-intensive workflows. Just as data security and business models like the cloud (public, private, and hybrid) have evolved, the high-performance architecture for HPC + AI + Big Data has evolved to meet the agile demands of businesses and governments.
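To make that convergence concrete, here is a minimal sketch of the kind of workflow both communities now run: a compute-intensive "simulation" stage produces data, and a data-driven AI stage then learns from it. The data, model, and scale below are purely illustrative assumptions on my part, not part of any specific fusion computing product; only NumPy is used.

```python
import numpy as np

# --- HPC-style stage: a compute-intensive "simulation" -------------------
# Illustrative only: generate synthetic sensor readings the way a physics
# simulation or instrument pipeline might, but at toy scale.
rng = np.random.default_rng(seed=42)
n_samples, n_features = 10_000, 8
X = rng.normal(size=(n_samples, n_features))
true_weights = rng.normal(size=n_features)
y = X @ true_weights + 0.1 * rng.normal(size=n_samples)  # noisy observations

# --- AI-style stage: learn a model from the simulated data ---------------
# A least-squares fit stands in for the training step of a larger AI workflow.
learned_weights, *_ = np.linalg.lstsq(X, y, rcond=None)

# --- Shared-data point: both stages operate on the same arrays -----------
print("max weight error:", np.max(np.abs(learned_weights - true_weights)))
```

The point is not the model, but that the simulation output and the training input are the same data, handled in one workflow rather than two silos.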

The important fusion computing takeaways are:

  1. an open high-performance architecture drives agility, faster innovation, and just-in-time procurement;
  2. data is the new currency and can be shared more easily with mainstream workflows;
  3. the framework provides a fully orchestrated, extensible, and traceable experience; and
  4. the HPC cloud enables an "as a service" economy for business and government (see the sketch after this list).
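As one illustration of the "as a service" and traceability points, the sketch below submits a batch job to a Slurm-managed HPC cluster and writes a small traceability manifest alongside it. The cluster, the job script name, and the manifest format are my own assumptions for illustration; only standard `sbatch` options are used.

```python
import json
import subprocess
from datetime import datetime, timezone

# Assumption: this host can reach a Slurm-managed cluster, so `sbatch` is on PATH.
# The job script name and resource limits below are placeholders for illustration.
job_script = "fusion_pipeline.sh"      # hypothetical batch script
cmd = [
    "sbatch",
    "--job-name=fusion-demo",
    "--ntasks=4",
    "--time=00:30:00",
    "--output=fusion-demo-%j.log",
    job_script,
]

result = subprocess.run(cmd, capture_output=True, text=True, check=True)

# Record a minimal traceability manifest: what was submitted, when, and the
# scheduler's response (which includes the job ID on success).
manifest = {
    "submitted_at": datetime.now(timezone.utc).isoformat(),
    "command": cmd,
    "scheduler_response": result.stdout.strip(),
}
with open("fusion-demo-manifest.json", "w") as fh:
    json.dump(manifest, fh, indent=2)
```

Wrapping every submission in this kind of record is one simple way a framework can stay orchestrated, extensible, and traceable while consuming HPC as a service.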