
Now, more than ever, the public sector needs to be able to deploy and use data anytime, anywhere, whether on premises, in a private cloud or in the public cloud. A data-centered approach to digital modernization is needed to provide flexibility for changing work environments while managing costs and keeping data available, secure and accessible.

During our Public Sector Tech Talk E02: Make a New World Happen With Data episode, Jim Cosby, Deputy Chief Technology Officer of Federal Civilian Accounts and Public Sector Partners at NetApp, explained that government agencies face multiple challenges in both data management and cloud migration. Two of those challenges are understanding what data already exists within an agency and keeping up with the ever-changing demands of IT modernization.

"Agencies have been trying to modernize technology that has been used for 20 to 30 years or more," said Cosby. "Governments have made a lot of improvements, upgrades and refreshes in technology, but there's still more to be done."

Cosby also pointed to other challenges, including cyber protection against constantly evolving threats and the adoption of artificial intelligence (AI). Agencies must leverage the modernization of data to obtain actionable intelligence. By using AI, agencies can better understand and analyze agency data to propel government services forward, from healthcare to the military, and across other federal agencies and state and local governments. The good news is that technology can connect the dots between all the data government agencies hold and enable agencies to use that knowledge to provide better citizen services.

The journey to digital modernization

A data fabric can create more efficient access and movement of data for the public sector on its journey to digital modernization. This architecture and set of services provides consistent capabilities across multiple endpoints, from on-premises environments to multiple clouds. It can also accelerate digital transformation by delivering consistent, integrated hybrid cloud data services that provide data visibility and insights, data access and control, and data protection and security.
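
As a rough illustration of that idea of consistent capabilities across endpoints, the sketch below is not NetApp's data fabric itself, just a hypothetical Python interface with made-up class names: application code reads and writes data the same way whether the backend is an on-premises volume or a stand-in for a cloud object store.

```python
from abc import ABC, abstractmethod
from pathlib import Path


class DataEndpoint(ABC):
    """Common interface exposed to applications, regardless of where data lives."""

    @abstractmethod
    def put(self, name: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, name: str) -> bytes: ...


class OnPremVolume(DataEndpoint):
    """Backs the interface with a local (on-premises) filesystem path."""

    def __init__(self, root: Path) -> None:
        self.root = root
        self.root.mkdir(parents=True, exist_ok=True)

    def put(self, name: str, data: bytes) -> None:
        (self.root / name).write_bytes(data)

    def get(self, name: str) -> bytes:
        return (self.root / name).read_bytes()


class CloudObjectStore(DataEndpoint):
    """Stand-in for a cloud object store; a real backend would call the provider's SDK."""

    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, name: str, data: bytes) -> None:
        self._objects[name] = data

    def get(self, name: str) -> bytes:
        return self._objects[name]


# The application code is identical no matter which endpoint holds the data.
for endpoint in (OnPremVolume(Path("agency-data")), CloudObjectStore()):
    endpoint.put("census.csv", b"id,value\n1,42\n")
    print(type(endpoint).__name__, endpoint.get("census.csv"))
```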

According to Cosby, a data fabric approach also provides cyber protection by allowing data to be encrypted at rest and in transit, regardless of where it resides: at a remote site, in a core data center, in a brick-and-mortar facility or in a hybrid location connected to the cloud.
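
As a minimal sketch of encryption at rest, the Python example below uses the third-party cryptography package to encrypt a record before it lands on disk and decrypt it on read; the file name and key handling are hypothetical, and this is an illustration of the concept rather than how any particular data fabric implements it.

```python
from cryptography.fernet import Fernet

# Encryption at rest: data is encrypted before it is written to any site
# (remote office, core data center or cloud volume) and decrypted on read.
key = Fernet.generate_key()          # in practice, held in a key management service
fernet = Fernet(key)

record = b"citizen-services export"
ciphertext = fernet.encrypt(record)  # what actually lands on disk

with open("export.enc", "wb") as f:
    f.write(ciphertext)

with open("export.enc", "rb") as f:
    restored = fernet.decrypt(f.read())

assert restored == record

# Encryption in transit is handled separately, typically by wrapping replication
# or client traffic in TLS rather than by application code like this.
```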

Data, data, everywhere

Governments need to protect their data and provide visibility, control, scalability and resilience while enabling access for the right people at the right time. For agencies and localities trying to manage and control data across multiple locations and environments, that data can be extremely challenging to organize, manage and secure. Cosby stated that it is important to implement these recommendations:

  1. Inventory Data – Data gravity is a common challenge that government agencies face in both data management and cloud migration efforts. Before data can be used, agencies must learn more about their existing data, including what kind of data exists, where it's stored, where it can reside for best access and which applications work best with it. At the same time, agencies must look at the applications that touch and access the data. Is it file-based? Does it involve AI, high-performance computing or big data? Government agencies need to pull intelligence out of their data to help improve operations and citizen services.
  2. Assess Data – After learning what an agency has and where it needs to be, it is important to place the right data in the right location to get the best performance and security and to make sure applications run when they need to. Understanding what an agency has, along with the needs of its users and applications, helps develop classes or tiers that put data where it runs most efficiently while staying secure (a minimal sketch of this kind of inventory-and-tiering pass follows this list).
  3. Secure Data – Agencies must continually add layers of security to mitigate the risks posed by cyberattacks like the SolarWinds hack or ransomware like WannaCry and DarkSide, which was responsible for the Colonial Pipeline compromise. Are you encrypting data at the edge, in the core, in the cloud or in all places? Do you encrypt it when you move it around? This step is important to mitigating vulnerabilities.
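
As a hypothetical sketch of the inventory and assessment steps above, the Python example below walks a directory tree, records what exists and how large it is, and suggests a storage tier based on how recently each file was accessed. The tier names and thresholds are made up for illustration, not drawn from any agency policy.

```python
import os
import time
from pathlib import Path

# Hypothetical thresholds: how recently a file was accessed decides its tier.
TIERS = [(30, "hot"), (180, "warm"), (float("inf"), "cold")]  # days -> tier


def classify(days_since_access: float) -> str:
    """Map a file's age since last access onto a suggested storage tier."""
    for limit, tier in TIERS:
        if days_since_access <= limit:
            return tier
    return "cold"


def inventory(root: Path) -> list[dict]:
    """Walk a directory tree and record what exists, how big it is, and a suggested tier."""
    now = time.time()
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = Path(dirpath) / name
            stat = path.stat()
            age_days = (now - stat.st_atime) / 86400
            records.append({
                "path": str(path),
                "type": path.suffix or "unknown",
                "size_bytes": stat.st_size,
                "suggested_tier": classify(age_days),
            })
    return records


if __name__ == "__main__":
    for item in inventory(Path(".")):
        print(item)
```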

Building a data fabric

Federal agencies and state and local governments need a data service that offers flexibility and storage efficiency. That service must allow the public sector to provide access to data at any location, for any application and any user, in a robust and flexible manner. Agencies also need to process data faster while realizing cost savings, and they need storage efficiency that includes deduplication, compression and compaction of data.
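
To make deduplication and compression concrete, here is a toy Python sketch, not how any particular storage system implements these features: it splits data into fixed-size blocks, keeps one compressed copy of each unique block and rebuilds the original from the recorded block layout.

```python
import hashlib
import zlib

BLOCK_SIZE = 4096  # hypothetical fixed block size


def store_efficiently(data: bytes) -> tuple[dict[str, bytes], list[str]]:
    """Split data into blocks, keeping only one compressed copy of each unique block."""
    unique_blocks: dict[str, bytes] = {}
    layout: list[str] = []  # the block hashes needed to rebuild the original, in order
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in unique_blocks:                   # deduplication
            unique_blocks[digest] = zlib.compress(block)  # compression
        layout.append(digest)
    return unique_blocks, layout


def rebuild(unique_blocks: dict[str, bytes], layout: list[str]) -> bytes:
    """Reassemble the original data from the stored blocks and layout."""
    return b"".join(zlib.decompress(unique_blocks[h]) for h in layout)


# Highly repetitive data shrinks dramatically once duplicate blocks are removed.
original = b"A" * BLOCK_SIZE * 100 + b"B" * BLOCK_SIZE * 100
blocks, layout = store_efficiently(original)
stored = sum(len(b) for b in blocks.values())
assert rebuild(blocks, layout) == original
print(f"logical size: {len(original)} bytes, stored size: {stored} bytes")
```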

Government data and applications need to be in the right place at the right time with the right characteristics and capabilities to achieve new insights and accelerate innovation. 

To listen to the full episode, tune in to Public Sector Tech Talk E02: Make a New World Happen With Data, hosted by WWT and NetApp.
