Energy Infrastructure Company Gains Data Visibility Through File Analysis
WWT assesses a customer's data usage patterns and file attributes to help them optimize storage utilization, realize cost savings and develop processes to audit consumption.
An owner of one of the largest energy networks in North America was managing a large file repository that had accumulated two decades' worth of data on their network-attached storage (NAS) platform.
Why did they have so much legacy data?
Every time the organization needed more storage capacity, or whenever existing hardware reached end of life, the company's IT team would purchase additional storage and migrate all the data, regardless of its current relevance or use.
With goals of reducing storage capacity and costs and developing processes to audit consumption, the energy infrastructure company knew they needed to clean up their data and identify active and dormant users and applications before any migration could occur. However, they lacked the capability to capture the information needed to determine which files should remain on primary storage, which could safely be downgraded to lower storage tiers, and which should be considered for deletion.
To perform the data file audit needed to meet these goals, the company partnered with WWT to discover, classify and analyze data usage patterns and file attributes, and to provide recommendations based on the findings.
To gain a better understanding of the customer's existing operational practices and storage services, we started with a discovery process designed to shape short- and long-term strategies based on the customer's current state assessment and future state vision. As part of this process, we interviewed the storage team to identify challenges and opportunities, compile a list of associated services or applications to focus the assessment on, and determine the scope of directory structures to be evaluated.
We then used the Datavoss Rapid File Analysis (RFA) engine, which ingested the production data and distilled it into meaningful metrics to identify the top use cases across five selected mount points on the production arrays. The collected data was profiled by age, size and owner, allowing us to identify the top 20 active users/applications and the top 20 users/applications that had not been active in the past five years.
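This kind of profiling by age, size and owner relies on standard filesystem metadata. The sketch below is a hypothetical illustration of the idea using Python's `os.stat`, not the Datavoss RFA engine itself; the function name, the 20-owner cutoff and the five-year dormancy threshold mirror the figures described above but are otherwise assumptions.

```python
import os
import time
from collections import defaultdict

def profile_files(root, dormant_years=5, top_n=20):
    """Walk `root` and aggregate file metadata per owner (uid):
    total bytes, file count, and most recent access time.
    Returns (top owners by footprint, owners dormant past the cutoff)."""
    cutoff = time.time() - dormant_years * 365 * 24 * 3600
    totals = defaultdict(lambda: {"bytes": 0, "files": 0, "last_access": 0.0})
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files that vanish or deny access mid-scan
            rec = totals[st.st_uid]
            rec["bytes"] += st.st_size
            rec["files"] += 1
            rec["last_access"] = max(rec["last_access"], st.st_atime)
    # Rank owners by storage footprint; flag owners with no recent access.
    ranked = sorted(totals.items(), key=lambda kv: kv[1]["bytes"], reverse=True)
    dormant = [uid for uid, rec in ranked if rec["last_access"] < cutoff]
    return ranked[:top_n], dormant
```

In practice a NAS-scale scan would read metadata from array APIs or snapshots rather than walking a live mount, and access times may be unreliable if `noatime` is set; this sketch only conveys the shape of the analysis.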
We shared insights from the assessment with the client in a matter of weeks, rather than the months the analysis would have taken had the client attempted it themselves. The insights were also presented at a level of granularity the customer could not have achieved without the Datavoss RFA engine, giving them better visibility into the structure of their file environment.
The findings from the assessment will allow the client to define retention and tiering policies that optimize the use of storage resources and increase efficiency. By better defining and implementing these policies, the client will not only reduce their file footprint but also reduce their exposure to potential cyber-attacks.
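A retention and tiering policy of the kind described above typically maps a file's last-access age to a target tier. The function below is a minimal sketch under assumed thresholds (90 days for primary, two years for archive); the actual policy the client adopts would be driven by their own assessment data.

```python
from datetime import datetime, timedelta

def assign_tier(last_access, now=None, hot_days=90, warm_days=730):
    """Map a file's last-access timestamp to a storage tier.
    Thresholds here are illustrative, not the client's actual policy."""
    now = now or datetime.now()
    age = now - last_access
    if age <= timedelta(days=hot_days):
        return "primary"          # actively used: stays on primary storage
    if age <= timedelta(days=warm_days):
        return "archive"          # candidate for a lower-cost tier
    return "review-for-deletion"  # dormant: flag for retention review
```

Applying such a rule across the profiled mount points is what turns the assessment's metrics into the footprint reductions described here.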
In addition to the data analysis, we provided pricing evaluations for keeping the client's archive data on their current on-premises platform versus moving it to less costly on-premises tiers or the public cloud. Implementing these tiering recommendations would further reduce usage of their primary storage and deliver cost savings.
Once the client completes the initial cleanup of the production data, they can replicate this process across the rest of their environment and other types of storage.
Looking to gain better visibility of your existing environment?