Reduce Your IT Portfolio Footprint While Driving Toward a Cost-Effective Organization
One of the largest cost drivers for organizations is internal IT portfolio spend, which can develop inefficiencies over time without consistent scrutiny and proper governance. Once these inefficiencies take root, they can be incredibly difficult to resolve at scale, particularly while also transforming culture and decision-making to prevent their recurrence. WWT has driven application rationalization initiatives for our customers by following an industry best-practice approach to aggregating specific sets of critical IT data.
This approach supports the generation of insights needed to identify opportunities for both rationalization and consolidation, as well as the planning and execution of those validated opportunities. These projects consist of establishing the base data model (the “Data Cube”), a Total Cost of Ownership (TCO) model, and the associated process and governance to continuously identify rationalization opportunities across all business and technical capability areas. This process lays the foundation for becoming a more cost-effective organization, as understanding the costs associated with applications and infrastructure can inform many IT-related decisions.
The transparency created through these initiatives can also start you down the path of a charge-back model between your business and IT, where individual cost centers are charged for IT services based on usage and activity. This type of initiative cannot happen in a vacuum: it is imperative to collaborate with enterprise data owners to identify, quantify and address the key data gaps that stand in the way of data-driven rationalization.
Application rationalization process
WWT generally focuses on integrating existing application, infrastructure and financial data to build the baseline Data Cube. This Data Cube is then leveraged to develop a TCO model aligned with the customer’s environment, so that application rationalization opportunities can be continually quantified as they are identified. Additionally, short-term value can be obtained from other (non-rationalization) data-cleanup opportunities the Data Cube uncovers. Because the initiative spans the organization, establishing this structured rationalization process, along with a defined governance framework that provides oversight and alignment with short- and long-term organizational strategies, is critical to its success.
Foundational data development
Foundational data development consists of four crucial steps that establish the baseline data used throughout the application rationalization process: data source identification, data aggregation, data gap analysis and Data Cube development. During this stage, governance processes should begin to be put in place to aid immediate data quality remediation efforts. Key artifacts of this stage are the Data Cube and the TCO model, both of which are highly dependent on data accuracy and freshness.
Data source identification
The initial step in developing the data foundation for application rationalization is to identify key cross-domain sources of data from around the organization related to applications, infrastructure and their associated financials. Using a standard, best-practice data dictionary, data sources are identified to capture all fields listed in the dictionary. Following this identification, access is requested to all data sources, and a timeline is built to estimate the length and effort required for data collection and aggregation.
At this point, any missing data fields may need to be sourced through application questionnaires or via application dependency mapping to help identify and validate application-to-application, application-to-infrastructure and infrastructure-to-infrastructure data. Throughout this process, governance teams maintain documentation on identified data sources.
Data aggregation
Once all identified data have been located and access has been granted, data sources are stitched together to provide a holistic view of the application environment. Data cleanup efforts include an initial purge of unneeded or irrelevant data (e.g., confirmed decommissioned servers and applications) and normalization of data where necessary. Once initial cleanup is complete, each dimension of the Data Cube is populated, with cross-dimensional facts used to link disparate data sources (e.g., app ID, server ID, business unit). Documentation is created for the final data sources leveraged during extraction, as well as for any mappings or fuzzy matching required across data sources.
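The stitching step above can be sketched as a simple key-based join. All field names, record values and the join logic here are hypothetical, illustrating the idea of linking application, infrastructure and financial records on shared identifiers while purging confirmed-decommissioned assets:

```python
# Illustrative sketch (hypothetical field names): stitching application,
# infrastructure and financial records into one Data Cube view via shared keys.

apps = [
    {"app_id": "A1", "name": "CRM", "business_unit": "Sales"},
    {"app_id": "A2", "name": "Payroll", "business_unit": "HR"},
]
servers = [
    {"server_id": "S1", "app_id": "A1", "status": "active"},
    {"server_id": "S2", "app_id": "A1", "status": "decommissioned"},
    {"server_id": "S3", "app_id": "A2", "status": "active"},
]
costs = [
    {"server_id": "S1", "annual_cost": 12000},
    {"server_id": "S3", "annual_cost": 8000},
]

def build_cube(apps, servers, costs):
    """Join the three sources on app_id/server_id, dropping
    confirmed-decommissioned servers during initial cleanup."""
    cost_by_server = {c["server_id"]: c["annual_cost"] for c in costs}
    cube = []
    for app in apps:
        live = [s for s in servers
                if s["app_id"] == app["app_id"] and s["status"] == "active"]
        cube.append({
            **app,
            "server_count": len(live),
            "infra_cost": sum(cost_by_server.get(s["server_id"], 0)
                              for s in live),
        })
    return cube

cube = build_cube(apps, servers, costs)
```

In practice this joining is typically done in a data platform or ETL tool rather than hand-rolled code, but the principle is the same: cross-dimensional keys are what make the Cube queryable.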
Data gap analysis
Once data aggregation is complete, the team assesses what gaps exist against the full set of data that typically informs a complete application rationalization exercise and identifies the assumptions needed to fill those gaps. Data quality issues in existing data are also identified and documented, and plans are developed for short- and long-term remediation. Where possible, small gaps are remediated directly, assumptions are developed for larger gaps, and interviews with application owners are conducted to clean up data issues. To track data quality issues and remediation efforts, governance teams typically kickstart parallel data-cleanup efforts and develop a reporting mechanism for data freshness, completeness and accuracy.
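A completeness report of the kind described above can be sketched as follows; the field names, records and the 90 percent threshold are illustrative assumptions, not a prescribed standard:

```python
# Hypothetical sketch: a completeness report over Data Cube records,
# flagging fields whose fill rate falls below a threshold for remediation.

def completeness_report(records, required_fields, threshold=0.9):
    """Return the fill rate per field and the fields needing remediation."""
    total = len(records)
    rates = {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / total
        for f in required_fields
    }
    gaps = [f for f, rate in rates.items() if rate < threshold]
    return rates, gaps

records = [
    {"app_id": "A1", "owner": "Jane", "annual_cost": 12000},
    {"app_id": "A2", "owner": None, "annual_cost": 8000},
    {"app_id": "A3", "owner": "Raj", "annual_cost": None},
]
rates, gaps = completeness_report(records, ["app_id", "owner", "annual_cost"])
```

Similar metrics can be computed for freshness (age of the last refresh) and accuracy (agreement with a trusted source), giving governance teams a simple dashboard for remediation progress.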
Data Cube development
The Data Cube is a platform composed of aggregated cross-dimensional data sources spanning infrastructure, application and financial data:
- Infrastructure data: aids in understanding information around hosts, VMs, storage and associated costs.
- Application data: aids in identifying all apps in an environment and details around purpose, usage, costs, etc.
- Financial data: aids in quantifying an application’s total cost of ownership across app, infrastructure and labor costs.
The Data Cube is the primary tool used to identify rationalization opportunities and aid the decision-making process for application/infrastructure simplification initiatives. The insights generated from the Cube enable the organization to focus efforts on the areas of highest opportunity. As such, application owners should prioritize ensuring that high-quality data is present and as up to date as possible within source systems.
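As a minimal illustration of the TCO roll-up across the three cost dimensions named above, consider the sketch below; the cost categories and figures are hypothetical:

```python
# Illustrative only: an application's TCO as the sum of the three cost
# dimensions carried in the Data Cube (application, infrastructure, labor).

def total_cost_of_ownership(app_costs, infra_costs, labor_costs):
    """TCO = application costs + infrastructure costs + labor costs."""
    return sum(app_costs) + sum(infra_costs) + sum(labor_costs)

tco = total_cost_of_ownership(
    app_costs=[30000],          # e.g., licensing and support (assumed figure)
    infra_costs=[12000, 8000],  # e.g., hosting cost per server (assumed)
    labor_costs=[25000],        # e.g., fractional FTE for maintenance (assumed)
)
```

The real model is built in alignment with the customer’s environment and chart of accounts; the point is simply that every application’s cost becomes a computable function of Cube data.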
Benefits of the Data Cube
- Centralized view of application components, usage and financials.
- Source of truth for application management.
- Enables cost reduction by identifying application, technology and infrastructure consolidation opportunities.
- Improved business process support through identification of capability optimization opportunities.
- Basis for TCO model.
Key considerations of Data Cube development
- Discovering and documenting links between data cube dimensions enables rationalization analytics.
- Documentation of the data sources used for specific fields is essential for developing a repeatable process, as well as for identifying gaps that must be closed to enable future insights.
- Some fields may require assumption development; assumptions should be documented and updated as rationalization activities occur.
- Data in the Data Cube should be continuously refreshed and validated against source data for integrity.
Rationalization opportunity identification
The identification of rationalization opportunities is performed through analysis of the newly developed Data Cube, based on IT portfolio data. The two primary activities carried out during this phase are the discovery of quick wins (e.g., consolidation of duplicative licenses) and the initial analysis of enterprise-wide opportunities. Key artifacts here are an agreed-upon Opportunity Framework, which automates the scoring of applications to determine whether they should be explored further for rationalization, and a Rationalization Dashboard, which visualizes potential opportunities and the underlying metadata that allow teams to manage rationalization activities. Governance processes help ensure the framework stays up to date and supports the organization’s vision through identification of the right rationalization opportunities.
Quick win discovery
Initial analysis of existing data allows the team to understand the landscape while also identifying quick wins that demonstrate immediate value. During this type of analysis, we generally look for applications with duplicate licenses (e.g., multiple teams paying for separate Tableau contracts that could be consolidated) or situations where decommissioned applications are still generating production costs. We also use this time to identify data anomalies (e.g., erroneous purchase order data for a technology that may have been keyed in USD vs. EUR or has too many zeroes). This work can be done by a centralized team without additional enterprise support, though application owners may be involved as needed.
While performing this analysis, it is crucial to document the logic used to identify these wins, so that the perceived cost savings can be validated and realized.
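The two quick-win patterns described above can be expressed as simple queries against the aggregated data. The contract and application records below are hypothetical, as is the logic's exact shape; the intent is to show that quick-win discovery is mechanical once the data is stitched:

```python
# Hypothetical sketch of quick-win discovery: flag products licensed by more
# than one cost center (consolidation candidates) and decommissioned
# applications that still carry production costs.
from collections import defaultdict

contracts = [
    {"product": "Tableau", "cost_center": "Marketing", "annual_cost": 40000},
    {"product": "Tableau", "cost_center": "Finance", "annual_cost": 35000},
    {"product": "Jira", "cost_center": "Engineering", "annual_cost": 20000},
]
applications = [
    {"app_id": "A9", "status": "decommissioned", "annual_cost": 5000},
    {"app_id": "A10", "status": "active", "annual_cost": 15000},
]

def find_quick_wins(contracts, applications):
    """Return duplicate-license candidates and decommissioned apps
    that still generate costs."""
    by_product = defaultdict(list)
    for c in contracts:
        by_product[c["product"]].append(c)
    duplicate_licenses = {p: cs for p, cs in by_product.items() if len(cs) > 1}
    zombie_costs = [a for a in applications
                    if a["status"] == "decommissioned" and a["annual_cost"] > 0]
    return duplicate_licenses, zombie_costs

duplicate_licenses, zombie_costs = find_quick_wins(contracts, applications)
```

Keeping this logic versioned and documented is exactly what makes the perceived savings defensible when finance teams audit them later.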
Holistic opportunity analysis
Once the team is familiar with the landscape of the customer’s IT portfolio data, deeper dives into key focus areas can begin. Our recommendation is to separate deep dives by at least a level 3 capability; this generally provides enough granularity to identify consolidation opportunities without going so deep that analysis paralysis sets in. As mentioned earlier, WWT also spends time upfront developing a rationalization dashboard, based on the aforementioned Data Cube, to support this analysis with a variety of valuable visualizations, including our in-house rationalization scoring rubric.
At this point, collaboration with application owners becomes increasingly important, as they provide the background and context needed to understand the true requirements a specific technology or business solution serves. Just because something looks like an opportunity does not necessarily mean it can be rationalized; there are impacts to end users that must be taken into account (e.g., if the CEO and COO are the only two users of a Power BI dashboard, is there an alternative solution available for them?).
Rationalization validation, planning and execution
The final stage is driven by two critical steps: an opportunity portfolio deep dive and the execution of the validated opportunity sets. During this stage of the process, application owners aid in validating opportunities for rationalization, while internal teams develop roadmaps and sets of rationalization opportunities for eventual execution. Governance processes aid in opportunity prioritization based on key criteria to be met after rationalization occurs (e.g., cost savings, cost avoidance). Key artifacts include an iterative execution roadmap and an agreed-upon prioritization framework.
Opportunity portfolio deep dive
Once a holistic list of application rationalization opportunities has been identified, opportunities are prioritized based on organization-specific execution drivers. Typically, prioritization criteria revolve around three key areas:
- Time to Achieve: An opportunity complexity scoring matrix is defined to understand the level of effort and cost associated with executing a rationalization opportunity. Each opportunity is bucketed into one of three categories of complexity (high, medium or low), which correlate to a specified time to achieve (i.e., time to execute an opportunity).
- TCO Savings: Leveraging the financial data within the Data Cube and financial assumptions built in collaboration with the organization, current and future TCO is calculated for each opportunity across application, infrastructure and labor costs. The difference between current and future TCO establishes the cost savings (or cost avoidance, if the cost has not yet been budgeted) of each opportunity.
- Cost to Achieve: Cost to achieve is the summation of labor costs against the time to execute each opportunity. Once cost savings and cost to achieve are calculated, the total financial impact of the opportunity can be assessed.
Leveraging Time to Achieve, TCO Savings and Cost to Achieve together, the net present value (NPV) of each opportunity can be calculated and used to rank the priority of each opportunity. Based on NPV ranking, resource availability and the organization’s near- and long-term goals, a detailed rationalization opportunity roadmap is developed.
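Combining the three criteria into an NPV ranking can be sketched as below. The discount rate, planning horizon, complexity-to-months mapping and example opportunities are all assumptions for illustration; an organization would substitute its own financial parameters:

```python
# Illustrative sketch: ranking opportunities by NPV, computed from annual
# TCO savings, cost to achieve, and time to achieve. The discount rate and
# the complexity-to-months mapping are assumed values, not prescribed ones.

TIME_TO_ACHIEVE_MONTHS = {"low": 3, "medium": 6, "high": 12}

def opportunity_npv(annual_savings, cost_to_achieve, complexity,
                    horizon_years=3, discount_rate=0.08):
    """NPV over a fixed horizon: the cost to achieve is paid now;
    savings begin after the time-to-achieve window."""
    delay_years = TIME_TO_ACHIEVE_MONTHS[complexity] / 12
    npv = -cost_to_achieve
    for year in range(1, horizon_years + 1):
        npv += annual_savings / (1 + discount_rate) ** (year + delay_years)
    return npv

opportunities = [  # hypothetical opportunity set
    {"name": "Consolidate BI tools", "savings": 60000,
     "cost": 50000, "complexity": "medium"},
    {"name": "Retire legacy CRM", "savings": 20000,
     "cost": 80000, "complexity": "high"},
]
ranked = sorted(
    opportunities,
    key=lambda o: opportunity_npv(o["savings"], o["cost"], o["complexity"]),
    reverse=True,
)
```

A negative NPV, as in the second example here, is the kind of signal a governance board would use to make a no-go decision, per the responsibilities listed below.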
To successfully execute the rationalization opportunity roadmap, an organization-specific execution governance structure is established. The governance structure must include cross-domain stakeholders (e.g., application owners, technology experts, demand management) to ensure proper alignment and expertise across the variety of resources needed to successfully rationalize applications. Responsibilities can include:
- Making go or no-go decisions on whether an opportunity becomes a project (e.g., potential no-go decision if the opportunity has a negative NPV).
- Developing opportunities into formal projects.
- Balancing resource constraints based on other organizational priorities and needs.
- Tracking execution success (e.g., actual financial impact versus expected financial impact).
By performing consistent application rationalization activities, you can ensure your IT environment leverages consistent tooling, create complete transparency into IT utilization and keep costs to a minimum. While IT portfolio cost efficiency begins with application rationalization, it does not need to stop there.
A similar approach to application rationalization can be taken for decommissioning and consolidating underlying infrastructure that may support rationalized technologies or applications, may be nearing or past its end of life, or is simply being underutilized. Additionally, incorporating an analysis of technical debt being generated by applications and technologies will provide a different view of potential candidates for consolidation or decommissioning, resulting in another view of your IT landscape.
Don’t let analysis paralysis prevent you from simplifying your IT portfolio — inefficiencies will only compound over time.