Growth in data, estimated by many analysts at between 30 and 70 percent yearly, has put the capability and maturity of enterprise storage management on a high-visibility, rapid-growth path. We see this in the increasing awareness of the needs and benefits that come from the internal service provider model and the disciplines of cross-functional policies and workflow standard operating procedures.
This process inevitably raises awareness of metrics as key change agents, and many organizations now treat completion, compliance and quality metrics as an essential component of any workflow and as primary tools to track trends and effect improvements. These worthy efforts are very much operationally and tactically focused. In an attempt to inform senior management of status and progress, the metrics are often presented in detail in a multipage PowerPoint deck. What seems like a necessary minimum of metrics can look like endless complexity and confusion to the busy senior manager, who is trying to influence trends and directions rather than focus on specific operational activity.
As a result, the desired accolades often fail to flow from such a presentation, primarily because management inevitably sees things from a different perspective than the front-line troops engaged in day-to-day storage management. The pressures from their own managers have a different focus and a different flavor. What is it that the prudent CIO or storage director/vice president needs?
In the more sophisticated organization, there are expectations that the senior executive in charge of storage will be addressing seven key performance areas: alignment, complexity, efficiency, service, protection, costs and people.
Developing a governance metrics infrastructure to support this focus can significantly enhance the visible achievements of the organization as well as provide a strong foundation for continuous improvement in unit total cost of ownership (TCO) of storage and business unit satisfaction. This article will review each of these areas and baseline the supporting metrics that need to roll up to provide this level of understanding. I will discuss how the metrics that make up the seven governance performance factors can be normalized and represented in a consistent and correlated format.
Let's look at what we need to begin. Getting organized for entry-level maturity and capability involves understanding which factors are key to the governance of enterprise storage and what metrics are needed to construct each of the seven basic performance factors.
Alignment

Industry magazines, articles, pundits and analysts are highlighting the critical need for IT alignment with the business units. Organizations have IT relationships with business units that can range from irrelevant to impediment to enabler. How can you measure what appears to be a somewhat subjective factor? How do you demonstrate that there is empirical evidence of your best efforts at aligning IT objectives with business unit objectives? I do not suggest that the following is an all-inclusive or definitive set of metrics, but it does serve as a baseline for developing what must be a situationally influenced basis for demonstrating progress in alignment within your organization.
One of the key components of this performance factor is the satisfaction survey. As a tool, it tests business unit perceptions and translates subjective views into an empirical measure that can be used to track progress. Another approach is to separate the build budget from the run budget, then show the percentage and value of projects directly related to business initiatives and the size of the build budget relative to the run budget. The ratio of business analysts to business units might also be tracked. It is not the absolute value of these metrics that matters, but the direction in which they are moving.
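The build/run split described above is simple arithmetic. A minimal sketch follows; the project records, cost figures and field names are invented for illustration and would come from your own portfolio data.

```python
# Hypothetical project records: each build project carries a cost and a
# flag indicating whether it is tied directly to a business initiative.

def budget_alignment(build_projects, run_budget):
    """Report the build budget as a share of total spend, and the share
    of the build budget that is directly business-driven."""
    build_budget = sum(p["cost"] for p in build_projects)
    business_driven = sum(p["cost"] for p in build_projects
                          if p["business_initiative"])
    return {
        "build_vs_run_pct": 100.0 * build_budget / (build_budget + run_budget),
        "business_driven_pct": 100.0 * business_driven / build_budget,
    }

projects = [
    {"cost": 400_000, "business_initiative": True},   # e.g., CRM storage tier
    {"cost": 100_000, "business_initiative": False},  # internal refresh
]
print(budget_alignment(projects, run_budget=1_500_000))
```

Tracked period over period, it is the trend in these two percentages, not their absolute values, that signals improving alignment.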
Complexity

Gartner and Forrester suggest that the major component of storage TCO is the cost associated with administration, in the range of 60 to 70 percent of the total. Complexity drives administrative overhead: the people and skills needed to manage it, and its outcomes, which surface as alerts and escalations. Can simplicity, or the trend toward simplicity, be measured in empirical terms? Some of the following metrics can help develop an understanding, and even a calculation, of a simplicity factor. Multiple technologies create complexity, so understanding and tracking how many storage technologies, tape technologies and storage resource management (SRM) tools are in use can be one key component. The complexity of the environment itself is another area that can be formally measured through the infrastructure, including the average or mean number of ports per fabric.
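One possible calculation of a simplicity factor along these lines is sketched below. The inventory counts and target thresholds are assumptions for illustration; substitute the figures from your own environment.

```python
# Each complexity driver is scored against a target: 1.0 when the count
# is at or below target, falling toward 0 as the count grows beyond it.

def simplicity_score(inventory, targets):
    """Average per-driver scores into a single 0-1 simplicity factor."""
    scores = []
    for name, count in inventory.items():
        target = targets[name]
        scores.append(min(1.0, target / count) if count else 1.0)
    return sum(scores) / len(scores)

inventory = {"disk_technologies": 4, "tape_technologies": 2,
             "srm_tools": 3, "mean_ports_per_fabric": 64}
targets   = {"disk_technologies": 2, "tape_technologies": 2,
             "srm_tools": 1, "mean_ports_per_fabric": 64}
print(round(simplicity_score(inventory, targets), 3))
```

A rising score over successive periods indicates the environment is trending toward simplicity even while individual counts fluctuate.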
Efficiency

Efficiency is a measure of the organization's ability to execute tasks and respond to problems within targets. This performance factor would include mean time to provision, mean time to problem resolution and number of alerts per fabric. An interesting aspect of this factor would be the construction of a metric to indicate the degree of automation. This might be done by identifying key standard operating procedures (SOPs) and measuring what percentage of those key SOPs are executed through automation. Here is a classic example of the push/pull that occurs: adding SRM tools may increase complexity, yet actually improve efficiency.
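The degree-of-automation metric described above reduces to a simple percentage. In this sketch the SOP names and their automation status are hypothetical placeholders.

```python
# Hypothetical catalog of key SOPs mapped to whether each is executed
# through automation (True) or manually (False).

def automation_degree(sops):
    """Percentage of key SOPs executed through automation."""
    automated = sum(1 for is_automated in sops.values() if is_automated)
    return 100.0 * automated / len(sops)

key_sops = {
    "provision_lun": True,
    "zone_new_host": True,
    "expand_filesystem": False,
    "decommission_array": False,
}
print(automation_degree(key_sops))  # 50.0
```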
Service

Service is a measure of business-facing factors that demonstrate the enterprise storage organization's ability to meet business-driven needs and expectations. Consolidating several key business-facing factors into a service metric provides a way to demonstrate and track the ability to service the business, perhaps even enable it. Factors making up the business-facing service score would include availability, number of outages, time to provision and recovery success rate.
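Consolidating those factors into one score can be done as a weighted average of already-normalized inputs. The weights and period figures below are invented to show the arithmetic, not recommended values.

```python
# Each input is assumed to be pre-normalized to a 0-1 scale; weights
# express the relative business importance of each factor.

def service_score(metrics, weights):
    """Weighted average of normalized, business-facing service factors."""
    total_weight = sum(weights.values())
    return sum(metrics[k] * w for k, w in weights.items()) / total_weight

period = {
    "availability": 0.999,         # fraction of scheduled uptime delivered
    "outage_target_met": 0.75,     # share of outage targets met
    "provision_within_sla": 0.90,  # provisioning requests inside SLA
    "recovery_success": 1.0,       # restores that succeeded
}
weights = {"availability": 4, "outage_target_met": 2,
           "provision_within_sla": 2, "recovery_success": 2}
print(round(service_score(period, weights), 3))
```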
Protection

The protection factor represents how well the organization is meeting corporate data protection objectives. This key performance factor will no doubt be closely viewed by internal and external audit staff, among others. Factors composing this metric include backup completion success rate, recovery test success rate and similar data for archiving and data recovery. In addition, the number and type of security events should be tracked to provide a "threat level" factor. Time to resolve a security event might also be tracked, including virus containment, intrusion containment and denial-of-service containment.
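A minimal sketch of these protection measures follows; the job results and security events are fabricated samples, where real numbers would come from backup reports and the security incident queue.

```python
# Summarize backup completion success and tally security events by type
# to produce a simple "threat level" breakdown.

def protection_metrics(job_results, security_events):
    """Backup success percentage plus a count of events per type."""
    successes = sum(1 for status in job_results if status == "ok")
    return {
        "backup_success_pct": 100.0 * successes / len(job_results),
        "threat_level": {kind: security_events.count(kind)
                         for kind in set(security_events)},
    }

jobs = ["ok"] * 97 + ["failed"] * 3          # sample nightly backup jobs
events = ["virus", "intrusion", "virus"]      # sample security events
m = protection_metrics(jobs, events)
print(m["backup_success_pct"])  # 97.0
```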
Costs

The cost factor is probably the best understood and is often under the most scrutiny. To justify costs and gain appropriate funding, it is important to relate costs directly to business-driven needs. This means a cost model must be in place for each service in the catalog, assuming an internal service provider model; to best understand costs, such a model is almost mandatory. The catalog of services would typically include tiers of service for production storage and the services required to protect that storage, including backup, archiving, disaster recovery and security. Unit TCO of a gigabyte of storage in each tier is a key component, showing the split between hardware, software, facilities and administration-related costs. More sophisticated use of this factor can track the percentage of cost recovery through cross-charge or billing to business units. An interesting component is to track and trend the ratio between hardware, software, facilities and administration costs.
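The unit TCO calculation and cost split can be sketched as follows. The dollar figures and capacity are assumptions chosen for illustration; the administration share here deliberately echoes the 60 to 70 percent range cited earlier.

```python
# Compute the per-gigabyte TCO of a tier and the percentage split across
# the hardware/software/facilities/administration cost categories.

def unit_tco_per_gb(costs, usable_gb):
    """Unit TCO of a gigabyte in a tier, plus each category's share."""
    total = sum(costs.values())
    return {
        "tco_per_gb": total / usable_gb,
        "split_pct": {k: round(100.0 * v / total, 1)
                      for k, v in costs.items()},
    }

tier1 = {"hardware": 200_000, "software": 100_000,
         "facilities": 50_000, "administration": 650_000}
print(unit_tco_per_gb(tier1, usable_gb=100_000))
```

Trending the split percentages over time exposes whether administration is growing or shrinking relative to hardware and software spend.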
People

With administration costs consuming the lion's share of the storage budget, understanding the makeup, capabilities and motivation of the people who actually make it all happen supports this factor. Obviously headcount and salary costs are important, but so is the ability to identify key human dependencies. How many tasks have secondary and tertiary staff identified and trained? What is the training backlog? What is the correlation between skill sets and the technologies required? How many weeks of training were completed in the period against target? Most importantly, getting a handle on the subjective issues of morale and motivation can best be achieved by a climate survey. As with an external satisfaction survey, the results can provide feedback on how well the various people management targets are being met.
Maturing the Governance Metrics
Studies indicate that a graphic representation of a situation is easier to understand than the raw data behind it. An enterprise storage management dashboard should be graphic in nature and should emphasize past trends and projected direction rather than simply provide a static picture.
This means the operational reporting metrics that make up the executive dashboard must go through a normalizing process to enable their presentation in a consistent manner. It is desirable to present graphics that show, for example, that "up and to the right" is the desired direction for all performance indicators. This approach highlights any divergence from the expectation or target.
In addition, it would be helpful if the data itself could be normalized so that the scale of each metric was similar; that is, a steep curve would indicate the same level of progress for each performance indicator. To achieve this, it may be necessary to develop a simple scoring mechanism using predefined targets or thresholds, allowing each performance indicator to be normalized, and perhaps even weighted, so it can be represented on a simple scale of high, medium or low.
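The normalizing step just described can be sketched as follows. The thresholds, weights and bucket boundaries are invented examples of the predefined targets the text calls for.

```python
# Map each raw metric onto a 0-100 scale between low/high thresholds,
# then bucket the score into the high/medium/low bands used on the
# executive dashboard.

def normalize(value, low, high, higher_is_better=True):
    """Clamp a raw metric between its thresholds and scale to 0-100."""
    span = (value - low) / (high - low)
    score = max(0.0, min(1.0, span))
    if not higher_is_better:
        score = 1.0 - score
    return 100.0 * score

def bucket(score):
    """Bucket a 0-100 score; the 70/40 boundaries are illustrative."""
    return "high" if score >= 70 else "medium" if score >= 40 else "low"

# Backup success of 97 percent scored against a 90-100 percent band:
s = normalize(97.0, low=90.0, high=100.0)
print(s, bucket(s))  # 70.0 high
```

Because every indicator lands on the same 0-100 scale with "up" meaning "better," the dashboard's "up and to the right" convention holds for all of them, including inverted metrics such as mean time to provision.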
Presenting the initial dashboard at this level of sophistication may be a stretch, but in the long term this is the direction that will give management the clearest view of achievements and trends in the enterprise storage environment and of the performance indicators that drive unit TCO reduction and improved business unit satisfaction.
Finally, each of the seven performance factors will have a certain degree of interdependency and correlation. Complexity can impact efficiency; service factors can increase costs. Once the basic metrics become visible, a higher-level understanding of cause and effect emerges as the correlations and interdependencies come into view. This degree of sophistication represents the highest level of maturity and capability of a dashboard supporting enterprise storage governance.
Making a start today involves deciding which metrics will be collected and tracked to provide the initial status and trending data for each performance factor. One metric per performance factor is a reasonable starting point. Once the process starts, trends are revealed and a focus emerges that facilitates innovative team thinking about the correlations and interdependencies that can drive maturity and reduce unit TCO of storage while improving business unit satisfaction.