Findings from a survey of IT professionals conducted by ONStor, a provider of NAS solutions:

 

  • At their current data growth rate, 43 percent of respondents could stay in their current infrastructure for only six months to one year if they changed nothing.

  • Twenty-four percent reported that the cost and time of building another data center is the most serious issue driving them to reduce data infrastructure power consumption.1

It's official - green matters. According to Gartner, the number one strategic technology for 2008 is Green IT.2 Gartner designates a technology as strategic because implementing it will make a significant difference in an organization's work. Conversely, not implementing a strategic technology carries a competitive risk.

 

In any case, green computing can no longer be ignored and is already having an impact on how IT groups budget and implement.

 

Green computing is confronting many of our preconceived ideas about the physical attributes of the data center. On the one hand, we read articles about "McData centers" the size of airplane hangars; on the other, we see a struggle to run leaner, thinner and cooler data centers. The challenge is finding a way to balance the astronomical growth of data with the prospect of shrinking physical resources. The data warehouse appliance (DWA) is emerging as one means to achieve this balance.

 

When the DWA landed on the scene, it displaced many commonplace notions about servers and storage. It challenged the traditional division of roles, in which some machines were tasked with active computing and others with passive storage - "and never the twain shall meet" - so that the efficiencies of each type could be leveraged. The DWA, by its very definition, combines processing and storage in one unit to address the challenges of working with big data. Perhaps not designed to be a green machine, it was in fact engineered to use space and processing capacity extremely efficiently.

 

Almost all hardware vendors (including DWA vendors) post information about the energy usage of their products. Some, like Dell, even publish the results of lab tests (by Principled Technologies) that put their PowerEdge server through a series of computing loads so that energy-conscious buyers have a detailed picture of power consumption.

 

Some manufacturers have emphasized cooling, the adjunct to power usage - HP's Liquid Cooling technology comes to mind - but other approaches, such as fans, venting and the physical placement of components, also work to reduce thermal output or the power required to dissipate it. Much less attention is paid to physical size and its impact on a machine's greenness, but size does matter when calculating the cost of floor space and new construction.

 

It's when you look at these three aspects together - power consumption, thermal output and physical size - that you get a true sense of a machine's environmental footprint. These can be brought together to calculate a kind of computing density, if you will. Computing density is a useful tool for quantifying the environmental cost of processing requirements.

 

 

In a data warehousing or very large database environment, it’s most useful to look at computing density required per terabyte of data. The Principled Technologies test measures consumption while processing a mere 25MB of data, which doesn’t begin to cover the workload data warehousing infrastructure must bear.
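To make the per-terabyte view concrete, here is a minimal sketch in Python that normalizes power draw and floor space by usable capacity (cooling is folded in later). The function name and every figure below are illustrative assumptions for the sake of the example, not measured results from any vendor or test.

```python
# A minimal sketch of "computing density": power draw and floor space
# normalized per terabyte of warehouse data. All figures are hypothetical.

def computing_density(power_watts, floor_area_sqft, capacity_tb):
    """Return watts and square feet consumed per terabyte of capacity."""
    return {
        "watts_per_tb": power_watts / capacity_tb,
        "sqft_per_tb": floor_area_sqft / capacity_tb,
    }

# Hypothetical comparison: a DBMS host plus storage server vs. a single DWA.
server_plus_storage = computing_density(power_watts=3200,
                                        floor_area_sqft=12.0,
                                        capacity_tb=1.5)
appliance = computing_density(power_watts=1700,
                              floor_area_sqft=6.0,
                              capacity_tb=2.0)
print(server_plus_storage)
print(appliance)
```

Dividing by capacity rather than comparing raw wattage keeps the comparison fair when the two configurations hold different amounts of data.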

 

At a minimum, all DWAs, by virtue of combining server and storage, offer these energy efficiencies:

 

  • Eliminating the redundancy that results from having servers dedicated to data processing and servers dedicated to data storage. There are generally fewer CPUs in play, fewer fans, etc.

  • Reducing load on network infrastructure by processing data near the point of storage. Moving data around the network uses large amounts of power. Communications equipment has the highest rate of energy consumption per square foot, and switches and fibre channel alone can account for 14 percent of a data center's energy consumption.3 Some DWAs use fibre channel for inter-node or inter-appliance communication; however, even these are conservative consumers compared to SAN storage devices.

  • Implementing Massively Parallel Processing (MPP), which spreads work across CPUs and disks in a way that reduces redundant spin and idle cycles.

In addition to the efficiencies inherent in the appliance form factor, there are other techniques vendors can use to conserve energy further, for example, selecting high-efficiency processors. AMD's HE Opteron processors draw 55 watts versus 95 watts for the standard parts - roughly 58 percent of the power. SATA drives and flash memory, as well as more energy-efficient RAID controllers and switches, can contribute incrementally too.

 

 

Cooling: Less power usage = less thermal output = less cooling = less power usage.

Take all the efficiencies listed above and double their energy savings. The general rule of thumb: it takes as much energy to cool a device as the device itself uses. Some energy consumption specs do not account for this part of the equation. Because heat dissipation is a side effect of operating the device, the energy usage occurs on the facilities side of the house - the cost of cooling the data center generally or of providing targeted ducts or vents for machines that run warm.
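As a rough illustration of that rule of thumb (one watt of cooling for every watt the hardware draws), the short sketch below shows how a saving on the IT side approximately doubles once avoided cooling is counted. The wattage figures are made up for the example.

```python
# Illustrating the "one watt to cool one watt" rule of thumb:
# watts saved on the hardware side save roughly the same again in cooling.
# Both figures below are hypothetical.

it_load_watts = 5000       # assumed IT equipment draw before any changes
savings_watts = 800        # assumed saving from a more efficient configuration

cooling_per_it_watt = 1.0  # rule of thumb: 1 W of cooling per 1 W of IT load

total_before = it_load_watts * (1 + cooling_per_it_watt)
total_after = (it_load_watts - savings_watts) * (1 + cooling_per_it_watt)

print(f"Facility-level saving: {total_before - total_after:.0f} W "
      f"({savings_watts} W of IT savings roughly doubled by avoided cooling)")
```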

 

More and more organizations are assessing the ecological impact of their operations as a whole. To account for the data center's share of total resource usage, calculations have to include both the power used by hardware and the power used for ambient and supplemental cooling.

 

Figure 3 shows the watt hours consumed by two physical data warehouse architectures under an average processing load: a host server for the DBMS paired with a storage server of 1.5TB capacity, and a DWA with 2TB capacity.

 

 

The difference is significant and underscores the advantages that can be gained from using devices that have been engineered for specific tasks.

 

Small Footprint Conserves Space

 

Some of the same factors that contribute to a DWA's lower energy consumption also account for its efficient use of space, the third dimension of hardware's environmental impact. A single appliance, as opposed to two pieces of independent hardware, eliminates redundancy and is more compact.

 

The DWA, however, is not just a "smaller" configuration - it is uniquely engineered to provide massive capacity within that smaller footprint. All DWAs have this spatial efficiency in common, although each differs in how it is achieved. Some DWAs are a combination of third-party hardware preconfigured with specialized software. While still very energy-efficient, these DWAs use general-purpose hardware, so their physical footprints can approach that of a nonappliance server-and-storage stack.

 

To get the complete picture of a DWA’s environmental profile, its physical dimensions must be taken into account. Larger machines require more space; more space costs more money and has a larger ecological impact.

 

 

  • For 2TB of data, a DWA’s environmental footprint is approximately 40 percent that of a server+SAN architecture. That ratio holds steady even as the data warehouse expands in volume.

According to EYP Mission Critical Facilities, hardware gear - storage arrays, servers and networking equipment - accounts for 50 percent of the power requirements in the data center.4 It follows that energy-efficiency practices aimed at significantly easing the resource crunch have to center on the hardware. There is only so much energy you can conserve by retrofitting or rearranging existing hardware. To make a difference, energy efficiency and space requirements must be among the primary selection criteria for any new acquisition.

 

Basic guidelines for evaluating the energy efficiency of data warehouse infrastructure (a rough worksheet applying them is sketched after the list):

 

  1. How much power is consumed by all the equipment that makes up the infrastructure? Consider all the hardware, including additional networking hardware and network load. The logic here is that data warehouse operations place much higher demands on network switch usage in terms of length of time and number of cycles.

  2. How much additional cooling will be needed? Vendor specifications do not always include this information because it’s not an attribute of their product, strictly speaking. A standard approach is to assume that one watt of energy used requires one watt to cool.

  3. How much space will be taken up? Calculate at least the area the equipment will require. In many data centers, height matters also. Project out the area required to accommodate future data growth. How much extra space will be used up when you have to add 5TB capacity?
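As a minimal sketch of how the three guidelines might be applied together, the hypothetical worksheet below totals equipment and network power, applies the one-watt-to-cool-one-watt assumption from above, and projects floor space for planned growth. The function, its parameters and all sample figures are placeholders for numbers a buyer would take from vendor specifications and site surveys.

```python
# Hypothetical worksheet for the three guidelines: total power (including
# network gear), the cooling it implies, and floor space projected for growth.

def evaluate_config(hardware_watts, network_watts, area_sqft,
                    capacity_tb, planned_growth_tb):
    it_watts = hardware_watts + network_watts        # guideline 1: all equipment, incl. network
    cooling_watts = it_watts * 1.0                   # guideline 2: one watt of cooling per IT watt
    sqft_per_tb = area_sqft / capacity_tb
    projected_area = area_sqft + sqft_per_tb * planned_growth_tb  # guideline 3: room to grow
    return {
        "total_watts": it_watts + cooling_watts,
        "current_area_sqft": area_sqft,
        "projected_area_sqft": projected_area,
    }

# For example, what adding 5TB of capacity might mean for a 2TB, 6 sq ft configuration.
print(evaluate_config(hardware_watts=1700, network_watts=150,
                      area_sqft=6.0, capacity_tb=2.0, planned_growth_tb=5.0))
```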

The ONStor survey mentioned previously sums up the current state of green IT. Sixty-three percent of the 369 IT professionals said they had already run out of space or power in their data centers. Surprisingly, 40 percent said this had not prompted them to discuss green data center initiatives; instead, they must have treated these events as standard operational risk, or simply requested more budget for additional generators or space. Indicators suggest that power and space constraints will indeed become the standard operational mode if current practices don't change.

 

The same survey, however, found that 60 percent had adopted energy-efficient practices in response to these outages. Green computing is clearly not a futuristic movement addressing a potential need; it is practiced now, at least in a small way, by many organizations - to prevent recurring power limitations, to save on energy costs and to avoid construction.

 

Replacing two boxes with one appliance that uses far less energy than the two boxes combined makes for a convincing equation.


References:

 

  1. ONStor Inc. “ONStor Green Data Center Survey Reveals 63 Percent Of Organizations Have Run Out Of Space, Power or Cooling Capacity Without Warning.” ONStor.com, September 18, 2007.
  2. Gartner. “Top 10 Strategic Technologies.” Gartner Symposium/ITxpo, October 9, 2007.
  3. EPA. “Report to Congress on Data Center Energy Efficiency.” Report to Congress, August 2, 2007.
  4. Deni Connor. “Data Center Power Outages, Space Concerns Cited by Survey.” Network World, October 11, 2007.
