July 5, 2007 By Jessica Weidling
When it comes to energy, Texas is both green guru and oil-refining king. The Lone Star State commands a distinctly diversified energy portfolio - it has held the No. 1 spot as the nation's top wind power generator since 2006 - and is directing a massive, electricity-saving statewide data center project, even as it consumes more energy than any other state.
But with once-affordable natural gas prices ballooning and warnings of climate change growing more urgent, Texas' public and private sectors are embracing new ideas. Forward-thinking tech companies are forging government partnerships. And as new information on data center energy consumption becomes available, IT managers have the incentive and wherewithal to achieve big energy savings.
While energy savings and environmental stewardship jump-start many green initiatives, Texas has an added reason to embrace the green craze - the state's energy costs have increased considerably in the last five years.
In the past, the Texas power grid - which serves more than three-fourths of the state - let Texas retain some energy independence from federal regulators and offer natural gas at cheap rates. But market conditions since 2000 forced some power plants to close, shrinking the state's once-abundant energy supply and pushing up prices.
"Where managing energy costs was not as high on the radar screen five years ago for many of these public entities, it's a very significant piece of their operating costs now," said Dub Taylor, director of the State Energy Conservation Office (SECO). "So now they're trying to figure out, 'What can we do about this?'"
The biggest opportunity to reduce costs lies not with renegotiating power rates, he said, but with modernizing aging facilities and dampening demand.
Greening Data Centers
The total power used by servers, cooling and corresponding infrastructure represented 1.2 percent of all U.S. electricity consumption in 2005, according to a report released in February by the Lawrence Berkeley National Laboratory (LBNL). That's equal to the power used by Mississippi, according to the report's author, Jonathan Koomey, a consulting professor at Stanford University and staff scientist at LBNL.
In 2005, the U.S. spent $2.7 billion to power servers and associated infrastructure, while the entire world shelled out more than $7 billion.
The report raises awareness of the problem, Koomey said. "The whole industry needs to understand what the basic picture is, and the only way to do that is to have some sort of objective measure of the total consumption."
From 2000 to 2005, the electricity used by servers both in the United States and worldwide doubled, according to the report. This can be largely attributed to the growing number of servers and the accompanying growth in auxiliary electrical equipment.
IT managers can focus on improving IT equipment and/or infrastructure to make data centers more efficient, he said.
"Certainly with the server, you could improve the power supply; you could improve the DC [direct current]-to-DC conversion; you could improve the fans; you could improve the air flow; you could also improve the microprocessor," Koomey said, adding that tests now show server facilities powered with high-voltage DC, instead of both AC (alternating current) and DC, are more energy efficient.
Another problem is server capacity that goes largely unused. "With most business applications, the servers are running at 5 percent or 10 percent of the IT load," he said. This valuable but idle server capital is luring more organizations toward server virtualization, which basically means segmenting a server to allow multiple virtual machines, with different operating systems, to function in the same physical machine, running not just one, but several applications simultaneously.
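The savings Koomey describes can be sketched with simple arithmetic. The following back-of-the-envelope estimate is illustrative only - the server count, target utilization and per-server wattage are assumptions, not figures from the article:

```python
import math

def hosts_after_virtualization(num_servers, avg_util, target_util):
    """Physical hosts needed to carry the same aggregate load once
    workloads are packed onto hosts run at a higher utilization."""
    return math.ceil(num_servers * avg_util / target_util)

def power_savings_watts(num_servers, watts_per_server, hosts):
    """Power freed by retiring the surplus machines
    (ignores virtualization overhead for simplicity)."""
    return (num_servers - hosts) * watts_per_server

# Hypothetical example: 100 servers idling at 8% of IT load,
# consolidated onto hosts run at 60% utilization, 300 W per server.
hosts = hosts_after_virtualization(100, 0.08, 0.60)
saved = power_savings_watts(100, 300, hosts)
print(hosts, saved)  # 14 hosts, 25800 W retired
```

Even with generous overhead, the gap between 5-10 percent utilization and what a virtualized host can sustain explains why the technique is so attractive to data center managers.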
The data center issue is especially timely in Texas, where the state is consolidating its 30 state data center sites into two.
The Texas Department of Information Resources (DIR) awarded IBM the seven-year