Intel is adding new sensors to its server chips to help companies improve the efficiency of data centre cooling systems, with a view to cutting operating costs and prolonging the life of equipment, the company said.
Intel will add sensors for measuring the inlet and outlet temperatures on servers, as well as the airflow passing through systems, said Jay Vincent, a senior solution architect in Intel’s high-density cloud computing division. Its chips already include sensors that measure power consumption.
Intel will make the data available to tools that model airflow and cooling in data centres, he said, providing a more accurate way to uncover hot and cold spots and to run simulations that show where to place new IT equipment for the greatest cooling efficiency.
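Taken together, that gives each server four telemetry channels: inlet temperature, outlet temperature, airflow and power. A minimal sketch of what one sample might look like follows; the record layout and field names are illustrative assumptions, not a published Intel interface.

    from dataclasses import dataclass

    @dataclass
    class ServerTelemetry:
        """One sample of the server-level data described above.

        The field names are illustrative assumptions; they are not
        Intel's published interface.
        """
        server_id: str
        inlet_temp_c: float   # air temperature entering the chassis
        outlet_temp_c: float  # air temperature leaving the chassis
        airflow_cfm: float    # airflow through the system, cubic feet/min
        power_w: float        # power draw, already exposed by Intel chips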
At the DatacentreDynamics conference in San Francisco last week, Intel presented results from a proof of concept it conducted with Future Facilities, which develops computational fluid dynamics (CFD) software for simulating airflow.
The test showed that the on-chip sensors allow CFD tools to model more accurately how air moves through a data centre, and to predict more accurately how new IT equipment will affect airflow, said Akhil Docca, engineering and product manager at Future Facilities.
“If CFD modeling is done properly you can really get closer to the physics of the data centre, and if you can visualise your airflow, you can manage it,” he said.
“The proof-of-concept was done on two server racks zoned off from the rest of the data centre. The next step is to try it on a larger scale,” he said.
“The power used to run cooling equipment such as chillers and air handlers accounts for as much as 40% of the cost of running a data centre. The U.S. Department of Energy has estimated that half the cooling capacity in its data centres is wasted through inefficiency,” Vincent said.
Cooling systems are carefully configured to deliver just the amount of cold air a data centre needs. But when IT departments install new equipment, airflow patterns are disturbed, warm and cold air mix, and cooling capacity is wasted.
Some data centres install wireless sensors to help manage airflow. But they can be expensive to deploy and are often installed at the top of racks, on air conditioning units and elsewhere away from the servers themselves. That makes the data less accurate, according to Vincent.
“What Intel proposes is to establish the server as the source of the data,” he said. “It’s the source of demand [for cooling], so it makes sense that it should be the source of data as well.”
The CFD tool collects the temperature, airflow and power data from the servers, aggregates it to the rack level, and runs a simulation to identify places where cold air is bypassing server inlets, or where exhaust air is being recirculated.
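As a rough illustration of the aggregation step only (not Future Facilities’ actual pipeline), the sketch below rolls per-server readings up to rack level and applies two crude heuristics: rack inlets much hotter than the cooling units’ supply air suggest exhaust recirculation, while delivered airflow far above what the servers actually pull through suggests cold air is bypassing the inlets. All thresholds are invented for illustration.

    from statistics import mean

    SUPPLY_TEMP_C = 18.0     # assumed CRAC supply-air temperature
    RECIRC_MARGIN_C = 5.0    # inlet this far above supply suggests recirculation
    BYPASS_RATIO = 1.5       # supply this far above demand suggests bypass

    def rack_summary(servers, supplied_airflow_cfm):
        """Aggregate per-server telemetry to rack level and flag likely
        recirculation or bypass. `servers` is a list of dicts with the
        fields sketched earlier; all thresholds are illustrative."""
        inlet = mean(s["inlet_temp_c"] for s in servers)
        demand_cfm = sum(s["airflow_cfm"] for s in servers)
        return {
            "mean_inlet_c": inlet,
            "mean_outlet_c": mean(s["outlet_temp_c"] for s in servers),
            "airflow_demand_cfm": demand_cfm,
            "power_w": sum(s["power_w"] for s in servers),
            # Inlets much hotter than the supply air imply exhaust air
            # is being recirculated back to the front of the rack.
            "recirculation_suspected": inlet - SUPPLY_TEMP_C > RECIRC_MARGIN_C,
            # Far more cold air delivered than the servers pull through
            # implies the excess is bypassing the server inlets.
            "bypass_suspected": supplied_airflow_cfm > BYPASS_RATIO * demand_cfm,
        }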
“We use that real-time data to see what’s happening today, but also to project what might happen if you bring in 300 servers tomorrow,” Docca said.
“The data will be available to other management tools besides CFD software,” Vincent said.
Intel will enable the sensors in components it sells to “white box” vendors next year, he said. Whether the big server vendors enable the sensors in their equipment will be up to them, and it’s unclear whether all of them will. The sensor data won’t be free, Vincent said: Intel currently licenses a software tool called Node Manager to read the power data its chips provide.
Dell already puts temperature sensors in its PowerEdge servers, including sensors for inlet and outlet temperatures, said Eric Wilcox, power and cooling product manager at Dell. But if Intel builds the sensors into its chip platform, it could reduce costs for manufacturers and provide a standard for reporting sensor data.
That should make life easier for companies developing third-party management tools, and help customers who manage servers from multiple vendors.
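To see why that matters, consider the translation layer multi-vendor tools need today. The adapters below are hypothetical; the raw field names are invented for illustration and are not Dell’s or HP’s actual sensor labels. A platform-level standard would make this layer unnecessary.

    # Hypothetical adapters: each vendor exposes readings under its own
    # names today, so multi-vendor tools carry per-vendor translation.
    # The raw keys below are invented, not real vendor sensor labels.
    def normalise_vendor_a(raw: dict) -> dict:
        return {
            "inlet_temp_c": raw["SysInletTemp"],
            "outlet_temp_c": raw["SysExhaustTemp"],
        }

    def normalise_vendor_b(raw: dict) -> dict:
        return {
            "inlet_temp_c": raw["InletAmbient"],
            "outlet_temp_c": raw["ExhaustTemp"],
        }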
“We’re aware of Intel’s value proposition here and plan to pass it on to customers,” Wilcox said.
HP also puts temperature sensors in its ProLiant servers, 32 of them, according to Michael Kendall, a group manager for HP’s industry-standard servers and software. HP feeds the data into its server management tools, which can perform tasks such as varying fan speeds to match cooling demand. Through a partnership, it also makes the data available to nlyte Software, a maker of data centre infrastructure management (DCIM) software.
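HP has not published its fan-control logic; as a generic illustration of the idea, a toy proportional controller that raises fan duty as the inlet temperature climbs past a target might look like this:

    def fan_duty_percent(inlet_temp_c, target_c=25.0, min_duty=20.0, gain=8.0):
        """Toy proportional control: raise the fan duty cycle as the
        inlet temperature climbs past the target. The constants are
        invented for illustration; this is not HP's algorithm."""
        duty = min_duty + gain * max(0.0, inlet_temp_c - target_c)
        return min(100.0, duty)

With these invented constants, a 23 °C inlet idles the fans at 20% duty, while a 30 °C inlet pushes them to 60%.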
“Intel’s work might provide HP with more data it can take advantage of,” Kendall said.
“One challenge is educating customers about the technologies available to them. Most customers probably don’t know about all the instrumentation data that’s already available to them,” Wilcox said.
For the sensors to achieve their full potential, IT and facilities staff will have to work together more closely. The ultimate goal of this project, and of a similar one conducted last year by Lawrence Berkeley National Laboratory, is to link IT equipment directly to power and cooling systems, so that supply can be adjusted automatically to match demand.
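Schematically, that closed loop might look like the sketch below. Both callables are placeholders for whatever real integration the IT and facilities sides expose, and the setpoints are invented; nothing here reflects an actual Intel or LBNL design.

    import time

    def cooling_control_loop(read_rack_inlets, set_crac_supply_temp,
                             target_inlet_c=24.0, step_c=0.5,
                             min_supply_c=15.0, max_supply_c=22.0):
        """Schematic loop linking server telemetry to the cooling plant.
        Both callables stand in for real IT/facilities integration;
        all setpoints are illustrative assumptions."""
        supply = max_supply_c
        while True:
            hottest = max(read_rack_inlets())     # worst-case rack inlet temp
            if hottest > target_inlet_c:          # undersupplied: cool harder
                supply = max(min_supply_c, supply - step_c)
            elif hottest < target_inlet_c - 2.0:  # oversupplied: save energy
                supply = min(max_supply_c, supply + step_c)
            set_crac_supply_temp(supply)
            time.sleep(60)                        # re-evaluate once a minute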
“We’re going to supply the ability to be more sophisticated,” Vincent said, “but you’ve got to take advantage of it, otherwise nothing’s going to change.”