In the last decade, the number of data centres around the world has increased dramatically. It’s estimated that around 10,000 data centres are currently in operation, with almost half of these located in the United States [1]. In the last few years, there has been a strong trend towards the construction of hyperscale data centres (generally defined as having at least 900 m² of space and 5,000 or more servers), with the sector predicted to grow at a compound annual rate of 3.4% through 2026 [2].
Growth is being driven by a number of often interconnected factors: the rise of robotics, AI, digital twinning, autonomous vehicles, the Internet of Things (IoT) and streaming services, plus the impact of the Covid-19 pandemic and the move to home working.
One of the biggest challenges facing data centre operators, especially the so-called hyperscale operators such as Amazon and Microsoft, is reducing the amount of energy consumed. Although the energy efficiency of data centres has improved in recent years, with consumption in traditional centres remaining relatively static even as the number of facilities has increased, the energy consumed by the sector as a whole has risen steadily with the growth in hyperscale data centres. According to Statista, global data centre energy consumption exceeded 190 terawatt-hours in 2021 [3].
A significant proportion of this energy is used to power the data processing and storage equipment, plus the data security systems and network infrastructure. Almost all of this energy is eventually converted to heat.
Elevated temperatures can affect the performance and reliability of IT equipment. In the worst case, they will lead to failure, which can be costly. For example, Forbes estimated that the recent six-hour Facebook outage cost the company around $100 million [4] - that works out at more than $275,000 a minute!
The air-cooling systems that have traditionally been used to control data centre temperatures consume high levels of energy. As a result, many data centres now rely on passive liquid cooling, which has the advantage of being more energy efficient and able to reduce temperatures faster. Other options being employed include locating data centres in colder regions, such as Scandinavia, or, in the case of Microsoft's Project Natick [5], building a data centre on the seabed.
Regardless of the physical location of the data centre or the type of cooling system, it is critical for effective management and control that both temperature and humidity are continuously monitored.
The risks from elevated temperatures have been noted above. Humidity can also be a problem. If it is too high, moisture can form on sensitive components such as motherboards, hard drives and connection devices, potentially causing corrosion and equipment failure. Conversely, if humidity levels are too low, there is an increased likelihood that electrostatic charge will build up, with any subsequent high-voltage discharge damaging sensitive components.
Ideally, data centres should be controlled within a predetermined band of environmental conditions to ensure the performance and longevity of data systems. Typically, temperatures must be maintained between 18 and 27°C, the dew point should be within the range 5 to 15°C, and relative humidity should be no higher than 60%rh. This ensures hardware remains at a suitable temperature, with the risks of condensation and static build-up being minimised.
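As a simple illustration of this kind of envelope check, the sketch below (in Python) compares a single reading against the thresholds quoted above. It is hypothetical, not a Rotronic tool, and the field names are invented for the example:

```python
# Minimal sketch: flag readings that fall outside the recommended envelope.
# Thresholds come from the ranges quoted above; field names are illustrative.
RECOMMENDED = {
    "temperature_c": (18.0, 27.0),  # dry-bulb temperature, degC
    "dew_point_c": (5.0, 15.0),     # dew point, degC
    "humidity_rh": (0.0, 60.0),     # relative humidity, %rh (upper limit matters)
}

def check_reading(reading: dict) -> list[str]:
    """Return a list of out-of-range warnings for a single sensor reading."""
    warnings = []
    for key, (low, high) in RECOMMENDED.items():
        value = reading[key]
        if not low <= value <= high:
            warnings.append(f"{key} = {value} is outside {low}-{high}")
    return warnings

# Example reading - the values are invented for illustration.
print(check_reading({"temperature_c": 29.5, "dew_point_c": 12.0, "humidity_rh": 48.0}))
# -> ['temperature_c = 29.5 is outside 18.0-27.0']
```

In practice a check like this would sit inside the building-management or monitoring software and trigger alarms or cooling adjustments, but the underlying logic is essentially this simple comparison against the recommended band.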
Achieving the specified control range requires precision sensors and advanced control systems for relative humidity, temperature, differential pressure and low dew point. These devices are used both to manage the data centre assets, in terms of system reliability and capacity, and to provide the information on which to base strategies for reducing energy consumption and thus operating costs.
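Where a direct dew-point measurement is not available, the dew point is often estimated from the temperature and relative humidity readings. The sketch below uses the widely published Magnus approximation (constants 17.62 and 243.12°C); it is purely illustrative and is not a description of how any particular Rotronic instrument derives the value:

```python
import math

# Magnus approximation constants (commonly quoted as valid from about -45 to +60 degC).
B = 17.62
C = 243.12  # degC

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate the dew point (degC) from air temperature and relative humidity."""
    gamma = math.log(rh_percent / 100.0) + (B * temp_c) / (C + temp_c)
    return (C * gamma) / (B - gamma)

# Example: 24 degC at 50 %rh gives a dew point of roughly 13 degC,
# comfortably inside the 5-15 degC band quoted above.
print(round(dew_point_c(24.0, 50.0), 1))
```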
Sensor accuracy is a key factor: the more precise the sensing device, the tighter the control over operating conditions and, ultimately, over energy consumption. At Rotronic, we offer a range of advanced solutions for effective management and control of temperature and humidity; these are capable of delivering exceptionally high levels of accuracy, with minimal drift over time. Each instrument is designed for a long, trouble-free operating life, with extended MTBF intervals, and can be configured for fast on-site recalibration. They are generally available with short delivery times, can easily be hot-swapped to minimise downtime, and are supported by industry-leading technical support services.
With over 55 years of experience in the development of innovative precision instruments, we are the application experts in temperature and humidity measurements for all data centre applications. If you would like to discuss your requirements, then please contact our team today.
Sources:
[1] Datacenter Applications
[2] Synergy Research Group, number of data centres
[3] Research & Markets, hyperscale data centre report
[4] Forbes, downtime costs
[5] Microsoft, Project Natick