
Keeping It Cool Within the Edge Data Centre


By Jonathan Duncan, technical director, Africa at Vertiv

 

Johannesburg, South Africa – Digital transformation means organisations are becoming increasingly reliant on information technology to run almost every aspect of their business. The creation of more data – which needs to be processed and stored – brings with it a corresponding need for more compute power and more data centres. In many cases, these are not large, purpose-built data centres, but close-proximity, smaller edge facilities that share space in the same building as the rest of the business. This, in turn, can create unique challenges with regard to securing the environment and cooling the IT loads.

 

On the edge

 

Edge computing can be described as having compute and storage capacity physically close to where users generate, consume and manipulate data. The rise of edge computing (also known as decentralised IT) is driven by factors such as the ongoing growth of the Internet of Things (IoT), which generates significant amounts of data; legal requirements; the need to consolidate data; high network costs; latency issues; and network security requirements.

 

When the latency involved in sending data to a centralised or cloud data centre becomes too high and starts to inhibit operations, this drives the need for localised processing capabilities – in other words, edge data centres.

 

However, these edge data centres must often share an area that already serves another purpose, and which may not have cooling systems designed to handle IT equipment at the edge of the network.

Checking the cooling requirements

 

IT equipment can produce large amounts of heat on a continuous basis. Organisations must therefore take steps to ensure the proper cooling of that equipment in order to protect it and ensure its availability. Placing sensitive IT equipment into spaces designed originally for some other purpose can present challenges, especially with respect to cooling.

 

For example, an office building is optimised to be comfortable for its employees, while a more open space, such as a factory floor or warehouse, has its own heat and cooling requirements. In either case, the existing cooling system may not meet the stringent requirements for the proper functioning of a data centre. This is especially true for high-density IT equipment, including hyperconverged infrastructure, which can generate large amounts of heat from a relatively small space.

 

Controlled versus uncontrolled environments: what is the difference?

 

Companies are now routinely installing edge data centres in two general categories of spaces, as follows:

  • Controlled office environments, which are geared towards standard comfort cooling for humans; and
  • Uncontrolled environments such as manufacturing spaces, which may or may not have ambient cooling and humidity control in place.

 

The typical office environment uses room-based cooling provided by the building's heating, ventilation and air conditioning (HVAC) system, or by decentralised mini-split cooling systems. Cooling capacity is sized to the heat load the comfort cooling system needs to handle, typically measured in watts (W) or kilowatts (kW).

 

A typical office HVAC system might be sized for a heat load in the range of 50 to 100W per square metre, or perhaps 1 to 2kW for an entire room. But a single rack of IT equipment may produce a heat load of 3 to 4kW or more. As a result, a cooling system designed for a 1kW load could be asked to handle as much as four times that capacity.
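
To make the mismatch concrete, here is a minimal back-of-the-envelope sketch in Python (the room size and rack count are hypothetical; the per-square-metre and per-rack figures are the illustrative ranges quoted above):

    # Back-of-the-envelope check: can a room's comfort HVAC absorb the IT load?
    # The figures are the illustrative ranges from the text, not measurements.

    OFFICE_COOLING_W_PER_M2 = 75   # mid-range of the 50 to 100W/m2 cited above
    ROOM_AREA_M2 = 20              # hypothetical small office room
    RACKS = 1                      # hypothetical: a single edge rack
    HEAT_PER_RACK_KW = 3.5         # mid-range of the 3 to 4kW per rack cited above

    hvac_capacity_kw = OFFICE_COOLING_W_PER_M2 * ROOM_AREA_M2 / 1000
    it_load_kw = RACKS * HEAT_PER_RACK_KW

    print(f"Comfort HVAC capacity: {hvac_capacity_kw:.1f}kW")
    print(f"IT heat load:          {it_load_kw:.1f}kW")
    print(f"Overload factor:       {it_load_kw / hvac_capacity_kw:.1f}x")
    # With these numbers, 1.5kW of capacity must absorb 3.5kW: roughly 2.3x

The exact numbers will vary from room to room, but the ratio is the point: the comfort system is being asked to do a data centre's job.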

 

This mismatch is likely to have several repercussions:

  • Employees may be uncomfortable as the comfort system struggles to maintain a target temperature.
  • IT equipment (such as servers) often has thermal protection systems that trigger a shutdown if the temperature rises too high.
  • Overtaxing the comfort HVAC system by requiring it to operate continually above its rated cooling capacity will drive up operating costs over the long term and will probably cause early failure, as such systems are typically not engineered for continuous operation.

 

Another issue with typical office buildings is humidity. With doors, and potentially windows, opening and closing all the time, humidity levels can change constantly depending on conditions outside. This is not ideal for IT equipment. Dust is similarly harmful to IT equipment, which is why purpose-built data centres have air filtration systems that remove dust and other particulates from the air.

 

Beyond office environments, many companies need to install edge data centres on factory floors, in manufacturing facilities and in warehouses, all of which have widely differing environmental characteristics. These are typically large spaces of 200 square metres and upwards.

 

Warehouses face challenges in maintaining a constant temperature, as they are not usually well insulated or sealed. This can lead to excessive heat or cold inside the facility, as well as wide seasonal temperature fluctuations. Again, humidity can become an issue, especially in facilities with no ambient air control system in place. Uncontrolled environments are also likely to be even dustier than offices, and less likely to have particulate filtration in place. Dust combined with high humidity can be particularly harmful to IT equipment.

 

What are the cooling options?

 

Theoretically, the same cooling systems that apply in an office environment can also be used in uncontrolled locations, but most customers favour an approach that involves a sealed rack or row, which means dedicated rack- or row-based cooling.

 

The reason is simple: these self-contained systems essentially seal off the IT equipment from the outside air, thus protecting it from dust and humidity, while enabling the company to tightly control the temperature of the racks.

 

The key is to look for a system with a high Ingress Protection (IP) rating as defined in the International Electrotechnical Commission (IEC) 60529 specification, which covers mechanical and electrical enclosures intended to protect against intrusion, dust and water.

 

An IP54 rating, for example, means an enclosure offers strong protection against dirt, dust, oil and splashing water – all enemies of IT equipment. With such an enclosure, the IT equipment is isolated from the environment in which it's installed.
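
To make the notation concrete, here is a small Python sketch that decodes the two digits of an IP code (the protection descriptions are abbreviated summaries rather than the normative IEC 60529 wording, and the decode_ip helper is written here purely for illustration):

    # Sketch: decode the two digits of an IEC 60529 IP rating (Python 3.9+).
    # The first digit covers solids, the second liquids; the descriptions
    # below are abbreviated summaries, not the normative IEC wording.

    SOLIDS = {
        "4": "solid objects larger than 1mm",
        "5": "dust protected (limited ingress, no harmful deposit)",
        "6": "dust tight",
    }
    LIQUIDS = {
        "4": "splashing water from any direction",
        "5": "low-pressure water jets",
    }

    def decode_ip(rating: str) -> str:
        """Decode a rating such as 'IP54' into its two protection levels."""
        first, second = rating.upper().removeprefix("IP")[:2]
        return (f"{rating}: solids = {SOLIDS.get(first, 'see IEC 60529')}; "
                f"liquids = {LIQUIDS.get(second, 'see IEC 60529')}")

    print(decode_ip("IP54"))
    # IP54: solids = dust protected (limited ingress, no harmful deposit);
    #       liquids = splashing water from any direction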

 

Combined with a rack- or row-based cooling system, companies can tightly control the temperature of the IT equipment while also protecting it from potentially harsh surroundings. Such a setup can also be highly efficient, since IT equipment can withstand far higher temperatures than are comfortable for humans.

 

Operating the IT cooling system at the higher allowable temperature envelopes can save significant sums on cooling costs compared with running at temperatures closer to human comfort levels.
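
As a rough sense of scale, the sketch below applies an assumed rule of thumb of around 3% cooling-energy savings per degree Celsius of raised setpoint. That figure is an assumption for illustration only, not a Vertiv or industry-guaranteed number, and real savings depend heavily on the equipment, load and climate:

    # Rough illustration only. The 3%-per-degree-C saving is an assumed rule
    # of thumb, not a measured or published figure; actual savings vary widely.

    SAVINGS_PER_DEG_C = 0.03    # assumed fractional saving per degree C raised
    COMFORT_SETPOINT_C = 22     # a typical human-comfort setpoint
    IT_SETPOINT_C = 27          # upper end of a commonly used IT inlet envelope

    delta_c = IT_SETPOINT_C - COMFORT_SETPOINT_C
    saving = 1 - (1 - SAVINGS_PER_DEG_C) ** delta_c
    print(f"Raising the setpoint by {delta_c} deg C saves roughly {saving:.0%} "
          f"of cooling energy under these assumptions")
    # With the numbers above, the answer is roughly 14%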

 

Cooling solutions for any environment

 

As companies continue on their digital transformation journeys, they will be producing more data, which in turn will demand more edge data centres for processing. These facilities will need to be in close proximity to the data origination source, be it an office facility, warehouse, or manufacturing floor. Companies therefore also need racking and cooling strategies that can cover edge data centres located anywhere.
