Modern And Reliable
Data Centers

Sustainability and Efficiency
Two world-class hosting facilities, one in the United Kingdom and one in the United States. Complete disaster recovery and failover to one of our facilities is included as part of our comprehensive managed hosting solution.

London
East London is the site of the primary hosting facility. It sets new standards in energy efficiency, embracing cutting-edge technology to ensure sustainability, steadfast security and superior reliability.
Few people realise servers require a huge amount of power and that a vast amount of additional power is required to keep servers cool.
Power Usage Effectiveness (PUE) is the benchmark that gauges how green and efficient a data center truly is.
Less is more
It is important to use clean energy but equally important to use less energy.
PUE is a ratio that describes how efficiently a data center uses energy: specifically, how much of the total is consumed by the computing equipment itself, as opposed to the cooling and other overhead that supports it.
At one time, Google's PUE of 1.21 was thought to be as close to perfect as possible. But with advances in energy-efficient cooling technology and robust data center design, our latest hosting facility in East London is in a world-leading class of its own, designed to achieve a PUE of 1.05.

While the average data center today consumes 50% extra power to support cooling and overhead, our facility is already operating at an impressive 1.09 — using just 9% additional energy beyond what's needed to power the servers themselves. As the facility scales to full capacity, we expect this to improve even further, moving closer to the 1.05 design target.
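The arithmetic behind those figures is simple: PUE divides the total energy drawn by the facility by the energy delivered to the IT equipment. A minimal sketch, with illustrative numbers rather than measured values from the facility:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_load_kw

def overhead_percent(pue_value: float) -> float:
    """Extra energy spent beyond the IT load, as a percentage."""
    return (pue_value - 1.0) * 100

# Example: 1090 kW drawn by the whole site, 1000 kW of it by the servers.
p = pue(1090, 1000)
print(f"PUE = {p:.2f}, overhead = {overhead_percent(p):.0f}%")
# → PUE = 1.09, overhead = 9%
```

A PUE of 1.5 (the industry average cited above) therefore means 50% overhead, while the 1.05 design target means only 5% of extra energy goes to cooling and everything else.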
It starts with using the latest hyper-efficient technology available, such as EcoCooling's CloudCooler units, which combine free air cooling with evaporative cooling. On cold days, we can simply use the outside air to chill the servers. On hot days (yes, even those 40°C+ days) we pass the warm outside air over water-saturated pads, chilling it in the process. This is far more environmentally friendly than traditional air-conditioning chillers, with no nasty refrigerants needed, whilst still being 100% ASHRAE TC 9.9 compliant (read: mission-critical reliability). The water used to chill the air is almost entirely recycled, requiring only very occasional top-ups, so water scarcity is not a concern.
Within the data halls, cold-aisle containment is used, with Ziehl-Abegg high-efficiency fans channelling the cool air directly into contained aisles for the servers to draw in and expel as hot air from the other side. Compare this to a data center with no containment (a constant battle to re-chill hot air), or with hot-aisle containment, which, although more energy-efficient than no containment at all, still requires a much larger volume of air to be chilled.
Energy efficiency aside, the entire windowless facility has industry-leading physical security, with 2.4m-high razor-wire perimeter fences, door access control, motion detection, 24/7 CCTV and an on-site human presence.
The motion detectors fitted throughout are not only there for security: in another nod to energy efficiency, they also mean the lights need to be on only when someone is actually in the room.
Everything is fully redundant, with diverse mains power feeds, diverse fibre network supplies, N+1 diverse uninterruptible power systems and diesel generators with 8-hour day tanks, a 48-hour additional on-site fuel supply and a 24×7 continuous refuelling arrangement.
