The global colocation market is worth around $30 billion USD. But what exactly are colocation data center services?
Colocation (sometimes shortened to ‘colo’) is a method of hosting one company’s servers, storage and networking equipment in a data center operated by another company. Instead of running their own data center, companies ‘co-locate’ their equipment by renting space, power and telecoms connectivity in a multi-tenant data center. The key point is that the customer owns the IT equipment but rents the physical space, power and cooling required to host it within the colocation data center.
The five reasons shown below contribute to a compelling business case for including colocation in your digital transformation program: focus on your core competence whilst striking the right balance between control, risk and cost.
There are five main reasons a company might choose colocation over building its own data center. Usually one of the primary motives is the capital expenditure (CAPEX) avoided by not designing, building, operating, maintaining and updating such a specialised facility.
A second reason is that colocation data centers deliver much higher levels of physical security, including 24/7 security guards, mantraps and biometric access control. Other measures normally include controlled access to the immediate area around the building, secure doors, CCTV and further access control systems once inside the building.
A third consideration is the power and cooling resilience measures normally implemented by colocation data centers. These systems provide backup generation and UPS devices to protect against outages caused by utility grid failures, natural disasters and, ironically, system failures within the data center itself.
A fourth business driver for considering data center colocation is business flexibility. The ability to quickly scale space, power and other variables up and down according to business needs is a big competitive advantage, and it is not practical if you are running your own data center in house. Making a business case for building a facility two or three times as big as you need today, and leaving most of it empty, is very difficult. A colocation facility solves that issue because the provider can build at scale for multiple customers.
The fifth and final reason a company will choose colocation is business continuity in the event of a systems failure at its on-premises data center. Because on-premises (self-operated) data facilities are typically more vulnerable to outages, a company can plan ahead by hosting backup servers and storage at a colocation data center.
Colocation space is rented out in terms of ‘racks’, sometimes called ‘cabinets’. A rack is a standardized metal frame for mounting IT equipment such as servers, networking devices and storage. Equipment that is to be mounted in a rack is measured in vertical rack units (U), where one rack unit is 1.75 inches high. A full-size rack is typically 48U high, 600–800mm wide and 1000–1200mm deep.
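The rack-unit arithmetic above can be sketched in a few lines of Python. The equipment mix below (a hypothetical rack of 2U servers and 1U switches) is purely illustrative, not from any specific provider:

```python
RACK_UNIT_INCHES = 1.75  # one rack unit (1U) is 1.75 inches of vertical space


def free_units(rack_height_u: int, used_u: int) -> int:
    """Remaining vertical rack units after mounting equipment."""
    return rack_height_u - used_u


# Illustrative example: a 48U rack holding twelve 2U servers
# and two 1U network switches.
used = 12 * 2 + 2 * 1            # 26U occupied
remaining = free_units(48, used)  # 22U still available
print(remaining, remaining * RACK_UNIT_INCHES)  # 22 units, 38.5 inches free
```

This is why colocation quotes often distinguish between a ‘full rack’ and fractional offerings such as a half or quarter rack: the unit of sale is ultimately vertical U of mounting space plus the power to feed it.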
Colocation power is typically sold by the kWh (kilowatt-hour) and is measured as the amount of power the IT equipment is drawing “at the wall”, meaning the amount of power actually consumed by the customer. However, the data center consumes more power than this: it also uses energy to heat, light and cool the building, and loses energy in the electrical distribution system. Therefore, data centers typically charge an overhead multiplier of anything up to 2x on the power the customer equipment actually uses. This multiplier is normally driven by the data center’s efficiency, defined in a metric known as PUE (Power Usage Effectiveness), the ratio of total facility energy to IT equipment energy. Choosing a data center with a very high efficiency (a PUE close to 1.0) can have a big impact on the overall cost of the colocation contract.
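The effect of PUE on the power bill can be shown with a minimal sketch. The figures here (a 5 kW rack, an assumed price of $0.15/kWh, PUE values of 1.2 and 2.0) are illustrative assumptions, not quoted rates:

```python
def colocation_power_cost(it_kwh: float, pue: float, price_per_kwh: float) -> float:
    """Power charge: metered IT consumption scaled by the facility's PUE
    (total facility energy divided by IT equipment energy)."""
    return it_kwh * pue * price_per_kwh


# A rack drawing a constant 5 kW uses 5 * 24 * 30 = 3600 kWh in a 30-day month.
it_kwh = 5 * 24 * 30

efficient = colocation_power_cost(it_kwh, pue=1.2, price_per_kwh=0.15)
inefficient = colocation_power_cost(it_kwh, pue=2.0, price_per_kwh=0.15)
print(efficient, inefficient)  # 648.0 vs 1080.0 per month
```

Under these assumed numbers, the same IT load costs roughly two thirds more per month in the PUE 2.0 facility, which is why the efficiency figure deserves scrutiny when comparing colocation quotes.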
TPO CASE STUDY
TPO’s area of expertise is providing ‘temporary’ telecoms infrastructure, lasting weeks or even years, for music events and other challenging environments. Major infrastructure projects such as the E4 Stockholm bypass require absolute stability. Given the vast global audiences and mission-critical projects, 100% uptime is vital.