Colocation is a type of hosting in which customers pay to have their own servers housed within a data centre. Colocation services are available through some web hosts, who offer space in their private cages, or you can pay a data centre directly to have your server placed in a cage run by the data centre itself. Other equipment, such as networking and storage hardware, can also be colocated within server racks. Colocation is seen as a more enterprise-level alternative to dedicated server hosting because the hosted equipment is owned by the person or business paying the lease. More and more businesses are realising the benefits of colocation hosting, the main reason being that with regular dedicated server hosting you don’t receive the same level of support or technical know-how. Colocation hosting has many advantages over ordinary dedicated server hosting. For example, most data centres provide a ‘remote hands’ service for tasks that can only be done from the console, such as an OS reinstall. The connectivity your equipment receives is also likely to be better than on a dedicated server, since the equipment around yours is your own, so there are no other customers nearby who could be a burden on your connection speed. Colocation is often abbreviated to ‘colo’, and colocation centres themselves are often referred to as ‘carrier hotels’ because of the number of internet carriers they host, as well as the number of businesses whose servers are located within them.
Standard Features of a Colocation Centre
Colocation centres are often built in a specific way for the maximum benefit of the servers and equipment they host. Colocation centres are more or less ordinary data centres, and so always have some sort of fire suppression system; pipework ensures the suppressant can spread across the building rapidly, so that a fire, if it can be controlled, doesn’t damage too much of the building. Standard internet equipment is around 19 inches wide, as is an ordinary server, so both colocation and data centres keep a large collection of 19-inch data racks to cope with customer demand; most also carry a number of 23-inch data cabinets, which are used to hold any telecoms equipment customers may wish to host at the location. Most 19- and 23-inch cabinets are lockable to ensure the safety and security of the equipment they contain; this is ideal for large corporations or businesses whose servers hold highly sensitive information, since they can leave their equipment there with peace of mind. Because of the nature of both colocation and data centres, they contain a lot of cables (both data and power) so that the hosted equipment can be powered and connected to the internet; it is for this reason that you will always find overhead cable racks in both, allowing cables for additional equipment to be laid out easily. You might also find that the power for the equipment is contained in a separate rack, sometimes referred to as a power distribution rack; this is done to make the most of the space within both data and telecoms racks.
One main feature of either a colocation or a data centre is air conditioning, which keeps the hosted equipment cool. Air is normally pushed through the raised floors that both types of centre contain and released from under the cabinets holding the equipment, so that every piece of equipment receives cool air to keep its temperature down. The amount of cooling available can restrict how many servers or how much equipment can be hosted within the facility, because insufficient cooling creates a fire hazard; some assume it is the floor space (often quoted as square footage, since it is measured in square feet) that determines how much equipment a facility can host, but cooling capacity is usually the tighter constraint. Another measure used to keep equipment temperatures down is to build data floors with few or no windows; this prevents sunlight from reaching the equipment or the floor itself, which helps keep the temperature down, especially on hot summer days.
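To see why cooling rather than floor space tends to be the limiting factor, consider a rough capacity check. The figures below are purely illustrative assumptions, not data from any real facility; the point is that almost all electrical power drawn by IT equipment ends up as heat, so cooling capacity translates directly into a rack budget.

```python
# Hypothetical capacity check: which constraint caps the facility first?
# All figures below are illustrative assumptions.

COOLING_CAPACITY_KW = 500.0   # total heat the air conditioning can remove
POWER_PER_RACK_KW = 4.5       # average draw (~heat output) per full rack
FLOOR_SPACE_RACKS = 150       # racks that would physically fit on the floor

# Nearly all power drawn by servers is released as heat, so the cooling
# plant sets a hard ceiling on how many racks can safely run.
max_racks_by_cooling = int(COOLING_CAPACITY_KW / POWER_PER_RACK_KW)

# The real limit is whichever constraint bites first.
hostable_racks = min(max_racks_by_cooling, FLOOR_SPACE_RACKS)

print(max_racks_by_cooling)  # 111
print(hostable_racks)        # 111 - cooling, not floor space, is the limit
```

With these assumed numbers the floor could physically hold 150 racks, but the cooling plant can only support 111 of them, which matches the article's point that square footage alone doesn't determine capacity.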
Because of the nature of the equipment hosted within both colocation and data centres, a member of staff must escort a customer across the data floor to the cage in which their equipment is hosted, and must stay with the customer while they work on their equipment. This is because, in most cases, other customers’ equipment is contained within the same rack. If a customer has their own rack or cabinet within the colocation or data centre, they are normally provided with some type of access card to gain entry to the data floor, and then use their own set of keys to unlock the cabinet in which their equipment is located. Some colocation and data centres use clients’ biometrics as a pass key, which ensures the person entering the data floor is the customer they have on file. Both types of centre also contain many CCTV cameras, along with staff to monitor them, to ensure no unauthorised personnel gain access to the data floor. In terms of technical security, the use of equipment such as firewalls is down to your own specification, since you are more or less hosting your own equipment. In colocation and data centres where you share a rack with other customers, the host company will in most cases fit the rack with some sort of security device or firewall to protect all hosted equipment; if you have your own rack, however, you will most likely have to provide a firewall or a device of a similar standard yourself.
Power sources within a modern colocation or data centre are normally redundant, meaning that if one power supply fails, another source is available to keep the hosted equipment running. The method used by most facilities is to deploy diesel generators to power the servers and other hosted equipment in the event of a mains blackout; many facilities also employ a UPS (uninterruptible power supply) to bridge the gap between the blackout and the generators starting up. A UPS, in simple terms, is a backup battery that stores power for use during a blackout and recharges once mains power is restored. In some cases, large corporations with their own equipment racks or cages deploy their own UPS units amongst their equipment to complement the facility’s power services and achieve the best possible redundancy in the event of a blackout. For additional redundancy, most colocation and data centres have multiple connections to different locations within the local power grid, so that if one feed goes down there is at least one other available to power the equipment; this is valuable because it means the diesel generators and UPS banks don’t have to be relied upon.
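The failover order described above (mains first, then the UPS battery while generators spin up, then the generators themselves) can be sketched as a small decision function. This is a simplified illustration under assumed names and conditions, not how any real facility's transfer switch is programmed.

```python
# Sketch of the failover order: mains -> UPS battery -> diesel generator.
# Function name, parameters, and thresholds are illustrative assumptions.

def select_power_source(mains_ok, ups_charge_pct, generator_running):
    """Return which source should carry the load at this moment."""
    if mains_ok:
        return "mains"
    # Once the generator is up, prefer it and let the UPS recharge;
    # diesel can run for as long as fuel is delivered.
    if generator_running:
        return "generator"
    # The UPS is only a battery: it bridges the seconds-to-minutes gap
    # while the generators start, not a long-term supply.
    if ups_charge_pct > 0:
        return "ups"
    return "outage"

print(select_power_source(True, 100, False))   # mains
print(select_power_source(False, 80, False))   # ups (generator starting)
print(select_power_source(False, 60, True))    # generator
```

Note the ordering: the UPS only carries the load in the window between the blackout and the generator coming online, which is exactly the bridging role the article describes.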
Most colocation and data centres have multiple feeds to different bandwidth carriers for the best redundancy; if one connection fails, at least one other connection to the internet or service provider remains, keeping the hosted equipment connected and online. Some facilities contain rooms known as ‘meet me rooms’, since most peering points are located within data centres and colocation facilities so that corporate customers can achieve the best possible connections. The idea of a ‘meet me room’ is to enable all carriers at the data or colocation centre to exchange traffic efficiently. In some cases, a large internet exchange is hosted within the facility, where customers are able to link up for peering.
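The multi-carrier redundancy described above can be illustrated with a toy selection routine: probe each upstream feed and route via the first healthy one. The carrier names and the health-check mechanism here are made up for illustration; a real multi-homed facility would handle this with routing protocols such as BGP rather than an application-level check.

```python
# Toy sketch of multi-homed connectivity: given several upstream carrier
# feeds, pick the first one that passes a health check. Names are invented.

def choose_uplink(carriers, is_healthy):
    """Return the first carrier whose feed passes the health check."""
    for carrier in carriers:
        if is_healthy(carrier):
            return carrier
    return None  # every feed is down - the facility is offline

carriers = ["CarrierA", "CarrierB", "CarrierC"]
failed_feeds = {"CarrierA"}  # simulate one carrier's feed failing

active = choose_uplink(carriers, lambda c: c not in failed_feeds)
print(active)  # CarrierB - traffic fails over to the next working feed
```

Even with one feed down, the hosted equipment stays online via the next carrier, which is the whole point of taking feeds from multiple providers.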