{{Unreferenced|date=May 2007}}
[[Image:NetworkOperations.jpg|thumb|right|An operations engineer overseeing the network operations control room of a data center.]]
A '''data center''' is a facility used to house computer systems and associated components, such as telecommunications and storage systems. It generally includes redundant or backup power supplies, redundant data communications connections, environmental controls (e.g., air conditioning, fire suppression), and special security devices.

==History==
Data centers have their roots in the huge computer rooms of the early years of the computing industry. Early computer systems were complex to operate and maintain, and required a special environment in which to operate. Many cables were necessary to connect all the components, and methods to accommodate and organize these were devised, such as standard racks to mount equipment, elevated floors, and cable trays (installed overhead or under the elevated floor). Old computers also required a great deal of power and had to be cooled to avoid overheating. Security was important as well: computers were expensive and were often used for military purposes. Basic design guidelines for controlling access to the computer room were therefore devised.

During the boom of the microcomputer industry, and especially during the 1980s, computers started to be deployed everywhere, in many cases with little or no care for operating requirements. However, as [[information technology]] (IT) operations started to grow in complexity, companies grew aware of the need to control IT resources. With the advent of client-server computing during the 1990s, microcomputers (now called "servers") started to find their place in the old computer rooms. The availability of inexpensive networking equipment, coupled with new standards for network cabling, made it possible to use a hierarchical design that put the servers in a dedicated room inside the company. The use of the term "data center", as applied to specially designed computer rooms, started to gain popular recognition around this time.

The boom in data centers came during the [[dot-com bubble]]. Companies needed fast Internet connectivity and non-stop operation to deploy systems and establish a presence on the Internet. Installing such equipment was not viable for many smaller companies. In response, many companies started building very large facilities, called "Internet data centers" (IDCs), which provided businesses with a range of solutions for systems deployment and operation. New technologies and practices were designed to handle the scale and the operational requirements of such large-scale operations. These practices eventually migrated to private data centers and were widely adopted largely because of their practical results.

[[As of 2007]], data center design, construction, and operation is a well-established discipline. Standard documents from accredited professional groups, such as the [[Telecommunications Industry Association]], specify the requirements for data center design. Well-known operational metrics for data center availability can be used to evaluate the business impact of a disruption. Considerable development continues in operational practice and in environmentally friendly data center design.
==Requirements for modern data centers==
[[Image:Datacenter-telecom.jpg|thumb|left|Racks of telecommunications equipment in part of a data center.]]
IT operations are a crucial aspect of most organizational operations. One of the main concerns is '''business continuity''': companies rely on their information systems to run their operations, and if a system becomes unavailable, company operations may be impaired or stopped completely. It is necessary to provide a reliable infrastructure for IT operations in order to minimize any chance of disruption. Information security is also a concern, and for this reason a data center has to offer a secure environment that minimizes the chances of a security breach. A data center must therefore maintain high standards for assuring the integrity and functionality of its hosted computer environment.

==Data center classification==
The [http://www.adc.com/Library/Literature/102264AE.pdf TIA-942: Data Center Standards Overview] describes the requirements for data center infrastructure. The simplest is a Tier 1 data center, which is basically a computer room following basic guidelines for the installation of computer systems. The most stringent level is a Tier 4 data center, which is designed to host mission-critical computer systems, with fully redundant subsystems and compartmentalized security zones controlled by [[biometric]] access control methods.

==Physical layout==
[[Image:Rack001.jpg|thumb|right|A typical server rack, commonly seen in [[colocation]].]]
A data center can occupy one room of a building, one or more floors, or an entire building. Most of the equipment is often in the form of servers mounted in [[19 inch rack]] cabinets, which are usually placed in single rows forming corridors between them, allowing access to the front and rear of each cabinet. Servers differ greatly in size, from [[1U server]]s to huge storage silos that occupy many tiles on the floor. Some equipment, such as [[mainframe computer]]s and [[computer storage|storage]] devices, is often as big as the racks themselves and is placed alongside them.

The physical environment of the data center is usually under strict control:
* [[Air conditioning]] is used to control the temperature and humidity in the data center. [[ASHRAE]] recommends a temperature range of 20–25 °C and a humidity range of 40–60% as optimal for data center conditions.<ref>{{cite web|url=http://www.serverscheck.com/blog/2008/07/why-monitor-humidity-in-computer-rooms.html|title=ServersCheck's Blog on Why Humidity Monitoring|date=[[July 1]], [[2008]]}}</ref> The electrical power used by the electronic equipment is converted to heat, which is rejected to the ambient air in the data center space. Unless the heat is removed, the ambient temperature will rise, eventually resulting in electronic equipment malfunction. By controlling the space air temperature, the server components at the board level are kept within the manufacturer's specified temperature and humidity range. Air conditioning systems help control space [[humidity]] within acceptable parameters by cooling the return space air below the [[dew point]]. With too much humidity, water may begin to [[condensation|condense]] on internal components; if the air is too dry, [[electrostatics|static electricity]] discharge can damage components, so ancillary humidification systems may add water vapor to the space. (A minimal monitoring sketch follows this list.)
* Backup power is provided by one or more [[uninterruptible power supply|uninterruptible power supplies]] and/or [[Electrical generator|diesel generator]]s.
* To prevent [[single point of failure|single points of failure]], all elements of the electrical systems, including backup systems, are typically fully duplicated, and critical servers are connected to both the "A-side" and "B-side" power feeds. This arrangement is often made to achieve [[N+1 Redundancy]] in the systems: for example, a load that requires three UPS modules is served by four, so any single module can fail without interrupting power. Static switches are sometimes used to ensure instantaneous switchover from one supply to the other in the event of a power failure.
* Older data centers typically have [[raised floor]]ing made up of 60 cm (2 ft) removable square tiles; the trend is towards an 80–100 cm raised-floor void, for better and more uniform air distribution. The raised floor provides a [[plenum]] for air to circulate as part of the air conditioning system, as well as space for power cabling. Data cabling is typically routed through overhead [[cable tray]]s in modern data centers, but some practitioners still recommend routing cabling under the raised floor for security reasons, and to keep the space above the racks free for additional cooling systems in case that enhancement becomes necessary. Smaller or less expensive data centers without raised flooring may use anti-static tiles as a flooring surface.
* Data centers feature [[fire protection]] systems, including [[passive fire protection|passive]] and [[active fire protection|active]] design elements, as well as [[fire prevention]] programs in operations. [[Smoke detectors]] are usually installed to provide early warning of a developing fire by detecting particles generated by smoldering components before flames develop. This allows investigation, interruption of power, and manual fire suppression with hand-held fire extinguishers before the fire grows to a large size. A [[fire sprinkler system]] is often provided to control a full-scale fire if it develops, and [[clean agent]] gaseous fire suppression systems are sometimes installed to suppress a fire earlier than the sprinkler system would. Passive fire protection elements include the installation of [[fire walls]] around the data center, so that a fire can be restricted to a portion of the facility for a limited time if the active fire protection systems fail or are not installed.
* Physical security also plays a large role in data centers. Physical access to the site is usually restricted to selected personnel. [[Video camera]] surveillance and permanent [[security guard]]s are almost always present if the data center is large or if any of its systems contain sensitive information.
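To make the temperature and humidity targets above concrete, the following is a minimal monitoring sketch in Python. It is illustrative only: the numeric ranges come from the ASHRAE recommendation quoted above, while the function names and the example readings are hypothetical stand-ins for a real monitoring system's sensor API.

<source lang="python">
# Minimal environmental-monitoring sketch (illustrative only).
# Ranges from the text: 20-25 degrees C, 40-60% relative humidity.

TEMP_RANGE_C = (20.0, 25.0)        # recommended temperature range, Celsius
HUMIDITY_RANGE_PCT = (40.0, 60.0)  # recommended relative humidity, percent

def check_reading(name, value, low, high):
    """Return a warning string if a reading falls outside its range, else None."""
    if value < low:
        return f"{name} too low: {value} (minimum {low})"
    if value > high:
        return f"{name} too high: {value} (maximum {high})"
    return None

def check_environment(temp_c, humidity_pct):
    """Check one pair of sensor readings against the recommended ranges."""
    warnings = []
    for name, value, (low, high) in [
        ("temperature (C)", temp_c, TEMP_RANGE_C),
        ("relative humidity (%)", humidity_pct, HUMIDITY_RANGE_PCT),
    ]:
        warning = check_reading(name, value, low, high)
        if warning:
            warnings.append(warning)
    return warnings

# Example: a hot, dry reading (hypothetical values) triggers both warnings.
for warning in check_environment(temp_c=27.5, humidity_pct=35.0):
    print(warning)
</source>

In practice such checks run inside dedicated monitoring systems wired to the facility's sensors; the point here is only to make the recommended operating envelope concrete.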
==Network infrastructure==
[[Image:Paris servers DSC00190.jpg|thumb|left|An example of "rack mounted" servers.]]
Communications in data centers today are most often based on [[computer network|networks]] running the [[Internet protocol|IP]] [[protocol (computing)|protocol]] suite. Data centers contain a set of [[router]]s and [[Network switch|switch]]es that transport traffic between the servers and to the outside world. [[Redundancy (engineering)|Redundancy]] of the Internet connection is often provided by using two or more upstream service providers (see [[multihoming]]). Some of the servers at the data center are used for running the basic [[Internet]] and [[intranet]] services needed by internal users in the organization, e.g., [[e-mail]] servers, [[proxy server]]s, and [[Domain Name System|DNS]] servers.
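As an illustration of connection redundancy, the sketch below probes two upstream providers and reports which paths are up. This is a minimal sketch, not a description of any particular data center's tooling: the provider names, gateway host names, and the choice of TCP port are hypothetical placeholders, and a real multihomed site would rely on its routing protocols rather than application-level probes.

<source lang="python">
# Illustrative upstream-redundancy probe (hypothetical hosts and port).
import socket

UPSTREAM_PROBES = {
    "provider-a": ("gw-a.example.net", 179),  # placeholder gateway and port
    "provider-b": ("gw-b.example.net", 179),
}

def is_reachable(host, port, timeout_s=2.0):
    """Return True if a TCP connection to (host, port) succeeds in time."""
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False

def upstream_status():
    """Map each upstream provider name to its current reachability."""
    return {
        name: is_reachable(host, port)
        for name, (host, port) in UPSTREAM_PROBES.items()
    }

if __name__ == "__main__":
    status = upstream_status()
    for name, up in status.items():
        print(f"{name}: {'up' if up else 'DOWN'}")
    if not any(status.values()):
        print("ALERT: no upstream connectivity - site is isolated")
</source>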
Network security elements are also usually deployed: [[firewall (networking)|firewalls]], [[VPN]] [[Gateway (computer networking)|gateways]], [[intrusion detection system]]s, and so on. Monitoring systems for the network and for some of the applications are also common, as are additional off-site monitoring systems, in case of a failure of communications inside the data center.

==Applications==
[[Image:Floridaserversfront1.jpg|thumb|right|Multiple racks of servers, showing how a data center commonly looks.]]
The main purpose of a data center is running the applications that handle the core business and operational data of the organization. Such systems may be proprietary and developed internally by the organization, or bought from [[enterprise software]] vendors. Common examples of such applications are [[Enterprise resource planning|ERP]] and [[Customer relationship management|CRM]] systems. A data center may be concerned with just [[operations architecture]], or it may provide other services as well. These applications are often composed of multiple hosts, each running a single component. Common components of such applications are [[database]]s, [[file server]]s, [[application server]]s, [[middleware]], and various others.

Data centers are also used for off-site backups. Companies may subscribe to backup services provided by a data center, often in conjunction with backup tapes. Servers can be backed up locally onto tapes; however, tapes stored on site pose a security risk and are also susceptible to fire and flooding. Larger companies may therefore send their backups off site for added security, which can be done by backing up to a data center: encrypted backups can be sent over the Internet to a data center, where they can be stored securely.
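As a sketch of the encrypted off-site backup idea, the example below encrypts a backup archive before it leaves the building. It assumes the third-party Python <code>cryptography</code> package; the file names are hypothetical, the transfer step is left out, and a real deployment would need proper key management and streaming for large archives.

<source lang="python">
# Illustrative sketch: encrypt a backup archive before off-site transfer.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_backup(src_path, dst_path, key):
    """Encrypt the file at src_path and write the ciphertext to dst_path.

    Reads the whole archive into memory, so it suits modest backup sizes;
    large archives would need chunked or streaming encryption instead.
    """
    fernet = Fernet(key)
    with open(src_path, "rb") as src:
        ciphertext = fernet.encrypt(src.read())
    with open(dst_path, "wb") as dst:
        dst.write(ciphertext)

if __name__ == "__main__":
    # The key must be generated once, stored safely, and kept separate from
    # the encrypted backups; losing it makes the backups unrecoverable.
    key = Fernet.generate_key()
    encrypt_backup("nightly-backup.tar", "nightly-backup.tar.enc", key)
    # The .enc file can now be sent to the remote data center (e.g., over
    # SFTP or HTTPS) without exposing its contents in transit or at rest.
</source>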
For disaster recovery, several large hardware vendors have developed mobile solutions that can be installed and made operational in a very short time. Vendors such as [[Cisco Systems|Cisco]]<ref>{{cite web|title=Info and video about Cisco's solution|url=http://www.datacenterknowledge.com/archives/2008/May/15/ciscos_mobile_emergency_data_center.html|publisher=Data Center Knowledge|accessdate=2008-05-11|date=[[May 15]], [[2007]]}}</ref> and [[Sun Microsystems|Sun]] (see [[Sun Modular Datacenter]])<ref>{{cite web|url=http://www.sun.com/products/sunmd/s20/specifications.jsp|title=Technical specs of Sun's Blackbox|accessdate=2008-05-11}}</ref> have presented such solutions, and [[IBM]] has developed a [[modular]] system that can be used for this purpose.<ref>{{cite web|url=http://www.crn.com/hardware/208403225|publisher=ChannelWeb|accessdate=2008-05-11|title=IBM's Project Big Green Takes Second Step|first=Brian|last=Kraemer|date=[[June 11]], [[2008]]}}</ref>

==See also==
* [[Central apparatus room]]
* [[Colocation center]]
* [[Disaster recovery]]
* [[Electrical network]]
* [[HVAC]]
* [[Internet exchange point]]
* [[Network operations center]]
* [[Peering]]
* [[Server farm]]
* [[Server room]]
* [[Sun Modular Datacenter]]
* [[Telecommunications network]]

==References==
{{Reflist}}

==External links==
* [http://hightech.lbl.gov/datacenters.html Lawrence Berkeley Lab] - Research, development, demonstration, and deployment of energy-efficient technologies and practices for data centers

[[Category:Data management]]
[[Category:Servers]]

[[ar:الداتا سنتر]]
[[cs:Serverovna]]
[[de:Rechenzentrum]]
[[es:Centro de proceso de datos]]
[[fa:مرکز داده]]
[[fr:Centre de traitement des données]]
[[gl:Data Center]]
[[id:Pusat data]]
[[it:Centro elaborazione dati]]
[[lt:Duomenų centras]]
[[nl:Rekencentrum]]
[[ja:データセンター]]
[[pt:Centro de Processamento de Dados]]
[[ru:Датацентр]]