University Data Center
The data center is located in Pollak Library North. It contains one 160 kVA UPS and two 80 kVA UPS units, with generator failover to maintain constant power for servers and equipment. For cooling, the data center has five air handlers and an 80-ton chiller, with central plant water as failover. The data center also houses the campus PBX, but it mainly contains production servers. A production server is a server that is in a production state, meaning it is in use in the campus's best interest and must remain online at all times. Production servers include servers that provide web, storage, email, applications, services, Blackboard, and other supporting equipment that keep the campus networks, applications, and services operational. Servers placed in the data center in PLN-020 are supported with adequate cooling, power, and network connectivity at a minimum. While the data center is primarily used by the Division of Information Technology, it opens its doors to other divisions and campuses to provide central placement of servers and to provide redundancy and off-campus backup for disaster recovery.
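The UPS capacity described above can be summed with some simple arithmetic. This is a rough sketch; the 0.9 power factor is an assumed illustration value, not a measured rating for this equipment.

```python
# Rough capacity arithmetic for the UPS units described above.
# The power factor below is an assumption for illustration only.
ups_units_kva = [160, 80, 80]  # one 160 kVA unit plus two 80 kVA units
power_factor = 0.9             # assumed typical UPS power factor

total_kva = sum(ups_units_kva)
usable_kw = total_kva * power_factor

print(f"Total UPS capacity: {total_kva} kVA (~{usable_kw:.0f} kW at PF {power_factor})")
```

Under these assumptions the three units provide 320 kVA of protected capacity, or roughly 288 kW of real power.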
The university data center facilities provide for the reliable operation of the campus computing systems and computing infrastructure, including the campus portal, web servers, telecommunications systems, the campus network switching infrastructure, and the fiber and copper networks. The data center facilities involve several broad categories of tasks: maintenance, planning and deployment, and training, monitoring, and reporting. The UDC provides for 24 hours a day, 7 days a week operation of the data center facilities. The goal is also to expand use of the facilities to provide space for departmental systems from across the university, saving on power and staff labor and providing generally higher uptime for systems housed in the facilities.
Most training in the data center is impromptu, provided as needed to new employees who require access to the data center.
Possible training sessions include:
- General safety in the data center
- Using IPKVM (Internet Protocol Keyboard Video Mouse) for remote access to servers.
- Other possible training: cabling servers, monitoring systems (climate, servers, network, and power), and alerts and alarms in the data center.
Information Technology is focusing on virtualizing services within the data center. This allows us to retire multiple servers and lower the carbon footprint of our current load. It also lets us move services off physical servers that are now out of warranty, upgrade to newer 64-bit platforms from older server equipment that is not compatible, and avoid the cost of purchasing new servers to replace aging hardware. Virtual servers are also quicker to recover from a platform failure, reducing downtime. Most critical systems, such as Exchange, will not be virtualized, but as an example, some of the components used by Exchange will be.
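The savings from consolidation can be estimated with back-of-the-envelope arithmetic. All wattages and server counts below are hypothetical illustration values, not measurements from this data center.

```python
# Rough estimate of power saved by consolidating physical servers onto
# virtualization hosts. All figures are hypothetical examples.
def consolidation_savings_kw(physical_count: int, host_count: int,
                             avg_server_w: float, host_w: float) -> float:
    """Net power reduction in kW after retiring physical servers."""
    before_w = physical_count * avg_server_w
    after_w = host_count * host_w
    return (before_w - after_w) / 1000.0

# e.g. 40 aging servers at ~350 W each replaced by 4 hosts at ~750 W each
print(consolidation_savings_kw(40, 4, 350, 750))  # 11.0 kW saved
```

Each kilowatt of server load removed also reduces the cooling load, so the actual facility savings are larger than the raw server figure.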
Saving Power to Be Green
The main focus of saving power is cooling, server virtualization, and power monitoring. By metering power we can compare the power used to run the equipment with the power used to cool it. For cooling, we have made many changes to improve the data center cooling system and make better use of it. We started by creating hot and cold aisles and ducting the return air from the hot aisles through the plenum to each air handler. We continue to seal any holes used for cabling, forcing conditioned air into the cold aisles where we want it. IT then upgraded the chiller to a multi-stage chiller, and central plant wired the controls so they can better manage cooling between the dedicated chiller and the plant cooling system, which serves as the University Data Center's backup. Physical Plant is also looking into an automated system to adjust airflow, air handlers, the chiller, and plant cooling to help lower the data center's cooling costs. Currently IT is migrating physical servers to virtual servers. This is the best cost savings, since we can actually turn off systems, saving power and reducing the cooling requirements of the data center.
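The metering comparison described above is commonly expressed as Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. The kW figures in the sketch below are hypothetical sample readings, not metered values from this data center.

```python
# Sketch of the metering comparison above, expressed as PUE.
# Sample readings are hypothetical.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE of 1.0 means zero overhead; real facilities run higher."""
    return total_facility_kw / it_equipment_kw

# e.g. 300 kW total metered draw, 200 kW reaching servers and network gear
print(round(pue(300.0, 200.0), 2))  # 1.5
```

Tracking this ratio over time shows whether changes like aisle containment, sealing cable holes, and the multi-stage chiller are actually reducing cooling overhead relative to IT load.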