 California State University, Fullerton

 


University Data Center

The data center is located in Pollak Library North. It contains one 160 kVA uninterruptible power supply (UPS) and two 80 kVA UPS units, with a generator failover system to maintain constant power for servers and equipment. For cooling, the data center has five air handlers and an 80-ton chiller, and it uses central plant water as a failover. The data center houses the campus PBX, but it mainly contains production servers. A production server is one that operates in the campus's interest and must remain online at all times. Production servers provide web, storage, email, applications, services, Moodle, and other supporting systems that keep the campus networks, applications, and services operational. Servers placed in the data center are supported with adequate cooling, power, and network connectivity at a minimum. While the data center is primarily used by the Division of Information Technology, it opens its doors to other divisions and campuses to provide central placement of servers, redundancy, and off-campus backup for disaster recovery.
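For a sense of scale, here is a quick sketch of the combined UPS capacity. The kVA ratings come from the description above; the power factor is an assumed typical value for server loads, not a campus figure.

# Rough UPS capacity estimate for the configuration described above.
# The kVA ratings come from the text; the power factor is an
# illustrative assumption only.

UPS_RATINGS_KVA = [160, 80, 80]   # one 160 kVA unit plus two 80 kVA units
ASSUMED_POWER_FACTOR = 0.9        # assumed typical for modern server loads

total_kva = sum(UPS_RATINGS_KVA)
usable_kw = total_kva * ASSUMED_POWER_FACTOR

print(f"Total UPS capacity: {total_kva} kVA")
print(f"Approximate usable IT load: {usable_kw:.0f} kW at PF {ASSUMED_POWER_FACTOR}")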

The university data center facilities provide for the reliable operation of the campus computing systems and computing infrastructure, including the campus portal, web servers, telecommunications systems, computer network switching infrastructure, and the fiber and copper networks. The data center's work falls into five broad categories: maintenance, planning/deployment, training, monitoring, and reporting. The data center is operational 24 hours a day, 7 days a week. The goal is to expand the use of the facilities to provide additional space for departmental systems across the university, saving power and staff labor and providing generally higher uptime.

Training

Most training in the data center is impromptu, provided as needed to new employees who require access to the data center.

Possible training sessions include:

  • General safety in the data center
  • Using Internet Protocol Keyboard Video Mouse (IPKVM) remote access to servers.
  • Other possible training: cabling servers; monitoring systems (climate, servers, network, and power); and alerts and alarms in the data center (see the sketch below).
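To make the monitoring topic concrete, the following is a minimal sketch of a server reachability check that could feed an alerting system. The host names, ports, and timeout are hypothetical; this page does not describe the actual campus monitoring tools.

import socket

# Hypothetical host list -- the real monitored systems are not named here.
SERVERS = {
    "web-frontend": ("www.example.edu", 443),
    "mail-relay": ("smtp.example.edu", 25),
}

TIMEOUT_SECONDS = 3  # assumed threshold for declaring a server unreachable

def is_reachable(host: str, port: int) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT_SECONDS):
            return True
    except OSError:
        return False

for name, (host, port) in SERVERS.items():
    if not is_reachable(host, port):
        # In a real deployment this would page on-call staff or raise an alarm.
        print(f"ALERT: {name} ({host}:{port}) is not responding")
    else:
        print(f"OK: {name}")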

Current Projects

Information Technology is focusing on virtualizing services within the data center. This will allow us to retire multiple physical servers and lower the data center's carbon footprint. It also allows us to move off physical servers that are now out of warranty, or to upgrade from older server equipment to newer 64-bit platforms, without the cost of purchasing new servers to replace aging hardware. Virtual servers are also quicker to recover after a platform failure, resulting in less downtime. Critical systems such as Exchange will not be virtualized, but some of the components that Exchange uses will be.
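As a back-of-the-envelope illustration of why consolidation saves power, the sketch below compares the draw of standalone servers against a smaller number of virtualization hosts. All counts and wattages are assumptions for the example, not measured campus figures.

# Back-of-the-envelope power comparison for server consolidation.
# All counts and wattages are illustrative assumptions, not campus data.

physical_servers = 20          # aging physical servers to be virtualized
watts_per_physical = 400       # assumed average draw per old server

virtualization_hosts = 3       # newer hosts running the same workloads as VMs
watts_per_host = 750           # assumed draw per (busier) virtualization host

before_kw = physical_servers * watts_per_physical / 1000
after_kw = virtualization_hosts * watts_per_host / 1000

print(f"Before consolidation: {before_kw:.2f} kW")
print(f"After consolidation:  {after_kw:.2f} kW")
print(f"Estimated savings:    {before_kw - after_kw:.2f} kW "
      f"({(1 - after_kw / before_kw):.0%})")

Note that turning servers off also reduces the cooling load, so the real savings are larger than the raw IT-power difference alone.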

Saving Power to Be Green

The main areas of focus for saving power are cooling, virtualizing servers, and power monitoring. By metering power we can measure the power used to run the equipment separately from the power used to cool it. For cooling, we have made many changes to improve the data center cooling system and make better use of it. We started by creating hot and cold aisles, and by ducting the return air from the hot aisles through the plenum to each air handler. We continue to seal up any holes used for cabling so that cold air is delivered to the cold aisles, where we want it. IT then upgraded the chiller to a multi-stage chiller, and the central plant wired the controls so cooling can be better managed between the dedicated chiller and the plant cooling system, which serves as the University Data Center's backup. Physical Plant is also looking into an automated system that adjusts airflow, air handlers, the chiller, and plant cooling to help lower the data center's cooling costs. Currently IT is migrating physical servers to virtual servers. This provides the best cost savings, since we can actually turn systems off, saving power and reducing the data center's cooling requirements.
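Metering IT power separately from cooling power is the basis of the industry-standard Power Usage Effectiveness (PUE) metric. The page does not mention PUE by name, so the sketch below is an assumed illustration of what that metering enables; the readings are made up.

# Illustration of what separate power metering enables: computing
# Power Usage Effectiveness (PUE). The readings below are made up;
# the page describes the metering, not these numbers.

it_load_kw = 180.0       # assumed metered power delivered to IT equipment
cooling_kw = 90.0        # assumed power drawn by chillers and air handlers
other_facility_kw = 15.0 # assumed lighting, UPS losses, etc.

total_facility_kw = it_load_kw + cooling_kw + other_facility_kw
pue = total_facility_kw / it_load_kw

print(f"Total facility power: {total_facility_kw:.1f} kW")
print(f"PUE: {pue:.2f}  (1.0 would mean all power reaches IT equipment)")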