SACPC Hardware

The Centre operates a 128-node CM-5, a 20-processor Power Challenge, a high-capacity tape silo and three visualisation workstations. Each is connected to an ATM network by a 155 Mbit/second link.


CM5

Thinking Machines Corporation CM-5

The CM-5 is a high-performance supercomputer composed of processor nodes connected by an active "fat tree" network. Each node has a SPARC processor with local memory and high-performance vector units. The network provides extremely low latency and guarantees a minimum of 5 Mbyte/sec between any two nodes, climbing to 20 Mbyte/sec for adjacent nodes.

Our CM-5, Colossus, has 128 processor nodes. 32 of these were purchased by the Centre, 32 are on loan from the University of New South Wales, a further 32 were donated by the Australian National University and the final 32 were donated by the North Eastern Parallel Applications Centre at Syracuse University. It has 4Gb of physical memory and a 20Gb disk array. The front-end machine is a 150 MHz HyperSparc SparcServer 20 with 128Mb of physical memory. Colossus has a theoretical peak floating-point performance of 20 GFLOPS.
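As a rough illustration of the message-passing style of programming used on distributed-memory machines like the CM-5, the sketch below times a ping-pong between two processes to estimate point-to-point bandwidth. It assumes a standard MPI environment rather than the CM-5's own message-passing library, so it is illustrative only.

    /*
     * Illustrative only: a generic MPI ping-pong between two processes,
     * of the kind used to estimate point-to-point bandwidth on a
     * message-passing machine.  Run with at least two processes.
     */
    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define MSG_BYTES (1 << 20)   /* 1 Mbyte test message */
    #define REPS      100

    int main(int argc, char **argv)
    {
        int rank;
        char *buf = malloc(MSG_BYTES);

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        double t0 = MPI_Wtime();
        for (int i = 0; i < REPS; i++) {
            if (rank == 0) {
                MPI_Send(buf, MSG_BYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(buf, MSG_BYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(buf, MSG_BYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(buf, MSG_BYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
            }
        }
        double elapsed = MPI_Wtime() - t0;

        /* Total data moved is 2 messages per repetition. */
        if (rank == 0)
            printf("effective bandwidth: %.1f Mbyte/sec\n",
                   (2.0 * REPS * MSG_BYTES) / (elapsed * 1e6));

        free(buf);
        MPI_Finalize();
        return 0;
    }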


Power Challenge

Silicon Graphics Power Challenge

The Power Challenge is a shared-memory symmetric multiprocessor supercomputer composed of 64-bit 200MHz MIPS R10000 processors connected by a 1.2Gbyte/sec system bus.

Our Power Challenge, Titan, is a 20-processor machine equipped with 2Gb of physical memory and 77Gb of striped disk storage. Titan has a theoretical peak floating-point performance of 6.4 GFLOPS.
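As a minimal sketch of the shared-memory programming model such an SMP machine supports, the following loop uses OpenMP directives to spread work and a reduction across the available processors. OpenMP is assumed here purely for illustration and is not necessarily the toolset used on Titan.

    /*
     * Illustrative only: a shared-memory parallel loop in the style used
     * on SMP machines such as the Power Challenge.
     */
    #include <omp.h>
    #include <stdio.h>

    #define N 10000000

    int main(void)
    {
        static double x[N];
        double sum = 0.0;

        /* Each thread initialises and sums a share of the array;
         * the reduction clause combines the partial sums. */
        #pragma omp parallel for reduction(+:sum)
        for (long i = 0; i < N; i++) {
            x[i] = (double)i * 0.5;
            sum += x[i];
        }

        printf("threads available: %d, sum: %g\n",
               omp_get_max_threads(), sum);
        return 0;
    }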


Q47 Tape Silo

Transitional Technologies Inc. Tape Silo

The TTi Tape Silo is a mass storage device based on high-capacity tape drives.

Our silo, a model Q47, has two DLT (Digital Linear Tape) 4000 series drives housed within a tape stacker capable of holding sixty 20Gb tapes. Each drive has a throughput of 5 Mbyte/second to the host processor, which is a SparcServer 2 attached to the ATM network.
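A back-of-envelope calculation based on the figures quoted above (sixty 20Gb tapes, 5 Mbyte/second per drive) gives a feel for the silo's scale; the small program below simply prints the derived totals and is not part of any silo software.

    /* Back-of-envelope figures for the Q47 silo, using the capacities
     * and drive rate quoted above (assumed, not measured). */
    #include <stdio.h>

    int main(void)
    {
        const double tape_gb  = 20.0;   /* per-cartridge capacity, Gb     */
        const int    tapes    = 60;     /* stacker slots                  */
        const double rate_mbs = 5.0;    /* drive throughput, Mbyte/sec    */

        double total_gb      = tapes * tape_gb;                 /* 1200 Gb */
        double secs_per_tape = (tape_gb * 1000.0) / rate_mbs;   /* ~4000 s */

        printf("total capacity: %.0f Gb\n", total_gb);
        printf("time to stream one tape: %.0f s (~%.0f min)\n",
               secs_per_tape, secs_per_tape / 60.0);
        return 0;
    }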


Indigo2

Visualisation Workstations

The Centre operates three visualisation workstations, one at each of the partner universities. Appointments for access to these workstations may be made through the Director, Francis Vaughan, by telephone on 08-303-5592 or 0414-726247, or by email at francis@cs.adelaide.edu.au.

Our University of Adelaide workstation, Jeroboam, is a Silicon Graphics Indigo2 with an R4400 processor, 64Mb of physical memory, and a High Impact graphics system capable of 3D visualisation using the supplied stereo glasses. The workstations at Flinders University and the University of South Australia are Silicon Graphics Indys, each with an R4400 processor, 64Mb of physical memory and an XZ graphics system.
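As an illustration of how stereo 3D output of the kind mentioned above is typically driven from OpenGL, the sketch below requests a quad-buffered stereo visual and renders into the left and right back buffers. GLUT is assumed for window setup; this is a generic sketch, not a description of the Centre's visualisation software.

    /* Illustrative only: quad-buffered stereo rendering with OpenGL/GLUT. */
    #include <GL/glut.h>

    void display(void)
    {
        /* Draw the scene twice, once per eye, into the left and right
         * back buffers of a quad-buffered stereo visual. */
        glDrawBuffer(GL_BACK_LEFT);
        glClearColor(0.2f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        /* ... render left-eye view here ... */

        glDrawBuffer(GL_BACK_RIGHT);
        glClearColor(0.0f, 0.0f, 0.2f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        /* ... render right-eye view here ... */

        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        /* GLUT_STEREO requests a quad-buffered stereo visual; creation
         * fails on hardware without stereo support. */
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_STEREO);
        glutCreateWindow("stereo sketch");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }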


155Mbit/sec ATM Network

ATM Network

Our ATM switch is a Fore Systems ASX1000 with a 10Gbit/second backplane, which supports up to 64 OC-3c ports (each running at 155 Mbit/second).

This switch connects Colossus (the CM-5), Titan (the Power Challenge), Jeroboam (the Indigo2 workstation) and the TTi Tape Silo. It also provides communications infrastructure for the Distributed High-Performance Computing Project (a project of the Cooperative Research Centre for Research Data Networks) and carries that project's connection to Telstra's Experimental Broadband Network. Finally, the switch is directly connected to the SAARDNet 34Mbit/sec ATM backbone linking the three universities, so users within the universities can potentially gain access to the Centre's facilities at broadband data rates.


