Arctic Region Supercomputing Center

Established: 1993
Affiliation: University of Alaska Fairbanks
Director: Dr. Gregory Newby
Administrative staff: 30
Location: Fairbanks, Alaska, USA
Website: arsc.edu

The Arctic Region Supercomputing Center (ARSC) was a research facility of the University of Alaska Fairbanks (UAF) that operated from 1993 to 2015. Located on the UAF campus, ARSC offered high-performance computing (HPC) and mass storage to the UAF and State of Alaska research communities. Funding for ARSC operations was supplied primarily by UAF and augmented by external grants and contracts from sources such as the National Science Foundation[1] and Lockheed Martin[2] (through the Department of Defense High Performance Computing Modernization Program).

In general, the research supported with ARSC resources focused on the Earth's Arctic region. Common projects included Arctic weather modeling, Alaskan summer smoke forecasting, Arctic sea ice analysis and tracking, Arctic Ocean systems, volcanic ash plume prediction, and tsunami forecasting and modeling.

History

ARSC hosted a variety of HPC systems over its lifetime. The following is a list of the HPC systems acquired by ARSC (a brief sketch after the list shows how per-processor performance follows from these aggregate figures):

  • 1993 - Cray Y-MP named Denali with 4 CPUs and 1.3 GFLOPS; StorageTek 1.1 TB silo.
  • 1994 - Cray T3D named Yukon with 128 CPUs and 19.2 GFLOPS.
  • 1997 - Upgraded Yukon to a Cray T3E 600 with 88 CPUs and 50 GFLOPS.
  • 1998 - Cray J90 named Chilkoot with 12 CPUs and 2.4 GFLOPS; upgraded Yukon to a Cray T3E 900 with 104 CPUs; expanded the StorageTek silo to 330+ TB.
  • 2000 - Expanded Yukon to 272 CPUs and 230 GFLOPS; upgraded Chilkoot to a Cray SV1 with 32 CPUs and 38.4 GFLOPS; doubled the StorageTek hardware.
  • 2001 - IBM SP named Icehawk with 200 CPUs and 276 GFLOPS.
  • 2002 - Cray SX-6 named Rime with 8 CPUs and 64 GFLOPS; IBM p690 Regatta named Iceflyer with 32 POWER4 CPUs and 166.4 GFLOPS.
  • 2004 - IBM p690+/p655+ named Iceberg with 800 CPUs and 5 TFLOPS; Cray X1 named Klondike with 128 CPUs and 1.6 TFLOPS; Mechdyne MD Flying Flex four-projector virtual environment; two Sun Fire 6800 storage servers.
  • 2005 - Cray XD1 named Nelchina with 36 CPUs.
  • 2007 - Sun Opteron cluster named Midnight with 2312 CPUs and 12.02 TFLOPS; StorageTek SL8500 robotic tape library with 3+ PB capacity.
  • 2009 - Cray XT5 named Pingo with 3456 CPUs; BladeCenter H QS22 cluster with 5.5 TFLOPS and 12 TB filesystem.
  • 2010 - Penguin Computing cluster named Pacman with 2080 CPUs and 89 TB filesystem; Sun SPARC Enterprise T5440 server named Bigdipper with 7 PB storage capacity; Cray XE6 named Chugach with 11648 CPUs and 330 TB filesystem; Sun SPARC Enterprise T5440 server named Wiseman with 7 PB storage capacity; Cray XE6 named Tana with 256 CPUs and 2.36 TFLOPS.
  • 2011 - Expanded Pacman to 3256 CPUs and 200 TB filesystem.
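Taken at face value, these aggregate figures also imply each machine's approximate per-processor peak performance. The following is a minimal sketch in Python (the dictionary and helper code are illustrative, chosen here for clarity; only the numbers come from the list above) that derives per-CPU peak GFLOPS from the listed CPU counts and totals:

    # Minimal illustrative sketch: derive approximate per-CPU peak performance
    # from the aggregate figures in the list above. The numbers are taken from
    # the list; the dictionary and loop are hypothetical, not ARSC software.
    systems = {
        "Denali (Cray Y-MP, 1993)": (4, 1.3),            # (CPU count, total GFLOPS)
        "Yukon (Cray T3D, 1994)": (128, 19.2),
        "Midnight (Sun Opteron, 2007)": (2312, 12020.0), # 12.02 TFLOPS = 12,020 GFLOPS
    }

    for name, (cpus, gflops) in systems.items():
        # Per-CPU peak is simply the aggregate peak divided by the CPU count.
        print(f"{name}: {gflops / cpus:.3f} GFLOPS per CPU")

For example, Denali's 1.3 GFLOPS across 4 CPUs works out to roughly 0.325 GFLOPS per processor, consistent with early-1990s vector hardware.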

Dissolution

ARSC was dissolved on November 9, 2015, and its systems are now operated by the Research Computing Systems (RCS) unit at UAF's Geophysical Institute. RCS provides services similar to those formerly offered by ARSC,[3] including advanced computing, storage, data-sharing solutions, and research information technology support for the University of Alaska research community and the State of Alaska.

References

Coordinates: 64°51′36″N 147°50′57″W (64.8600°N, 147.8491°W)
