Locked amongst the electronics and cables in USC’s datacenter on Wheat Street is a supercomputer that whirs away, computing answers to the university’s biggest questions.
The computer, aptly named Hyperion after the titan of light and sun from ancient Greek mythology, is nearly 24 feet of seven-foot-tall black, boxy computer racks. There is a ceaseless hum of fans, a blinking of lights and a whoosh of air from the several industrial air conditioning units pumping under the reinforced floor.
This nearly six tons of technology is a powerful and rare machine, but its vast pursuits also represent the future. Supercomputer technology — also known as high-performance computing, or HPC — has become both a commodity and a necessity for research.
Hyperion offers a free solution for any kind of computational problem to the entire university, said Nathan Elger, the HPC team systems architect and a founding member of the research computing team that manages Hyperion.
“It kind of runs the gamut,” Elger said. “HPC is a really weird space of IT where we are always at the forefront of the next big thing.”
Hyperion has calculated mosquito population numbers, contributed to COVID research, spent days on astronomy problems, calculated spin orbits of electrons, helped students program virtual reality mapping of human MRI scans, studied philosophy, performed wastewater analysis and even helped identify and catalog shards of ancient Native American pottery.
It never shuts off completely, not even for maintenance, and is accessed almost exclusively remotely, Elger said.
It can do these tasks because of its immense computing power: in layman’s terms, it is a set of individual computers, called nodes, connected to form a single cluster that pools their resources.
The average laptop has around four to eight cores; Hyperion has 15,524. The average laptop has around 200 gigabytes of storage and less than 16 gigabytes of memory; Hyperion has millions of gigabytes of storage, graphics processing and memory.
The only other supercomputer in South Carolina is located at Clemson University. For universities and industries alike, this technology isn’t new, and it has been at the forefront of science since its inception. The first of Hyperion’s two generations was built in 2017, and together they have cost USC more than $4 million.
Hyperion is so advanced that keeping it that way is a struggle.
Paul Sagona is the executive director and a founding member of the HPC team. He describes staying at the forefront of science as increasingly demanding for both the team and the tech industry.
“We’re not on the cutting edge of technology, we are on the bleeding edge of technology,” Sagona said. “These industries are being pushed to go faster and faster, do whatever we can, whatever they can, to do larger-scale problems.”
For Sagona, it’s exciting being on this “bleeding edge.” But it is a challenge when the technology can become outdated in a matter of months. Hyperion’s benchmarks are measured about every six months, he said.
The team even signs non-disclosure agreements with tech companies for a preview of their newest products to stay on top of changes, Sagona said.
When Sagona and his team started, they looked after a 20-node cluster. Today, this flagship cluster has over 400 nodes and has more than quintupled in compute power, Elger said.
“It's going at lightspeed right now,” he said. “Basically, every discipline is using GPU (graphics processing unit) compute and AI (artificial intelligence) to accelerate or to explore new areas of their research.”
The technology has become so accessible that vendors such as Dell or Hewlett-Packard sell it basically preassembled, leaving only the challenging task of maintaining it efficiently, Elger said.
The job is always evolving, and so is the computer — with the right funding. Major updates are already in the works for the coming years. According to Sagona, the team is exploring ways to improve the cluster, including liquid cooling and stronger security.
But Hyperion’s true potential might be the role it will play for later generations, Elger said.
One example is Sadman Sadeed Omee, a first-year doctoral student researching deep-learning methods. Because the computer can be accessed remotely from anywhere, he was able to use it from his home in Bangladesh before he moved to South Carolina. He said access to such high-quality hardware was rare in his developing country, and it helped him overcome a barrier to his research.
Today, he is one of the most frequent users of the machine, and he probably would not have had the time to complete many of his research goals without it, he said.
For Jianjun Hu, an associate professor and graduate recruiting director for USC’s Department of Computer Science and Engineering, Hyperion has facilitated the discovery of completely new materials, work that otherwise might have taken years. Hu’s materials discovery research is one of only two comparable efforts in the country, the other being at MIT, he said.
Hu’s work is an example of how this technology has almost revolutionized the paradigms of research and scientific breakthroughs, he said. He’s also never visited Hyperion — he doesn’t need to.
“Essentially every discipline — medical engineering, science, public health — they all need this supercomputer to do the leading research,” Hu said.
Herein lies the potential of Hyperion and all supercomputers. AI and machine learning are driving many of today’s major breakthroughs on long-standing questions, and supercomputers are what make that work possible.
“Many of the breakthroughs depend on AI, and it depends on the supercomputer,” Hu said.