WILLIAMSBURG — When consumers think of what Graphics Processing Units (GPUs) are used for, video games usually top the list, followed by movies or other forms of digital entertainment like the “up-and-coming” Virtual Reality (VR) industry. It may come as a surprise, then, that GPUs are being used for a far wider range of applications that goes well beyond entertainment.
Some of these use cases include advances in Artificial Intelligence (AI), cryptocurrency mining, new forms of smart and security cameras, and, most recently, the study of COVID-19.
Advanced Micro Devices, better known as AMD, is one of the industry leaders in GPUs, competing with Intel and Nvidia for market share. AMD has helped twenty-three organizations in the health and biotech industries by training their researchers on its GPU hardware and software platform so they can develop solutions for vaccine development, genetic sequencing, and modeling of the outbreak.
AMD partnered with a group of instructors, including local William & Mary (W&M) professor Yifan Sun, to teach researchers how to use GPUs as an effective tool for their work.
“GPUs are originally designed for gaming, but nowadays they are a very important device for general-purpose computing like artificial intelligence, machine learning, data processing, physical simulations, signal processing, or cryptocurrency mining,” said Sun. “Any place where you need a lot of computing power you have to use GPUs.”
The health industry is using GPUs to run simulations of the pandemic. Some researchers study and try to predict how many people might get infected depending on what actions are taken. Others are working on vaccines and other types of biological research. With GPUs, medical researchers can simulate how the virus’s spike proteins may affect human cells. This kind of medical research demands the sort of computational power that GPUs provide.
“Nowadays if you buy a CPU [Central Processing Unit], it’s 4 cores, 8 cores, 32 cores at most. That means you can run 32 operations at the same time if there are 32 cores,” said Sun. “GPUs can easily have thousands of cores. This is the design difference. However, those cores are not as powerful as a CPU core, so it’s actually much harder to write a GPU program compared to a CPU program. It’s much harder to debug, it’s also much harder to tune the performance, and you probably have to write for a specific device.”
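To give a sense of what that looks like in practice, below is a minimal, hypothetical sketch of a GPU program written in HIP, the AMD programming language Sun is writing a book about; the names, sizes, and the vector-addition task itself are illustrative choices for this article, not drawn from Sun’s training materials. The sketch asks the GPU to add two lists of about a million numbers, giving each element its own thread, instead of looping through them a few at a time on CPU cores.

    // A hypothetical HIP sketch: each GPU thread adds one pair of numbers,
    // spreading roughly a million additions across thousands of cores.
    #include <hip/hip_runtime.h>
    #include <cstdio>
    #include <vector>

    __global__ void vector_add(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
        if (i < n) {
            c[i] = a[i] + b[i];
        }
    }

    int main() {
        const int n = 1 << 20;  // roughly one million elements
        std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

        // Copy the inputs into the graphics card's own memory.
        float *da, *db, *dc;
        hipMalloc((void**)&da, n * sizeof(float));
        hipMalloc((void**)&db, n * sizeof(float));
        hipMalloc((void**)&dc, n * sizeof(float));
        hipMemcpy(da, a.data(), n * sizeof(float), hipMemcpyHostToDevice);
        hipMemcpy(db, b.data(), n * sizeof(float), hipMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover every element at once.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        hipLaunchKernelGGL(vector_add, dim3(blocks), dim3(threads), 0, 0,
                           da, db, dc, n);
        hipDeviceSynchronize();

        // Copy the result back and spot-check it.
        hipMemcpy(c.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
        printf("c[0] = %.1f\n", c[0]);  // expect 3.0

        hipFree(da);
        hipFree(db);
        hipFree(dc);
        return 0;
    }

On a machine with AMD’s ROCm software installed, a sketch like this would typically be compiled with the hipcc compiler. Real research codes add error checking and careful performance tuning on top of this pattern, which is part of why researchers need training to use GPUs effectively.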
That’s one of the reasons why researchers require training to properly use GPUs. The use of high computational power to combat COVID-19 has been an international effort. Researchers have come from India, France, Germany, Canada, and the United States.
“If in the future everyone knows how to use this technology then whenever a new virus comes out we can get the information about the virus much quicker and the government can take quicker action,” said Sun. “The outbreak started in December of last year, but it was only in March that the U.S. government started the lockdown. If we have better computational power we can probably get the information much earlier and probably make the decision much earlier. There are a lot of politics involved in these decisions, but the thing that I can help with is with making faster decisions.”
Sun says these kinds of computationally intensive studies are widely used by pharmaceutical companies to develop therapies, drugs, and vaccines faster. The same approach is also being applied in cancer research.
At William & Mary, Sun teaches both an undergraduate and a graduate-level course. The undergraduate course, Computational Problem Solving, teaches students how to use the Python programming language to solve real-world problems.
“We want to teach students how to solve real-world problems with computers. So computers can basically automate a lot of things so you don’t need to manually calculate,” Sun said.
His graduate course is a Ph.D.-level course called Computer Architecture.
“It’s a course for basically learning how to build computer chips,” said Sun. “It can be GPUs, it can be CPUs, or it can be other types of chips. We basically learn how these chips are organized and how these chips work. We learn how to make them run faster, more reliable, and how to make them more secure.”
Sun is currently working on a book about the HIP programming language, which is what one needs to know in order to program AMD’s graphics cards.
More information on William & Mary’s computer science program can be found on the program’s website.