Many university scientists are frustrated by the limited amount of computing power available to them for research into artificial intelligence (AI), according to a survey of academics at dozens of institutions worldwide.
The findings1, posted to the preprint server arXiv on 30 October, suggest that academics lack access to the most advanced computing systems. This can hinder their ability to develop large language models (LLMs) and do other AI research.
In particular, academic researchers sometimes don’t have the resources to obtain enough powerful graphics processing units (GPUs) — computer chips commonly used to train AI models, which can cost thousands of dollars apiece. By contrast, researchers at large technology companies have bigger budgets and can spend more on GPUs. “Every GPU adds more power,” says study co-author Apoorv Khandelwal, a computer scientist at Brown University in Providence, Rhode Island. “While those industry giants might have thousands of GPUs, academics maybe only have a few.”
“The gap between academic and industry models is huge, but it could be a lot smaller,” says Stella Biderman, executive director at EleutherAI, a non-profit AI research institute in Washington DC. Research into this disparity is “super important”, she says.
To assess the computing resources available to academics, Khandelwal and his colleagues surveyed 50 scientists across 35 institutions. Of the respondents, 66% rated their satisfaction with their computing power as 3 or less out of 5. “They’re not satisfied at all,” says Khandelwal.
Universities have varying set-ups for GPU access. Some might have a central compute cluster shared by departments and students, where researchers can request GPU time. Other institutions might purchase machines for lab members to use directly.
Some scientists said that they had to wait days to access GPUs, and noted that waiting times were particularly high around project deadlines (see ‘Computing shortage’). The results also highlight global disparities in access. For example, one respondent mentioned the difficulties of finding GPUs in the Middle East. Just 10% of those surveyed said that they had access to NVIDIA’s H100 GPUs, powerful chips designed for AI research.
This barrier makes the process of pre-training — feeding vast sets of data to LLMs — particularly challenging. “It’s so expensive that most academics don’t even consider doing science on pre-training,” says Khandelwal. He and his colleagues think that academics provide a unique perspective in AI research, and that a lack of access to computing power could be limiting the field.
“It’s just really important to have a healthy, competitive academic research environment for long-term growth and long-term technological development,” says co-author Ellie Pavlick, who studies computer science and linguistics at Brown University. “When you have industry research, there’s clear commercial pressure and this incentivizes sometimes exploiting sooner and exploring less.”
The researchers also investigated how academics could make better use of less-powerful computing resources. They calculated the time it would take to pre-train several LLMs using low-resource hardware — with between 1 and 8 GPUs. Despite these limited resources, the researchers were able to successfully train many of the models, although it took longer and required them to adopt more efficient methods.
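The kind of wall-clock estimate described above can be sketched with a standard back-of-envelope rule: a dense transformer needs roughly 6 × parameters × tokens floating-point operations to pre-train. The rule of thumb, the per-GPU throughput figure and the utilization fraction below are generic assumptions for illustration, not the study’s actual methodology or numbers.

```python
# Back-of-envelope estimate of LLM pre-training time on a small GPU cluster.
# The "6 * params * tokens" FLOPs rule and the utilization figure are rough,
# widely used approximations -- not the paper's measured results.

def pretraining_days(params: float, tokens: float, n_gpus: int,
                     gpu_tflops: float = 312.0, utilization: float = 0.35) -> float:
    """Estimated wall-clock days to pre-train a dense transformer.

    params      -- model parameter count
    tokens      -- number of training tokens
    n_gpus      -- GPUs available (the study considered 1-8)
    gpu_tflops  -- peak per-GPU throughput in teraFLOP/s (~312 for an A100)
    utilization -- fraction of peak throughput actually achieved
    """
    total_flops = 6.0 * params * tokens            # forward + backward passes
    flops_per_second = n_gpus * gpu_tflops * 1e12 * utilization
    return total_flops / flops_per_second / 86_400  # seconds -> days

# Example: a 1-billion-parameter model on 20 billion tokens with 4 GPUs.
print(f"{pretraining_days(1e9, 20e9, 4):.1f} days")
```

Under these assumptions the run takes a few days rather than hours, which matches the article’s point: with fewer GPUs, academics can often still train the model, just over a longer period.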
“We can actually just use the GPUs we have for longer, and so we can kind of make up for some of the differences between what industry has,” says Khandelwal.
“It’s cool to see that you can actually train a larger model than many people would have assumed on limited compute resources,” says Ji-Ung Lee, who studies neuroexplicit models at Saarland University in Saarbrücken, Germany. He adds that future work could look at the experiences of industry researchers in small companies, who also struggle with access to computing resources. “It’s not like everyone who has access to unlimited compute gets it,” he says.