Edinburgh Researchers Unveil Tech to Boost AI Speed Tenfold

Researchers at the University of Edinburgh have developed a groundbreaking system that could speed up artificial intelligence (AI) data processing by a factor of ten. The advance is expected to have a significant impact on industries that rely on large language models (LLMs), including finance, healthcare and customer-service applications such as chatbots.

The innovation is built on wafer-scale computer chips, the largest chips of their kind and comparable in size to a medium chopping board. Because their processing cores and extensive memory sit together on a single piece of silicon, these chips can perform vast numbers of calculations simultaneously. Most AI systems today instead use graphics processing units (GPUs) linked together in networks, which forces data to travel between separate chips and limits both performance and efficiency.

WaferLLM: A New Era for AI Processing

The team at the University of Edinburgh has introduced a software system designed specifically for wafer-scale chips, called WaferLLM. In tests conducted at EPCC, the university’s supercomputing centre, third-generation wafer-scale processors running WaferLLM responded ten times faster than a cluster of 16 GPUs. The chips were also considerably more energy-efficient, consuming about half the energy of the GPUs when running LLMs.

According to Dr. Luo Mai, the lead researcher and a reader at the university’s School of Informatics, the potential of wafer-scale computing has long been clear, but software limitations have held back its practical application. He stated, “With WaferLLM, we show that the right software design can unlock that potential, delivering real gains in speed and energy efficiency for large language models.”

The research was peer-reviewed and presented at a leading symposium on operating systems design and implementation in July 2025. Professor Mark Parsons, director of EPCC and dean of research computing at the university, emphasized the significance of the findings, noting, “Dr. Mai’s work is truly ground-breaking and shows how the cost of inference can be massively reduced.”

Future Implications and Open-Source Accessibility

The implications of this research extend beyond the immediate performance gains. The team has released its software as open source, allowing other developers to build applications that take advantage of wafer-scale technology. This collaborative approach could foster innovation across sectors that depend on rapid data analysis and real-time intelligence.

As industries increasingly rely on AI for critical decision-making processes, the advancements made by the University of Edinburgh could pave the way for a new generation of AI infrastructure capable of meeting the demands of modern society. The potential applications in science, healthcare, education, and everyday life are substantial, marking a significant step forward in the field of artificial intelligence.
