AI is everywhere. It underpins the tools you use in your workplace, it determines the deals offered to you when you go grocery shopping, the customer service representative you reach out to is probably a chatbot, and it is increasingly trying its hand at journalism (although not in this case).
Another area where AI is becoming increasingly pervasive is the chipmaking industry. It currently takes between 18 months and two years to design a chip, and as compute requirements grow, the process is becoming ever more costly and time-consuming.
The manufacturing process, while less time-consuming, is no less complex and can involve hundreds of steps, making the shift from design to mass production deeply laborious. It’s therefore unsurprising that chip companies have started to dip their toe into the AI pool to see if the technology can bring efficiencies to the industry.
When it comes to chip design, Alan Priestley, VP analyst at Gartner, said that, on a very simplistic level, there are a number of things to consider: what you want the chip to do, which depends on the function of its logic blocks; the layout of the chip and the conversion of those logic blocks into transistors on the silicon surface; and then testing and validating the chip to ensure it does what is intended of it.
Within almost all of these steps, he said, AI tools could theoretically be deployed to speed up the design process, such as by automating layout optimization tasks like floor planning and routing, or by simulating the behavior of a chip in different scenarios, thus reducing the need for physical prototypes.
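Floor planning of the kind Priestley describes is, at heart, a combinatorial optimization problem, which is what makes it a natural target for automation. As a toy illustration only, the sketch below uses simulated annealing – one classic search technique, not necessarily what any vendor's tool actually runs – to place a few invented logic blocks on a small grid while minimizing total wire length; the block names, netlist, and grid size are all hypothetical.

```python
import math
import random

# Toy floor-planning sketch: place four invented logic blocks on a small
# grid and use simulated annealing to minimize the total wire length of
# the nets connecting them. Block names and the netlist are hypothetical.

BLOCKS = ["cpu", "cache", "io", "dsp"]
NETS = [("cpu", "cache"), ("cpu", "io"), ("cache", "dsp")]
GRID = 8  # an 8x8 placement grid

def wirelength(placement):
    """Sum of Manhattan distances over all nets (a standard placement cost)."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

def anneal(steps=5000, temp=5.0, cooling=0.999, seed=0):
    rng = random.Random(seed)
    cells = rng.sample([(x, y) for x in range(GRID) for y in range(GRID)],
                       len(BLOCKS))
    placement = dict(zip(BLOCKS, cells))  # random starting placement
    cost = wirelength(placement)
    for _ in range(steps):
        block = rng.choice(BLOCKS)
        old = placement[block]
        new = (rng.randrange(GRID), rng.randrange(GRID))
        if new in placement.values():
            continue  # target cell occupied; skip this move
        placement[block] = new
        delta = wirelength(placement) - cost
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            cost += delta
        else:
            placement[block] = old  # undo the move
        temp *= cooling
    return placement, cost

placement, cost = anneal()
print(cost)  # total wirelength; the minimum possible for this netlist is 3
```

Real placers juggle millions of cells plus timing, power, and congestion constraints, but the principle is the same: a search over an enormous space of possible layouts.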
Using emerging technologies to assist with chip design is not a new concept, Priestley argues, noting that the technology used to develop chips today is incredibly complex compared with that of yesteryear.
Priestley, who has an engineering degree and previously worked at International Computers and Intel, has some hands-on experience of how things have changed.
“If you go back in time, the first chip I ever designed was drawn on paper by hand, and when they made the masks, that was all cut out by hand. So, we've already added computer technology to assist with stuff like the layout design.”
Referencing modern-day CPUs, which can contain anything from a few million to billions of transistors, Priestley adds: “You can't design a current-generation chip using the techniques that were out there 30 or 40 years ago; you need sophisticated computer technology to design it today. Adding AI techniques into that design process is just the next step.”
In July 2023, speaking at the World Artificial Intelligence Conference (WAIC) in Shanghai, AMD’s CEO, Dr. Lisa Su, said that the company had already started to use artificial intelligence for designing chips, adding that she expected that AI-enabled tools would eventually dominate chip design as the complexity of modern processors continues to increase exponentially.
The company’s interest in AI-powered chip design was elaborated upon by Andrej Zdravkovic, senior vice president of GPU technologies and engineering software and chief software officer at AMD, in comments to DCD.
“AMD engineering teams are always looking for new ways to use cutting-edge technology in our design process. We have used predictive AI models for many years now and they have proved to be extremely useful in recognizing patterns and helped us improve productivity and reduce development time,” he said.
Zdravkovic explained that AMD has been deploying these models to help identify potential issues early in the hardware and software design process, providing the company with additional tools to make informed decisions.
“With the development of large language models and the explosion of generative AI, we have started to look at integrating AI into our workflow for the silicon and software design process to more efficiently deliver faster and more innovative designs,” he said, adding that AMD is also identifying how AI can help automate and optimize repetitive tasks, including checking and correcting RTL or software code for best practices, architecture, and security standards.
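The kind of repetitive code-checking Zdravkovic describes can be illustrated with a simple rule-based lint pass. To be clear, this is a hand-coded rule standing in for the AI model, and the Verilog snippet is invented for illustration; the point is only to show the category of task being automated.

```python
import re

# A classic RTL best-practice check, written as a simple rule-based lint
# pass. A hand-coded rule stands in here for the AI model AMD describes;
# the Verilog snippet is invented for illustration.

RTL = """
always @(posedge clk) begin
    q = d;
    count <= count + 1;
end
"""

def find_blocking_in_clocked(rtl):
    """Flag blocking assignments ('=') inside posedge-clocked always blocks,
    where non-blocking ('<=') assignments are the usual best practice."""
    issues = []
    in_clocked = False
    for lineno, line in enumerate(rtl.splitlines(), 1):
        if re.search(r"always\s*@\s*\(\s*posedge", line):
            in_clocked = True
        elif in_clocked and re.search(r"^\s*end\b", line):
            in_clocked = False
        elif in_clocked and re.search(r"[^<>=!]=[^=]", line):
            issues.append((lineno, line.strip()))
    return issues

for lineno, text in find_blocking_in_clocked(RTL):
    print(f"line {lineno}: blocking assignment in clocked block: {text}")
```

An LLM-based checker would aim to go beyond such fixed patterns, catching style, architecture, and security issues that are hard to express as regular expressions.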
The role of EDA
If you look beneath the hood, in a lot of use cases, AI is just being used to automate undeniably dull tasks. In that regard, AI-powered chip design is not a wholly new concept.
Electronic Design Automation (EDA) companies have been around for decades, with the earliest EDA processes being attributed to IBM in the 1950s. However, the continuous scaling of semiconductors is making them increasingly popular with chip manufacturers.
Founded in 1986, Synopsys is one such EDA company, supplying tools and services to semiconductor manufacturers. It made its first foray into the world of AI-powered chip design in 2020, when it launched a cloud-based AI software tool called DSO.ai, which uses reinforcement learning to automatically decide how best to place and route blocks of circuitry on a chip.
Previously, exhaustively exploring the design space simply wasn’t possible, because time limitations meant there were only so many iterations it was humanly possible to run, explains Arvind Narayanan, executive director of product line management at Synopsys’ EDA Group.
However, the company realized that by bringing AI into the process, it would provide its customers with the ability to run millions of combinations within the design space in a reduced time, allowing designers to achieve higher performance, lower power consumption, and smaller chip area with less manual effort.
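That style of design-space exploration can be sketched as a simple reinforcement-learning-style loop. In the toy below, an epsilon-greedy bandit samples combinations of two invented tool settings and learns which combination scores best against a made-up power/performance/area (PPA) objective. Synopsys' actual algorithms are proprietary; every name and number here is an assumption for illustration.

```python
import itertools
import random

# Bandit-style design-space exploration sketch. Each "arm" is a combination
# of two invented tool settings; the reward is a made-up PPA-like score.
# In a real flow the reward would come from synthesis and place-and-route
# runs, not from a formula.

DENSITIES = [0.6, 0.7, 0.8]          # hypothetical placement densities
CLOCK_TARGETS_GHZ = [2.0, 2.5, 3.0]  # hypothetical clock targets
ARMS = list(itertools.product(DENSITIES, CLOCK_TARGETS_GHZ))

def ppa_score(density, clock, rng):
    # Invented noisy objective whose optimum is density 0.7 at 2.5 GHz.
    return -10 * (density - 0.7) ** 2 - (clock - 2.5) ** 2 + rng.gauss(0, 0.02)

def explore(trials=300, eps=0.2, seed=1):
    rng = random.Random(seed)
    counts = {arm: 0 for arm in ARMS}
    means = {arm: 0.0 for arm in ARMS}
    for _ in range(trials):
        # Epsilon-greedy: mostly exploit the best-known settings,
        # occasionally try a random combination.
        if rng.random() < eps:
            arm = ARMS[rng.randrange(len(ARMS))]
        else:
            arm = max(ARMS, key=lambda a: means[a])
        reward = ppa_score(arm[0], arm[1], rng)
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # running average
    return max(ARMS, key=lambda a: means[a])

print(explore())  # converges on the objective's optimum, (0.7, 2.5)
```

The win is not the loop itself but the scale: when each "trial" is a multi-hour tool run, learning where not to look saves enormous amounts of compute and engineer time.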
Following the release of DSO.ai, in March 2023 the company launched Synopsys.ai, an AI-powered design automation suite that deploys generative AI across the entire EDA stack. The suite includes VSO.ai for verification, TSO.ai for testing, and ASO.ai for analog design, with the company essentially taking the architecture behind its DSO.ai offering and scaling and optimizing it for different workflows and parameters.
Synopsys’ full suite of AI-powered tools now forms an end-to-end solution that includes system architecture, design capture, verification, implementation, sign-off, testing, and silicon manufacturing.
The company has since partnered with AMD, Intel, and Nvidia, and counts organizations that design everything from chips for high-performance computing, to AI, mobile processors, automotive, and electronics among its customer base.
The challenges facing chip designers
Priestley said that EDA companies have traditionally partnered with smaller companies, as bigger organizations already have their own in-house development capabilities.
However, Narayanan said that the challenges currently facing the semiconductor industry are arguably more complex than ever before, meaning that companies of all sizes and from all industries are now turning to EDA organizations and AI tools to help them address some of these issues.
Most conversations about the biggest challenge facing the industry will involve a discussion of how much smaller chips can get. Currently, the most advanced chips in production are built on 3nm processes, but the race to mass-produce 2nm chips is officially underway, with availability expected from 2025.
However, traditional lithography processes – the action of creating minute patterns on a silicon wafer – are reaching their limits because, as transistors have grown smaller, this process has required increasingly complex calculations to work out how to operate at such a small scale.
One company that has tried to tackle this problem is Nvidia, which launched cuLitho, a software library for computational lithography that shortens chip development times, at its GTC conference in March 2023.
At launch, the company said cuLitho would facilitate the development of chips with tinier transistors and wires than is currently achievable, while accelerating time to market and improving the energy efficiency of the data centers that run alongside semiconductor fabs as part of the manufacturing process.
At its 2024 conference, Nvidia said the cuLitho library was now being used by TSMC and Synopsys for production chip development. The company also claimed that generative AI had allowed cuLitho to deliver a 2× performance boost on top of the accelerated computing it originally offered, which itself was a leap of up to 40× beyond conventional lithography processing.
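To see why those lithography calculations balloon, consider a deliberately tiny model – everything in it (the mask pattern, kernel weights, and threshold) is invented for illustration. The printed image is approximated by convolving the mask with an optical blur kernel and thresholding the result; real tools perform this with far richer physics across billions of mask pixels.

```python
# Toy aerial-image model: why computational lithography is so compute-heavy.
# At tiny feature sizes light diffracts, so the printed pattern is roughly
# the mask convolved with an optical blur kernel, evaluated over every
# pixel of a full-chip mask. The mask, kernel weights, and threshold here
# are invented; production optical models are far more elaborate.

mask = [0, 0, 1, 1, 1, 0, 0, 1, 0, 0]  # 1 = light passes through the mask
kernel = [0.3, 0.4, 0.3]               # stand-in point-spread function

def aerial_image(mask, kernel):
    """Convolve the mask with the kernel (zero-padded at the edges)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(mask)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(mask):
                acc += w * mask[j]
        out.append(acc)
    return out

def printed(image, threshold=0.5):
    """The resist exposes wherever the light intensity clears a threshold."""
    return [1 if v >= threshold else 0 for v in image]

print(printed(aerial_image(mask, kernel)))
# → [0, 0, 1, 1, 1, 0, 0, 0, 0, 0]: the wide line prints, but the isolated
# feature at index 7 is lost – the kind of error that mask-correction
# calculations exist to fix.
```

Correcting the mask so that every feature prints means running this kind of simulation, inverted, over and over – which is exactly the workload GPU acceleration and AI are being applied to.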
Separately, in October 2023, Nvidia announced it had developed ChipNeMo, a large language model (LLM) to help its employees design chips.
Another challenge facing the semiconductor industry that AI could help to address is the talent shortage.
As is the case across most of the technology sector, the semiconductor industry is desperately lacking talent, with a 2022 report from Boston Consulting Group (BCG) titled ‘The Growing Challenge of Semiconductor Design Leadership’ stating that at the current rate of growth, demand for design workers in the semiconductor industry will exceed supply by nearly 35 percent in 2030.
Consequently, BCG notes that design leaders must leverage “new and future technologies” that are critical for design innovation, including AI.
“Designers can more quickly and effectively meet power, performance, and area targets by leveraging AI-based tools. Reinforcement learning and other AI algorithms can automate less consequential design tasks, freeing engineers to focus on more advanced tasks and decisions,” the report reads.
Narayanan said that AI has already made significant inroads when it comes to chip design but he believes there is so much more that can be achieved with the technology, not just in terms of workflow optimization but through the wider benefits it can bring to the industry.
Narayanan references the BCG report and says that the talent shortage is a very real issue that Synopsys’ customers are currently grappling with.
“They see that there is so much potential [for AI] in improving productivity and that's the direction most of the companies are headed,” he said.
“[Chip companies] all have the same set of challenges. What they’re designing doesn't matter, they all have the same set of challenges. So, when you have technology that can increase the productivity of your existing workforce, it would be foolish not to look at it and adopt it.”
“I don't think [AI is] a difficult sell. They're all sold on it already,” Narayanan said.
Will the benefits of AI outlive the hype?
As is the case with most emerging technologies, it can be hard to judge whether AI’s benefits will outlive the hype.
For now, it appears that the AI bubble won’t burst any time soon. However, that doesn’t mean all caution should be thrown to the wind just yet.
Gartner’s Priestley is more pessimistic than most, saying that we’re reaching the peak of the hype cycle and, as is often the case, it could very well all come crashing down.
While he acknowledges that AI will undoubtedly accelerate the design process, he warns that the technology is something of a black box, so the challenge lies in trusting its output. As numerous examples of questionable content created by generative AI have proven, he says, you can’t always guarantee that the requests you make will generate the correct response.
“That’s the big issue here because chip design is expensive,” Priestley said. “Converting a theoretical design to practicality costs hundreds of millions of dollars, so you need to do as much work as possible upfront so you can be certain that everything works and there are no surprises. Especially as the more complex the chip is, the harder it is to test it.”
AMD’s Zdravkovic echoed Priestley’s point that AI will accelerate the design process, but warned that human interaction should remain part of the solution, as chip design requires an “in-depth understanding of the complete design space and well-defined interaction and dependency of all system parameters.”
Looking to the long term, however, Zdravkovic’s comments reflected those Su made in 2023: while recent advances in AI and the new capabilities it offers are nothing short of phenomenal, it feels like the industry has only just started to scratch the surface.
“I firmly believe that over the years we’ll move most of the repetitive design work to advanced AI tools, while freeing up our engineers to do creative tasks, and invent new advanced silicon and software architectures,” he said.
For Narayanan, the glass isn’t half full but overflowing. He said that, among Synopsys’ customers at least, where there might once have been some skepticism (as is often the case with any new technology), that has evaporated as customers have seen the value AI can bring to chipmakers.
He said that organizations are now realizing that they can do even more without having to grow their workforce, and can accomplish these growing workloads in the same amount of time or even shorter windows of time.
That's where he says Synopsys has seen AI play the most critical role, bridging the gap between what customers need to do and what they can do.