The end of Moore's Law is fast approaching. Engineers and designers can only go so far in shrinking transistors and packing more of them onto a chip. They are therefore turning to other chip design approaches, incorporating technologies such as AI into the process.
For example, Samsung is adding artificial intelligence to its memory chips to enable processing in memory, thereby saving energy and accelerating machine learning. Speaking of speed, Google's TPU v4 AI chip offers roughly twice the processing power of its predecessor.
But artificial intelligence still holds greater promise and potential for the semiconductor industry. To better understand how AI can revolutionize chip design, we interviewed Heather Gorr, Senior Product Manager for the MATLAB platform at MathWorks.
Q: How is AI currently being used to design next-generation chips?
Heather Gorr: Artificial intelligence is a very important technology because it's involved in most of the life cycle, including the design and manufacturing processes. There are many important applications, even in general process engineering, where we want to optimize things. I think defect detection is an important task at all stages of the process, especially in manufacturing. But AI is already playing a significant role earlier in the design process as well, when you're designing the lights, sensors, and all the different components. You really need to consider a lot of anomaly detection and fault mitigation.
Then, considering the logistics models you see in any industry, there's always planned downtime you want to reduce, but you'll also eventually encounter unplanned downtime. So by looking back at historical data, at the times when manufacturing something took a little longer than expected, you can examine all that data and use AI to try to identify root causes, or to see what might have jumped out even during the processing and design phases. We often think of AI as a predictive tool, or as a robot that does something, but often you gain a lot of insight from data through AI.
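The pattern Gorr describes, mining historical process data for runs that took longer than expected, can be sketched in a few lines. The example below is a toy illustration with synthetic data; the rolling window size and z-score threshold are assumptions for demonstration, not anything from the interview:

```python
# Illustrative sketch: flagging anomalous cycle times in historical
# manufacturing data with a rolling z-score. The data is synthetic and
# the window/threshold values are assumed for demonstration.
import statistics

def rolling_zscore_anomalies(values, window=20, threshold=3.0):
    """Return indices whose value deviates more than `threshold`
    standard deviations from the preceding `window` observations."""
    anomalies = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        mean = statistics.fmean(history)
        stdev = statistics.stdev(history)
        if stdev > 0 and abs(values[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# Synthetic cycle times (minutes) with one injected slowdown.
cycle_times = [10.0 + 0.1 * (i % 5) for i in range(60)]
cycle_times[45] = 14.0  # a run that "took a little longer than expected"
print(rolling_zscore_anomalies(cycle_times))  # → [45]
```

Real fab data would of course be messier, but the idea is the same: the model learns what "normal" looks like from history and surfaces the runs that don't fit.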
Q: What are the benefits of using AI for chip design?
Gorr: Historically, we've seen a lot of physics-based modeling, which is a very computationally intensive process. Instead of solving such an expensive, detailed model directly, we want a reduced-order model, something cheaper. You can build a surrogate model, an approximation trained on data generated by the physics-based model, and then use the surrogate for parameter sweeps, optimization, and Monte Carlo simulations. This takes far less computational time than solving the physics-based equations directly. So we see this benefit in many ways, including the efficiency and cost-effectiveness of rapid, iterative experiments and simulations that truly aid design.
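The surrogate-model workflow Gorr outlines can be illustrated with a toy example: sample an expensive model a handful of times, fit a cheap stand-in, then run the large Monte Carlo sweep on the stand-in. Everything below (the placeholder "physics" function, the piecewise-linear surrogate, the sample counts) is invented for demonstration; real workflows typically use polynomial, Gaussian-process, or neural-network surrogates:

```python
# Hedged sketch of surrogate modeling: a few expensive evaluations,
# a cheap approximation, then a large sweep on the approximation.
import bisect, math, random

def expensive_physics_model(x):
    # Placeholder for a costly simulation (e.g. a finite-element solve).
    return math.exp(-x) * math.sin(3 * x)

# 1. Sample the expensive model at a few design points only.
xs = [i / 10 for i in range(11)]          # 11 evaluations total
ys = [expensive_physics_model(x) for x in xs]

def surrogate(x):
    """Cheap piecewise-linear stand-in for the physics model."""
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

# 2. Monte Carlo sweep on the surrogate: thousands of cheap evaluations
# instead of thousands of expensive simulations.
random.seed(0)
samples = [surrogate(random.uniform(0, 1)) for _ in range(10_000)]
best = max(samples)
```

The design choice is the trade Gorr mentions: the surrogate is less accurate than the physics model, but it makes sweeps and optimization affordable.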
Q: So in a sense, it's like having a digital twin?
Gorr: Exactly. That's pretty much what people are doing. You have a model of the physical system and experimental data. You combine them, and then you have another model that you can keep tweaking, trying different parameters and experiments, sweeping through all those different cases, and ultimately arriving at a better design.
Q: So, it will be more efficient, and, as you said, cheaper?
Gorr: Yes, absolutely. Especially in the experimental and design phases, you're trying out different things. If you were actually manufacturing and producing [chips], that would obviously save significant costs. You want to simulate, test, and experiment as much as possible without having to use actual process engineering to build things.
Q: We've already discussed the benefits. What about the drawbacks?
Gorr: [AI-based models] are often less accurate than physics-based models. That's why you run so many simulations and parameter sweeps, of course. But that's also something to keep in mind with a digital twin: it won't be as accurate as the precise physics-based models we've developed over the years.
Chip design and manufacturing are both systems-intensive; you have to consider every single part. It's really challenging. In this case, you might have models to predict certain things and their different parts, but you still need to put them together.
Another thing to consider is that you need data to build the model. You have to integrate data from various different sensors and different types of teams, which adds to the challenge.
Q: How can engineers use AI to better prepare for and extract insights from hardware or sensor data?
Gorr: We've been thinking about using AI to predict things or do some robotic tasks, but you can use AI to figure out patterns and pick out things you might not have noticed before. People use AI when they have high-frequency data from many different sensors, and it's often useful to explore the frequency domain and things like data synchronization or resampling. These can be very challenging if you're unsure where to start.
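The resampling and synchronization Gorr flags as common stumbling blocks amount to putting differently sampled signals onto one shared timeline. Here is a minimal sketch with hypothetical sensor rates and signals (my own toy example, not a MathWorks workflow):

```python
# Illustrative sketch: synchronizing two sensors sampled at different
# rates by resampling both onto a common, uniform timeline.
def resample(times, values, new_times):
    """Linearly interpolate (times, values) onto new_times.
    Assumes `times` is sorted and new_times lies within its range."""
    out, j = [], 0
    for t in new_times:
        while j < len(times) - 2 and times[j + 1] < t:
            j += 1
        t0, t1 = times[j], times[j + 1]
        v0, v1 = values[j], values[j + 1]
        out.append(v0 + (v1 - v0) * (t - t0) / (t1 - t0))
    return out

# Sensor A at 10 Hz, sensor B at 4 Hz (hypothetical signals and units).
times_a = [i * 0.10 for i in range(11)]       # 0.0 .. 1.0 s
vals_a  = [t * 2.0 for t in times_a]
times_b = [i * 0.25 for i in range(5)]        # 0.0 .. 1.0 s
vals_b  = [t * t for t in times_b]

common = [i * 0.2 for i in range(6)]          # shared 5 Hz timeline
a_sync = resample(times_a, vals_a, common)
b_sync = resample(times_b, vals_b, common)
```

Once both signals share a timeline, they can be compared sample by sample or transformed into the frequency domain together.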
One thing I want to emphasize is to use the tools available. There's a huge community working on this, and you can find tons of examples [of applications and technologies] on GitHub or MATLAB Central where people share great examples, even small applications they've created. I think many of us are immersed in data but unsure how to process it, so definitely take advantage of what's already in the community. You can explore and learn what makes sense to you and strike a balance between domain knowledge and the insights you gain from tools and AI.
Q: What should engineers and designers consider when using AI for chip design?
Gorr: Think about what problem you're trying to solve, or what insights you're hoping to find, and work from there. Consider all the different components, and document and test each part. Consider everyone involved, and explain and hand off information in a way that makes sense to the whole team.
Q: How do you think AI will affect the work of chip designers?
Gorr: It will free up a lot of human capital for more advanced tasks. We can use AI to reduce waste, optimize materials, and improve design, but human involvement is still needed when making decisions. I think this is a great example of people and technology working together. It's also an industry where everyone involved—even on the manufacturing floor—needs to have some level of understanding of what's going on, so it's a great industry for advancing AI because of how we test things and how we think about them before we put them on a chip.
Q: What are your views on the future of artificial intelligence and chip design?
Gorr: It largely depends on the human factor: getting people involved in the process and having an interpretable model. We can do a lot with the mathematical details of modeling, but it depends on how people use it, how everyone in the process understands and applies it. Communication and participation from people of all skill levels will be crucial. We will see fewer hyper-precise predictions and more transparency in information sharing and in digital twins, not just using artificial intelligence, but also drawing on our human knowledge and all the work that many people have done over the years.