Why And How Modern Chip Design Is Automated


By John Werner, Contributor 🕒︎ 2025-11-03



In some of the more sci-fi imaginings of the AI era, there is a vision of “robots making robots”: software and hardware with generative capabilities essentially replicating themselves. But that, it seems, is an overly simplistic way to look at automation turned back on the technology itself. The building of semiconductors, at least, has been increasingly automated over the last half century.

Electronic design automation, or EDA, has been around since IBM introduced it in the 1950s. But it has been supercharged by advances in the engineering prowess of automated systems until, as LLMs emerged, the software could do most of the heavy lifting. Fast forward to today, and the delegation of design mechanics is so profound that human teams admit they don’t really know exactly how the design is being done, as covered in this piece by Tim Danton, who suggests that we find the new AI designs “weird” and acknowledges that “humans cannot really understand them — but they perform better than anything we've created.”

“AI models have, within hours, created more efficient wireless chips through deep learning,” Danton writes, “but it is unclear how their ‘randomly shaped’ designs were produced.”
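What does it look like when a machine, rather than an engineer, produces a design? Below is a minimal, hypothetical sketch of the kind of optimization loop loosely behind such results: a program mutates a vector of geometry parameters, scores each candidate with a simulator, and keeps whatever wins. Everything here, including the `simulate_performance` stub, is invented for illustration; the actual research used trained deep-learning models and full electromagnetic solvers, not this toy objective.

```python
import numpy as np

# Toy stand-in for a physics simulator. A real inverse-design flow would
# call an electromagnetic solver or a trained deep-learning surrogate here;
# this objective is invented purely for illustration.
def simulate_performance(geometry: np.ndarray) -> float:
    target = np.linspace(-1.0, 1.0, geometry.size)        # hypothetical ideal response
    mismatch = float(np.sum((geometry - target) ** 2))
    ripple = 0.1 * float(np.sum(np.sin(5.0 * geometry)))  # awkward local structure
    return -(mismatch + ripple)  # higher score = better design

rng = np.random.default_rng(seed=0)
design = rng.uniform(-1.0, 1.0, size=64)  # 64 free geometry parameters
best_score = simulate_performance(design)

# Hill climbing: perturb the design, re-simulate, keep any improvement.
for _ in range(5_000):
    candidate = design + rng.normal(scale=0.05, size=design.size)
    score = simulate_performance(candidate)
    if score > best_score:
        design, best_score = candidate, score

print(f"best simulated score: {best_score:.4f}")
```

The point is not the particular search method; it is that the optimizer answers only to the simulator's score. Nothing constrains the result to look like a human-drawn layout, which is one reason machine-optimized designs can strike their own creators as “randomly shaped.”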
Business Process Re-Alignment? Or Business Process Transformation?

As for the way that designs move through the pipeline, that has changed, too. Take a look at this article at EE Times, where writer Jon Ferguson covers a shift from the traditional “waterfall” design process to something much more like “devops,” a 21st-century buzzword, with AI in the driver’s seat. This, Ferguson notes, leads to teams with broader ranges of skills, and the tendency to “ship early and iterate often.”

But there are lots of other changes, too, not least of which involves the software companies that are designing the hardware for systems that you might call “sentient digital entities.”

Doing the Work

Here are some highlights from a recent interview in which Navin Chaddha of Mayfield asked Anirudh Devgan, CEO of Cadence Design Systems, how that firm views the market, and the context for this type of change. Devgan has his own history at IBM, and is considered a leader in EDA as these new systems take over parts of the semiconductor design and manufacturing processes.

The two discussed the advent of a “silicon Renaissance” where things change quickly, and brought up a culinary, or rather metabolic, metaphor in what I thought was an interesting little exchange.

“With AI, everything changes, whether it's … the data center, whether it's at the edge, more engineers are coming in,” Chaddha said. “There was a saying, right? ‘Software has eaten the world.’ I think it did, but now, hardware is going to eat it.”

“Yeah, I'm not into this ‘eating’ one another,” Devgan responded. “We are a software company, by nature … I think it's obvious, that it's the full stack, right? It's hardware plus software, and both are important.”

Baking it In

Riffing further on food comparisons, Devgan spoke at length about a “three layer cake” that he considers a useful concept for explaining full-stack design.

“The cake is, you know, the hardware is the bottom layer, which is always there, but it was very specialized,” he said. “Hardware, for the longest time, was primarily driven by x86, you know, like (with) Intel, but now, there is a lot more richness to that hardware, right? … So that's the bottom layer of the cake, and that is going to go through more transformations. The middle layer is what I would call ground truth, based on physics or chemistry or biology, the actual operation of things, which some people forgot. The top layer is more data-driven, or AI, whether it is agentic AI, or old AI, or new AI.”

Hungry yet?

“Why do you call it a cake?” Devgan asked rhetorically. “You could call it a stack. The reason I call it a cake is: unless you are a little kid, when you eat a cake, you eat all three layers together. So I think these three have a big interplay with each other: the AI, the ground truth, and the hardware that it runs on.”

More on Software Layers

As the discussion went on, Devgan covered the intersection of software and the biosciences, with forays into not just human anatomy but genetics as well. He characterized a component of design as “computer science plus math,” describing how engineers work this way: “We make software to design chips and systems, but the software is basically mathematical software,” he said.
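To give a flavor of what “mathematical software” means here, consider the analysis engines at the heart of chip design tools: a circuit becomes a system of equations, and a solver does the rest. The sketch below is a minimal, hypothetical example of nodal analysis for a two-resistor voltage divider; it is not anything from Cadence's actual products, and the component values are invented for illustration.

```python
import numpy as np

# Nodal analysis of a toy circuit: a 5 V source driving two resistors
# in series (R1 = 1 kΩ, R2 = 2 kΩ), with the midpoint as the unknown node.
# Kirchhoff's current law at the midpoint, (V1 - Vs)/R1 + V1/R2 = 0,
# rearranges into the linear system G @ v = i.

R1, R2, Vs = 1_000.0, 2_000.0, 5.0

G = np.array([[1.0 / R1 + 1.0 / R2]])  # conductance matrix (one unknown node)
i = np.array([Vs / R1])                # current injected from the source

v = np.linalg.solve(G, i)
print(f"midpoint voltage: {v[0]:.3f} V")  # 5 * R2 / (R1 + R2) = 3.333 V
```

SPICE-style circuit simulators generalize exactly this pattern: build a conductance matrix from the netlist, then solve it, repeatedly, for nonlinear and time-varying elements. The “math software” framing is not a metaphor.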
As a market prediction, Devgan suggested that the infrastructure layer will be commoditized, and monetized by big players. He also talked about the context for startups trying to woo investors and influential companies: “Everybody says they are customer-first,” he said. “Of course, you should be customer-first. Who doesn't want to be customer-first? … those companies can work with anybody they want. They will not work with you unless you have a good product. … You know, there are certain customers which you can see are driving the future, and you can see them now, right?”

That Middle Layer – and What Powers Progress Right Now

Near the end of the talk, Devgan came back to his cake. “Those three layers are critical,” he noted, “so just be careful that you're not only doing one layer of that, because the hardware will have a big impact, and innovations in hardware could change things, just like robotics. The hardware, the silicon, will be different. … The AI model will be different.”

And, he said, don’t leave out the layer in between the top and bottom frosting. “The most important thing I see people right now forget is they forget the middle layer,” Devgan continued, calling this median part “super-critical.”

“That's where the stickiness is … without knowing the middle layer, it's almost impossible to deliver high value. You can call it intelligence, you can call it data fitting - if you talk to a mathematician, they will call it fitting, but if you talk to marketing people, they'll call it intelligence - the AI has a lot of value, but it has to be combined with the middle layer.”

He gave some examples: “If you allow (AI in) a robot or in a self-driving car, there is still a lot of control theory that goes with the AI,” he said. “So I think the main thing is to do something, of course, fundamental, whatever the application is.”

Devgan ended with a different metaphor, though, suggesting that in the end, the middle part of the vertical stack will, again, be generic, commodified, portable, and probably delivered by big-name vendors. “It's like having a V6 (engine),” he noted. “One (V6) may be slightly better than the other V6, but the verticalization of it will be the value.”

It’s all interesting from the perspective of someone who has not researched how modern semiconductors are designed and produced. I found it interesting, too, that Devgan suggested toward the beginning that users, especially younger ones, often don’t even know their devices have chips inside. So it’s a long stretch to expect the average person to know that we are making these little things with minimal human input. Do humans design them? Only at a higher level. Do humans manufacture them? Not really, no. Keep that in mind, since soon everything we use will have a chip in it.
