Google has used the model to design its next generation of tensor processing units (TPUs), which run in the company’s data centres to enhance the performance of various AI applications
In what could be dubbed a new era of chip design, a team of Google researchers is designing next-generation artificial intelligence (AI) chips. According to reports, the team has created an AI model that allows chip design to be performed by artificial agents with more experience than any human designer.
To achieve this, the new AI method draws on past experience to become better and faster at solving new instances of the problem.
The Google team trained a machine-learning model on a dataset of 10,000 chip layouts and then refined it with reinforcement learning. The resulting model could generate a design that optimises the placement of different components on the chip in about six hours.
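To give a sense of the problem the model tackles, chip floor planning can be framed as placing circuit blocks one at a time while minimising an estimate of wiring cost. The sketch below is a toy illustration only, not Google's actual method: a greedy heuristic stands in for the learned policy, and the block names, grid size, and cost function are invented for the demo.

```python
import itertools

GRID = 4  # toy 4x4 grid of candidate slots for placing blocks


def wirelength(placements):
    """Crude proxy for routing cost: the sum of Manhattan distances
    between every pair of placed blocks."""
    cost = 0
    for (x1, y1), (x2, y2) in itertools.combinations(placements.values(), 2):
        cost += abs(x1 - x2) + abs(y1 - y2)
    return cost


def greedy_place(blocks):
    """Place blocks sequentially, each time choosing the free slot that
    minimises the running wirelength -- the decision an RL policy would
    learn to make from experience rather than by exhaustive search."""
    placements = {}
    free = {(x, y) for x in range(GRID) for y in range(GRID)}
    for block in blocks:
        best = min(free, key=lambda slot: wirelength({**placements, block: slot}))
        placements[block] = best
        free.remove(best)
    return placements


layout = greedy_place(["cpu", "cache", "dma", "io"])
print(layout, wirelength(layout))
```

In the actual research, a neural network replaces the greedy rule and is rewarded for layouts that score well on wirelength-style metrics, which is how prior experience transfers to new chips.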
In a paper that appeared in the scientific journal Nature, the team wrote, “Our method was used to design the next generation of Google’s artificial intelligence (AI) accelerators, and has the potential to save thousands of hours of human effort for each new generation.”
“Finally, we believe that more powerful AI-designed hardware will fuel advances in AI, creating a symbiotic relationship between the two fields”, they added.
Despite five decades of research, chip floor planning has defied automation, requiring months of intense effort by physical design engineers to produce manufacturable layouts.
“Our RL (reinforcement learning) agent generates chip layouts in just a few hours, whereas human experts can take months,” Anna Goldie, Google Brain’s research scientist who took part in the research, said in a tweet.
“In under six hours, our method automatically generates chip floor plans that are superior or comparable to those produced by humans in all key metrics, including power consumption, performance and chip area,” according to the Google AI team.