NVIDIA Uses AI to Design GPUs
NVIDIA is using GPUs and machine learning algorithms to design its next video accelerators, HPCwire reports.
According to Bill Dally, NVIDIA's Chief Scientist and Senior Vice President of Research, artificial intelligence can be applied effectively in four important areas of GPU design:
- mapping voltage drops;
- predicting parasitic effects;
- placement and routing problems;
- automating standard cell migration.
Voltage-drop mapping shows engineers how power is distributed across a new processor. According to Dally, the standard design flow needs three hours to perform the necessary calculations; the AI algorithm cut the process to three seconds while retaining 94% accuracy.
“We can get very accurate voltage estimates much faster than with conventional tools, and in a very short time,” Dally said.
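The speedup Dally describes comes from replacing a slow physical solver with a learned surrogate model. The sketch below is a hypothetical illustration of that idea, not NVIDIA's actual flow: the features (per-tile current demand, wire resistance, distance to a power pad) and the synthetic "ground truth" are stand-ins invented for the example.

```python
# Hypothetical sketch: a learned surrogate for voltage (IR) drop estimation.
# All features and data here are synthetic stand-ins, not NVIDIA's real flow.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Each sample: per-tile power-grid features -> observed voltage drop.
# Columns: [current demand, wire resistance, distance to nearest power pad]
n_samples = 2000
X = rng.random((n_samples, 3))
# Toy ground truth: drop grows with current * resistance and pad distance.
y = X[:, 0] * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.01, n_samples)

# Train on samples produced (in a real flow) by the slow, accurate solver.
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X[:1500], y[:1500])

# Inference is near-instant per map, versus hours for the full solver.
pred = model.predict(X[1500:])
error = np.mean(np.abs(pred - y[1500:]))
print(f"mean absolute error on held-out tiles: {error:.4f}")
```

The trade-off mirrors the one in the article: the surrogate is slightly less accurate than the solver that generated its training data, but orders of magnitude faster at inference time.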
Engineers also used graph neural networks to analyze the placement and routing of processor components. According to Dally, getting this wrong leads to “data traffic jams,” much like congestion on the roads of a large city, and forces a redesign of the chip layout.
“It [the algorithm] points out the problem areas, and we can act on them and iterate very quickly, without needing to do a complete re-layout,” the scientist added.
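A graph neural network fits this problem because a netlist is naturally a graph: cells are nodes, shared nets are edges. The toy below is a hypothetical illustration of one message-passing step over such a graph; the adjacency matrix, per-cell features, and untrained readout weights are all made up for the example, whereas a real flow would use a trained multi-layer GNN.

```python
# Hypothetical sketch of one GNN message-passing step for congestion scoring
# over a toy netlist graph. Not a trained model: weights are random.
import numpy as np

# Toy netlist: 5 cells; an edge means two cells share a net (symmetric).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 1, 1],
    [0, 1, 1, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

# Per-cell features: [pin count, cell area] (made-up values).
H = np.array([[4, 1.0], [6, 2.0], [8, 1.5], [5, 1.0], [3, 0.5]])

# One message-passing round: blend each cell's features with the mean of
# its neighbors' features, so local connectivity informs each cell's state.
deg = A.sum(axis=1, keepdims=True)
H1 = (H + A @ H / deg) / 2

# A randomly initialized readout maps features to a congestion score per
# cell; in practice these weights would come from supervised training.
rng = np.random.default_rng(0)
W = rng.random((2, 1))
scores = (H1 @ W).ravel()
hotspot = int(np.argmax(scores))
print(f"predicted congestion hotspot: cell {hotspot}")
```

The payoff described in the quote is that such a model flags likely hotspots directly from the netlist, before running a full place-and-route iteration.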
Automating standard cell migration with AI can help accelerate the move to new process nodes. Dally noted that the transition from the 7 nm to the 5 nm chip manufacturing process required substantial engineering effort. Reinforcement learning helped speed up this stage and reduced the number of design-rule errors.
“This is a huge labor savings […]. In many cases, we also got a better design,” Dally said.
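To make the reinforcement-learning idea concrete, here is a deliberately tiny, hypothetical stand-in: tabular Q-learning on a 1-D "track" where only one slot is design-rule clean. The state space, reward, and goal slot are all invented for the example; real cell migration involves vastly richer layouts, but the learning loop has the same shape.

```python
# Hypothetical toy: tabular Q-learning for a stand-in "cell migration" task.
# The agent nudges a cell along a 1-D track until it reaches the single
# (assumed) design-rule-clean slot. Purely illustrative of the RL loop.
import numpy as np

n_slots = 10          # candidate positions on the track
goal = 7              # the (made-up) only DRC-clean slot
actions = [-1, +1]    # move the cell left or right

rng = np.random.default_rng(1)
Q = np.zeros((n_slots, 2))
alpha, gamma, eps = 0.5, 0.9, 0.2   # learning rate, discount, exploration

for _ in range(500):
    s = int(rng.integers(n_slots))
    for _ in range(50):
        # Epsilon-greedy action selection.
        a = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2 = min(max(s + actions[a], 0), n_slots - 1)
        r = 1.0 if s2 == goal else -0.1   # reward a rule-clean placement
        # Standard Q-learning update toward the bootstrapped target.
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2
        if s == goal:
            break

# After training, the greedy policy walks any start toward the clean slot.
s = 0
for _ in range(n_slots):
    if s == goal:
        break
    s = min(max(s + actions[int(np.argmax(Q[s]))], 0), n_slots - 1)
print(f"final position: {s}")
```

The "labor savings" in the quote corresponds to the agent discovering a rule-satisfying placement policy on its own, rather than an engineer hand-migrating each cell to the new node.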
Recall that in March 2022, Google introduced PRIME, an algorithm that helps develop fast and compact processors for AI workloads.
In October 2021, the search giant described using reinforcement learning to cut chip design time from several months to six hours.
In August, Samsung began using artificial intelligence to automate the design of computer chips.