Intel announced technological advances that will keep Moore's Law on track and reach one trillion (a thousand billion) transistors on a single package by 2030. This will be possible thanks to 3D packaging technology, which can increase density up to tenfold, and to new materials, including one only three atoms thick.
Intel aims for one trillion transistors by 2030
At the IEEE International Electron Devices Meeting (IEDM) 2022, held in the year of the transistor's 75th anniversary, Intel announced an ambitious goal. To keep pace with Moore's Law, which predicts that transistor density roughly doubles every two years, the company aims to fit a trillion transistors into a single package.
Gary Patton, Intel vice president and general manager of Components Research and Design Enablement, explains: "Seventy-five years after the invention of the transistor, the innovation driving Moore's Law continues to meet the growing demand for computing power. At IEDM 2022, Intel presented not only possible future research directions but also concrete results, both of which are needed to break through current and future barriers, satisfy this demand, and keep Moore's Law alive for years to come."
Intel explains that part of this result will come from a new 3D packaging technology, hybrid bonding. It is a process that enables a tenfold increase in interconnect density, resulting in quasi-monolithic chips.
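To put that roadmap in perspective, here is a minimal back-of-the-envelope sketch in Python. It assumes a hypothetical baseline of roughly 100 billion transistors per package in 2022 (an illustrative figure, not one from Intel's announcement) and the classic Moore's Law doubling every two years:

```python
# Back-of-the-envelope Moore's Law projection (illustrative only).
# Assumption: ~100 billion transistors per package in 2022 -- a
# hypothetical baseline, not a figure from Intel's announcement.
BASELINE_YEAR = 2022
BASELINE_TRANSISTORS = 100e9   # assumed starting point
DOUBLING_PERIOD_YEARS = 2      # classic Moore's Law cadence

def projected_transistors(year: int) -> float:
    """Project transistors per package for a given year."""
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in range(2022, 2031, 2):
    print(f"{year}: {projected_transistors(year):.2e} transistors per package")
```

Under these assumptions the projection reaches about 1.6e12 transistors in 2030, the same order of magnitude as the one-trillion target, which is why a one-off tenfold density gain from packaging matters so much to the roadmap.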
It will also use new ultra-thin 2D materials capable of packing more transistors onto a single chip. Among the results, the company demonstrated gate-all-around stacked nanosheet transistors that use a 2D channel material only three atoms thick instead of silicon, achieving near-ideal switching at room temperature with low leakage current.
This is just one of the technologies the chipmaker is working on. Advances in quantum computing, built on better qubits, will also push computing capabilities forward. The journey toward the computers of the future has already begun.