Let's figure out what a process node is and why fewer nanometers are not always better.
For many users, the main characteristics of any processor are still its core count and clock frequency. That is partly true: a lot does depend on them. But another, equally important parameter directly affects a processor's performance and energy efficiency – the process node, also called the process technology.
Today we will discuss what a process node actually means and dispel a few myths associated with the concept. But for a better understanding, it is worth starting with a bit of background.
A bit of theory
Any processor consists of a huge number of transistors – switches that can be in one of two states, 1 or 0. When current passes through the transistor, the output reads 1; when no current flows, it reads 0. This is the physical basis of the zeros and ones that low-level programming languages operate on directly.
In the fifties of the last century, the role of the transistor was played by the vacuum tube, which is why the first, comparatively weak computers with several thousand switching elements occupied entire rooms. The revolution came in the early 60s, when the first field-effect transistors appeared.
The basis of any transistor is silicon. Two conductive regions are deposited on it at a distance from each other – the input and the output. Because the conductors are separated, when voltage is applied to the input, the output stays at "0" (no current). To let current pass from one conductor to the other, a third, insulated conductor is placed over the silicon substrate – the gate. By itself, the insulated gate cannot carry current from the input to the output. But when a voltage is applied to it, an electric field forms around the gate that opens a conducting channel, allowing current to flow from the input to the output. The transistor then switches to "1" (current present).
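The switching behavior described above can be sketched as a toy model. This is only an illustration of the logic, not an electrical simulation, and the function name is invented for the example: the transistor passes the input signal to the output only while the gate is energized.

```python
def nmos_switch(gate: int, source: int) -> int:
    """Toy model of a field-effect transistor as a switch.

    When the gate is energized (1), the field opens a channel and the
    input signal reaches the output; otherwise the output reads 0.
    """
    return source if gate else 0

# Gate off: no current at the output regardless of the input.
print(nmos_switch(gate=0, source=1))  # 0
# Gate on: the input signal passes through to the output.
print(nmos_switch(gate=1, source=1))  # 1
```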
Each year transistors became smaller, and their density on the die increased. But as transistors shrank, a point was reached where the gate could no longer fully block the current from input to output – electrons leaked through. At that moment another revolution took place in semiconductors: planar, or flat, transistors were replaced by three-dimensional ones. The conducting channel was raised above the silicon substrate as a thin fin, so the gate now wraps around it on three sides, giving much better control of the current. This transistor structure is called FinFET, and it is its use that has allowed manufacturers to keep shrinking transistors and increasing their density to previously unheard-of values.
Moore’s Law and Why Downsize Transistors
In 1975, Intel co-founder Gordon Moore made the empirical observation now known as Moore's Law: the number of transistors on a chip doubles roughly every 24 months. But why bother increasing density and shrinking transistors at all?
A processor with several thousand transistors is far weaker than one with 11 million. But besides the obvious gain in performance, shrinking a transistor also improves its energy efficiency: the smaller the transistor, the less current it needs to operate. And reducing the gate size shortens the time needed to switch the transistor from one state to the other – it works faster.
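Moore's observation can be turned into a rough back-of-the-envelope formula. As an illustration only (the starting figures are the well-known Intel 4004 numbers, not something from this article):

```python
def projected_transistors(start_count: int, start_year: int,
                          year: int, months_per_doubling: int = 24) -> float:
    """Project a transistor count forward under Moore's law:
    one doubling per `months_per_doubling` months."""
    doublings = (year - start_year) * 12 / months_per_doubling
    return start_count * 2 ** doublings

# Intel 4004 (1971): ~2,300 transistors. Two doubling periods later:
print(projected_transistors(2300, 1971, 1975))  # 9200.0
```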
What is a process node?
Once upon a time, the process node really did mean the size of the transistor gate: on a 32 nm process, the gate length was 32 nm. But it was around 32 nm that manufacturers stopped adhering to this rule, and the very concept of a process node largely turned into a marketing label.
Of course, there is a view that not everything is so bad. For example, you will often see the claim that after the concepts of "gate length" and "process node" parted ways, the latter became tied directly to the already mentioned Moore's law: if the number of transistors on a die doubles every two years, the area of each transistor is halved, which means each side of the transistor shrinks by a factor of about 0.7.
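The 0.7 figure follows from simple geometry: halving an area shrinks each linear dimension by 1/√2 ≈ 0.707. A quick sketch (the node sequence is the familiar historical one, used here only to show how closely it tracks the √2 rule):

```python
import math

def next_node(feature_nm: float) -> float:
    """Linear feature size after one density doubling: the area per
    transistor halves, so each side shrinks by 1/sqrt(2) ~= 0.707."""
    return feature_nm / math.sqrt(2)

node = 90.0
for _ in range(4):
    node = next_node(node)
    print(round(node, 1))
# Prints roughly 63.6, 45.0, 31.8, 22.5 -- close to the real
# 65 nm, 45 nm, 32 nm and 22 nm nodes.
```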
This was probably the manufacturers' last attempt to give the process node a consistent meaning. Since then it has turned into a marketing label with little connection to real dimensions. Moreover, one manufacturer's 10 nm process can differ radically from what another chipmaker calls 10 nm.
For example, TSMC's 10 nm process uses 66×42 nm transistors, versus 54×44 nm for Intel's comparable process. In fact, in terms of transistor dimensions, Intel's 10 nm process is closer to TSMC's 7 nm process. Therefore, progress in the number of nanometers is only meaningful within a single company's product line.
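Those dimensions make the gap concrete. Multiplying the quoted figures gives a rough footprint comparison (rough, because footprint alone ignores fin pitch and other layout details):

```python
# Transistor footprints quoted above, in nm^2.
tsmc_10nm_area = 66 * 42    # 2772 nm^2
intel_10nm_area = 54 * 44   # 2376 nm^2

# Intel's "10 nm" transistor is ~14% smaller in footprint,
# despite both processes carrying the same marketing label.
ratio = intel_10nm_area / tsmc_10nm_area
print(f"{tsmc_10nm_area} vs {intel_10nm_area} nm^2, ratio {ratio:.2f}")
```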
At the same time, the logic of "fewer nanometers is always better" can also mislead. With current technology, packing a die more densely can make it heat up more, and the consequence is throttling and a serious drop in performance. Such chips are good for sprints but not for marathons, i.e. sustained loads. This fully applies to platforms such as the 5 nm Snapdragon 888 and the 5 nm Samsung Exynos 2100: they are undoubtedly much faster than their predecessors over short bursts, but under prolonged load their advantage melts away due to severe overheating and throttling.
Although process nodes and nanometers have lately become marketing terms, they still measure progress across a single manufacturer's products. In other words, in most cases, the smaller the process node used to manufacture a processor, the more transistors it contains and the higher its peak performance.
However, it is important to understand that this holds fully only for peak performance. Sustained performance also depends on how good the microarchitecture is, whether the processor overheats, and whether it suffers from throttling.