It is an open secret in the semiconductor industry that the end of the so-called Moore's law is imminent, or may already be upon us. Public data to support this has been hard to find, however, because no semiconductor vendor wants to acknowledge it, even though it is widely talked about at industry conferences.

While perusing the latest thinking from Accenture's technology strategists, I came upon an interesting tidbit in a report called "Moore or Less?": the real manufacturing costs for different semiconductor nodes. Presumably, the data comes from NXP Semiconductors, which Accenture lists as a strategy consulting customer. NXP has used several of the major foundries in the past (TSMC, GlobalFoundries and Samsung), so the numbers should be representative.

Accenture opens up the blog post with a deck proclaiming that:

Semiconductor companies can keep pace with Moore's Law. The question is whether they should.

seeking to lull our anxiety about any imminent death. Yet further on, in the PDF version of the report, a figurative Atropos makes an appearance in Figure 2, which shows the projected average cost per gate over time (65 nm to 16 nm nodes). The cost per thousand gates is estimated at $29 at 65 nm, $19 at 45 nm and $14 at 28 nm, before rising again to $15 at 20 nm and $16 at 16 nm. This shows clearly that further miniaturization is no longer economically self-driving, which effectively cuts off further investment in traditional semiconductor technology. It may still be the case that the total cost of ownership (TCO) for a given unit of computation is about the same for the end user, since denser semiconductors are more power efficient, but as power is still not the major part of the TCO of computing, the conclusion stands.
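
As a quick illustration of that turning point, here is a minimal sketch (mine, not from the report) that computes the node-to-node change from the figures quoted above:

```python
# Cost per thousand gates by node, as quoted from Accenture's Figure 2 (USD).
cost_per_kgate = {"65nm": 29, "45nm": 19, "28nm": 14, "20nm": 15, "16nm": 16}

nodes = list(cost_per_kgate)
for prev, curr in zip(nodes, nodes[1:]):
    change = (cost_per_kgate[curr] - cost_per_kgate[prev]) / cost_per_kgate[prev]
    print(f"{prev} -> {curr}: {change:+.0%}")

# The per-gate cost stops falling after 28 nm:
# 65nm -> 45nm: -34%
# 45nm -> 28nm: -26%
# 28nm -> 20nm: +7%
# 20nm -> 16nm: +7%
```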

The direct consequence of this development is that today, the only motivation for pursuing denser manufacturing nodes is the smaller form factor by itself, for example in mobile devices such as phones. But this compactness comes at a premium, with heavy investment in assets and R&D. It is likely that the only product categories where this will be meaningful are extremely high-volume consumer products, such as the iPhone, where the high fixed costs can be amortized over enormous sales volumes. Alternatively, the lifetime of the products and the manufacturing lines will have to be extended by several more years in order to pay back the high costs.

Further along, during the 2020s, I expect the following changes in the IT industry. I am assuming here that no radically new semiconductor and/or chip technology is introduced on the market (of which there is still no sign, quantum computing included):

  • Longer investment cycles for hardware. This will have interesting second-order effects: you can afford to buy more hardware for a set budget, given that it is depreciated over a longer period. Financial leverage will rise as a result, since you need to take on more debt to finance the hardware investments. At first this may be interpreted as a bright future for tech, with lots of activity and several years of record sales for the hardware industry, even though the underlying forces are far from favourable. It will be followed by a bust.

  • Power/cooling will be an even bigger part of TCO, since the hardware cost per year goes down with a longer projected lifetime. If a typical server installation runs for 10 years instead of 5, the accumulated power and cooling costs double while the purchase price stays the same, so the operating share of the TCO rises sharply (see the sketch after this list). This draws more attention to power efficiency as the only means of improving TCO.

  • Further commoditization of IT infrastructure as a result of the longer investment cycles. This will accelerate the migration to the cloud. As computing matures as a technology, not unlike electricity generation, it will start to position itself more like a utility, with a similar business model. I think one unwanted consequence will be that as more computer installations get centralized, the centralization will concentrate and expose the enormous power consumption of tech companies, digital business models and the internet as a whole. Much like "big oil" and massive factories have negative connotations today, "big tech" might come to be perceived just as unfavorably, coming under scrutiny for its role in global warming and CO2 emissions.

  • New business models built around subscriptions, or even planned obsolescence. It is imaginable that the hardware industry will try to confront the longer investment cycles by changing its business model and converting to subscriptions, similar to what the software industry has done. We can already see today that the share of revenue from memory and solid-state storage, which are much more "perishable" products, is increasing. Another indication is Intel's new Xeon Scalable CPUs ("T series") that are certified for 10 years of usage, but at a premium price. It shows that Intel has no intention of making regular CPUs with a 10-year operational lifetime without charging extra for it.

  • IT costs will rise overall. Previously, businesses have had a free lunch when it comes to IT costs: you could reinvest at the same level and continuously gain processing and storage capability, which could support a growing revenue base. Post-Moore's law, however, IT costs will return to scaling linearly with the size of the business. This trend is further exacerbated by increasing digitalization, which drives companies to adopt business models that rely more heavily on IT as a revenue generator. For example, even today 15-25% of a bank's total costs can go towards IT. Imagine being in the seat of that CIO, going from free yearly cost reductions to asking for budget increases to support the business goals. I anticipate an eventual backlash against the digitalization drive and a refocus on cost reductions.

  • Increased focus on software efficiency: software will have to get faster and more efficient, as this will be the most fruitful way of mitigating rising IT costs. Gone will be the days of Python/JavaScript and dynamic programming languages, and old-school low-level programming will return. Research into new algorithms will explode. This will spark a revolution in software development and will likely be quite traumatic for the software development community. Perhaps so traumatic that AI technology will be enlisted to help rewrite programs from high-level to low-level languages and optimize the code. Will it finally usher in the era of self-programming computers?

  • More widespread use of FPGAs and ASICs, as regular CPUs only get marginally faster. Possibly, this will happen in combination with more vertical product integration: in order to capture more of the value, software may be bundled with FPGA hardware (or at least with FPGA IP that can be licensed together with the software). Only big players can afford tailored solutions like this, and only for very specific processing running at large scale, which will further drive standardization and commoditization. Intel has anticipated this development and purchased Altera in 2015 for 16.7B USD, a purchase motivated by a belief that by 2020, FPGAs could account for 33% of Intel's data center revenue.
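
Here is the back-of-the-envelope TCO sketch referred to in the power/cooling point above. The dollar figures are purely illustrative assumptions of mine, not taken from any report; they only serve to show how the operating share shifts when the same purchase price is spread over a longer lifetime:

```python
# Hypothetical server TCO: a one-time purchase price plus a yearly power/cooling bill.
# Both figures are illustrative assumptions, chosen only to show the shift in shares.
hardware_cost = 10_000          # one-time purchase (USD)
power_cooling_per_year = 1_000  # operating cost (USD/year)

for lifetime_years in (5, 10):
    opex = power_cooling_per_year * lifetime_years
    tco = hardware_cost + opex
    print(f"{lifetime_years:>2} years: TCO ${tco:,}, power/cooling share {opex / tco:.0%}")

# With these numbers the operating costs double in absolute terms and grow
# from a third to half of the TCO:
#  5 years: TCO $15,000, power/cooling share 33%
# 10 years: TCO $20,000, power/cooling share 50%
```

With a different split between purchase price and power bill the exact shares differ, but the direction is the same: the longer the hardware lives, the more the electricity bill dominates.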

In summary, even though I expect a slower pace of hardware development in the coming years, I do not expect it to translate into a slower-paced business environment. The change from exponential to incremental growth may actually be perceived as more dramatic and disruptive on a societal level than a continuation of the status quo.