Tech is improving at an exponential rate. Here’s what that means for society

Azeem Azhar, tech analyst, entrepreneur, and author, walks us through his new book on our exponential age.

At this point, it’s trite to observe that technology moves fast; it’s impossible not to notice it. Few people had a smartphone 15 years ago, and now nearly 80% of the world does. TikTok hit 1 billion monthly users after just five years of operations. Technologies that once seemed mythical—AI, electric vehicles, gene editing—are improving on a near-daily basis.

But in his recent book, The Exponential Age: How Accelerating Technology Is Transforming Business, Politics and Society, tech analyst, investor, and entrepreneur Azeem Azhar puts a finer point on this well-worn observation.

Not only are technologies advancing at breakneck speed, but they’re doing so at an exponential rate. Azhar pinpoints four general-purpose technologies—computing, renewables, additive manufacturing, and biotech—that he argues will define the decades to come.

The problem? Azhar argues this rate of technological change is leaving the institutions of yore—regulators and lawmakers, mostly—in the dust, and leaving society dizzy. He calls the gulf between the rate of technological change and regulatory capacity an “exponential gap,” and argues we need to close these gaps in order for these tech advancements to benefit all of society.

We caught up with Azhar to learn more about how he developed this framework for thinking through technological change, and what happens if we don’t close these so-called exponential gaps in the coming years.

This conversation has been edited for length and clarity.

There’s an economic principle that takes center stage in your book, called Wright’s Law. Can you walk us through what it is, and why it felt worthy of that focus?

Most people are familiar with the idea of Moore’s Law. And Moore’s Law is best described, by one of the academics I read, as a “social fact.” [Editor’s note: This description comes from science and tech historian Cyrus C.M. Mody’s 2017 book The Long Arm of Moore’s Law] Moore’s Law says that every couple of years, we can miniaturize things enough to make chips more dense, and therefore faster for the same price.
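[Editor’s note: As a back-of-the-envelope illustration of that doubling cadence (ours, not Azhar’s), the short Python sketch below projects transistor counts from Intel’s 1971 4004 chip, which had roughly 2,300 transistors, assuming a clean doubling every two years.]

```python
# Back-of-the-envelope Moore's Law projection: density doubles every
# two years. Baseline: Intel's 4004 (1971), roughly 2,300 transistors,
# a widely cited historical figure used here purely for illustration.

def projected_transistors(year, base_year=1971, base_count=2_300):
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

# Fifty years of doubling turns ~2,300 into tens of billions, which is
# the right order of magnitude for today's largest chips.
```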

And the thing that struck me when I was in my late teens, early 20s, was, Wait a second, what if everybody went on strike for two years? Would the chips get faster? And of course they wouldn’t.

As I did my research, I came across this idea of Wright’s Law, which essentially says that we learn by doing, and we get better and better with every successive unit. You double the cumulative amount of production, and the per-unit cost declines because you get smarter at making them. And that’s completely independent of scale effects, of economies of scale.

It turns out that Wright’s Law can be applied as a predictor of technological progress across many, many different technologies.
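[Editor’s note: Here’s a minimal Python sketch of that idea (ours, not the book’s). Wright’s Law is commonly written as C_n = C_1 × n^(−b), where the exponent b is set so that each doubling of cumulative production cuts unit cost by a constant learning rate; the 20% rate below is an assumed figure for illustration, since real rates vary by technology.]

```python
import math

# Minimal Wright's Law sketch (not code from the book): the cost of the
# n-th unit is C_n = C_1 * n**(-b), where b is chosen so that each
# doubling of cumulative production cuts unit cost by `learning_rate`.
# The 20% learning rate is an assumption for illustration only.

def unit_cost(n, first_unit_cost, learning_rate=0.20):
    b = -math.log2(1 - learning_rate)   # b ≈ 0.32 for a 20% rate
    return first_unit_cost * n ** (-b)

# Every doubling of cumulative output cuts cost by the same ~20%:
for n in (1, 2, 4, 8, 16):
    print(f"unit {n}: ${unit_cost(n, 100):.2f}")
# unit 1: $100.00, unit 2: $80.00, unit 4: $64.00, unit 8: $51.20, ...
```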

And I think we’re familiar with that: When we were in lockdown and started to bake, as I and many other people did, the first loaf of bread you make tastes horrible and produces loads of waste compared to the sixth or eighth, right?

So there’s an academic basis to say this is a better predictor. But something else that I like about Wright’s Law is that it is about learning, and it connects us back to something intimately human: We’re good as individual learners, we’re good as social learners, and we’ve developed a culture that allows us to have intergenerational collective intelligence.

When reading the book, I was struck by the fact that exponential gaps can go in either direction: In the US, for instance, you’ve got the FAA, which some would argue is moving too slowly and holding back a space like drone delivery as a result; on the other hand, AI is relatively unencumbered by law. What do you make of that?

That’s a really, really great framing of the question. The real issue, I think, is less about hand-to-hand fighting over a particular issue. And one of the things I wanted to do in the back half of the book was try to identify commonalities that would allow us to see that these things are about a systemic change. It’s quite a hard thing to do, because systems are hard to hold in your head.

The main observation was really about the dialogue that needs to happen between and across society: What technologies do we develop, and in what direction? And what kind of risk are we willing to take at each point of the journey?

And I think there are a number of complex issues at hand, some of which I talked about in the book: for example, that policymakers and regulators can be very, very slow in their movements, and they can be using old precedents, frames, and ways of thinking.

But there are others that I don’t talk about so much, which really relate to the way that [regulators and policymakers] have often been denuded by the political culture. So, 50 years of political culture attacking anything that wasn’t a company—anything from the public-service sector—has a deleterious effect. And then the strength of the lobbying efforts, and the sense that you essentially need to keep Mark Zuckerberg and Jeff Bezos happy in order to have innovation, put you in a really difficult position when you want to ask sensible questions about appropriateness.

If we change the dialogue around that a little bit, we can have our cake and eat it—we can have the innovation, we can have the engagement with society, we can have this idea of progress and this idea of things that are suitable for the communities that need them. But having a much more authentic conversation becomes important.

And what if we don’t close these gaps?

The tension emerges because of the exponential gap, right? This is not about saying that the technology is necessarily, and in all cases, bad. It’s really about closing the gap.

I think if you don’t close the gap, what you do is create significant power imbalances in society, which will result in friction and volatility. That might be friction and volatility because cyberattacks just increase and we live in a world that we feel is unsafe, or because too much power accrues to large companies, which then have the power in the market to determine employment standards and what the shape of industries looks like. And that will often get expressed in the nature of the politics.

I used to call it the pitchfork risk...the idea that people will potentially rebel against that. And that just doesn’t feel like a very nice way to run a society, which is, “How much can we extract before there’s a rebellion?” That seems much less thoughtful and considerate than saying, “How do we run this in a progressive and fair and expansive, but also sustainable, manner?”
