Why Jevons Paradox and Moore’s Law Prove We’re Always Underestimating Technology
There’s something hilarious about looking back at past predictions about technology. The one that always gets me? That infamous (possibly misattributed) Bill Gates quote: "640K ought to be enough for anybody." Oh, Bill. How adorable. Today, I have individual memes that take up more storage than that.
But this isn’t just about Gates and his 640K moment - this is a broader pattern. Humans, despite all our brilliance, are spectacularly bad at predicting how much of a good thing we’ll use once it becomes more efficient. Enter: Jevons Paradox.
Jevons Paradox: The "Oh, We’ll Just Use More of It" Effect
Back in the 19th century, a British economist, William Stanley Jevons, noticed something weird. As steam engines became more efficient at burning coal, people didn’t use less coal. They used more. The cheaper and more efficient something becomes, the more people lean into it.
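Jevons’ observation can be made concrete with a toy constant-elasticity demand model. (This sketch is my illustration, not Jevons’ own math; the `resource_use` function and its numbers are assumptions for the sake of the example.)

```python
def resource_use(efficiency, elasticity, base_use=100.0):
    """Toy rebound-effect model: doubling efficiency halves the effective
    price of the service (heat, motion), and demand for that service
    responds with a constant price elasticity."""
    service_demand = efficiency ** elasticity      # demand for the service grows
    return base_use * service_demand / efficiency  # fuel needed to supply it

# Elasticity > 1 is Jevons' case: doubling efficiency *increases* fuel use.
print(resource_use(2.0, 1.5))  # 100 * 2**1.5 / 2 ≈ 141.4
# Elasticity < 1: efficiency gains actually save fuel overall.
print(resource_use(2.0, 0.5))  # 100 * 2**0.5 / 2 ≈ 70.7
```

The whole paradox lives in that elasticity parameter: when demand for the cheaper service grows faster than the efficiency gain, total consumption rises instead of falling.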
Fast forward to today, and AI is proving Jevons right again. The cost of AI models is dropping, their intelligence is skyrocketing, and guess what? Instead of slowing down, we’re cramming AI into every possible application. From writing our emails to generating cat pictures, we just keep using more AI.
Moore’s Law: The "Tech Gets Faster, and We Want Even More" Effect
Now, let's talk about Moore’s Law. In 1965, Gordon Moore predicted that the number of transistors on a chip would double every couple of years, making computers exponentially faster and cheaper. He was right - for decades.
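Moore’s observation is just compound doubling, which is easy to underestimate until you run the numbers. A quick back-of-the-envelope sketch (the 1965 baseline count and the exact two-year period are illustrative assumptions, not precise historical figures):

```python
def transistors(year, base_year=1965, base_count=64, doubling_years=2):
    """Moore's Law as a simple exponential: the transistor count
    doubles every `doubling_years` years from a baseline."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Sixty years at a two-year doubling period is 30 doublings:
growth = transistors(2025) / transistors(1965)
print(f"{growth:,.0f}x")  # 2**30, i.e. roughly a billion-fold increase
```

Thirty doublings is the entire punchline: a factor of a billion is why yesterday’s absurd overkill becomes tomorrow’s minimum requirement.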
This is why your phone today is more powerful than the computer that sent astronauts to the moon. And yet, somehow, it still feels sluggish when you have 37 Chrome tabs open. Why? Because every time computing power increases, we find new, more demanding applications. We’re like kids who get a bigger toy box and immediately demand even bigger toys.
Jevons vs. Moore: The Ultimate Tech Showdown
- Moore’s Law predicts that technology will keep getting exponentially better.
- Jevons Paradox predicts that as it does, we’ll use even more of it than we ever thought we would.
The result? A never-ending cycle where efficiency fuels demand, and demand fuels even more advancements. It’s why “no one needs more than 640K” turned into “no one needs more than GPT-5” which will, inevitably, turn into “no one needs more than GPT-10” before we all start casually chatting with AI versions of ourselves in the metaverse.
The Lesson: Never Underestimate Our Ability to Want More
So, what can we learn from all this? First, any time someone says, “We’ll never need more than [X],” get ready to laugh at them in about a decade. Second, AI and computing aren’t slowing down—they’re accelerating. And if history has taught us anything, it’s that efficiency doesn’t reduce demand. It makes us want way more of the thing. Which brings me to the final question: how long before we start saying, “Nobody will ever need more than AI that can run the entire world economy in real time”?
I’d give it about five years. Maybe less if Moore and Jevons have anything to say about it.