How We Get to Computing Abundance

Peter Diamandis
Oct 4, 2021 · 6 min read

What if computing were unlimited… as in infinite? What type of problems could we solve? What could we simulate, model, and calculate?

For all intents and purposes, our computing is approaching such a capacity. Indeed, Moore’s Law (the observation that the number of transistors on a chip doubles roughly every 2 years) has held for nearly 6 decades.
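The compounding implied by that doubling is easy to underestimate. A quick back-of-the-envelope calculation (my own illustration, not a benchmark) shows what six decades of doubling every two years adds up to:

```python
# A doubling every 2 years compounds dramatically. This computes the
# cumulative growth factor implied by Moore's Law over a given span.
# (Illustrative arithmetic only; real-world gains vary by metric.)

def moores_law_growth(years: float, doubling_period: float = 2.0) -> float:
    """Return the growth factor after `years` of doubling every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

factor = moores_law_growth(60)
print(f"Growth over 60 years: {factor:,.0f}x")  # 30 doublings: over a billion-fold
```

Thirty doublings is a factor of 2³⁰, more than a billion, which is why a phone today out-computes the supercomputers of the 1980s.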

Moore’s Law is just one of many exponential improvements in computing that continue to accelerate.

Another one is the quantum computing revolution, which is following an exponential improvement curve of its own. In fact, quantum computing may well be the backbone of computation for the next 6 decades of exponential growth.

In this blog, I’m going to discuss some of the recent innovations in advanced computation and the possibilities and opportunities ahead. Tracking the latest developments in advanced computation is a key focus of my year-round Mastermind and Executive Program Abundance360.

Let’s dive in.


Back in August 2013, Carl Bass, then-CEO of the software and design giant Autodesk, gave me a tour of his then-newly constructed Pier 9 center: a maker’s paradise located in San Francisco and loaded with 3D printing equipment, machine shop tools, design stations, and much more.

The center is powered by “infinite computing,” a term Bass used to describe the ongoing progression of computing from a scarce and expensive resource toward one that is plentiful and free.

For more than 60 years, the exponential growth of computing power has continued non-stop.

Just three or four decades ago, if you wanted access to a thousand processor cores, you would have needed to be the Chairman of MIT’s Computer Science Department or the US Secretary of Defense.

Today, the average chip in your smartphone can perform billions of calculations per second.

These continuing advances make clear just how outdated our old thinking about computing is.

As Bass put it: “Historically, we used to think of computing as this scarce resource… today it is abundant, heading towards a new era of overabundance.”

In the old computing-scarcity way of doing design, humans used computing to calculate very specific, restricted scenarios. This is effectively thinking “in the box,” with our thinking limited by computing scarcity.

“But in the new exponential era,” explains Bass, “powered by near infinite computing, we can now simultaneously apply 10,000 CPUs to the same problem, exploring thousands of possible solutions, solving and optimizing in seconds.”
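Bass’s idea of throwing thousands of processors at one design problem can be sketched as a parallel parameter sweep. In the sketch below, `simulate()` and the candidate grid are hypothetical stand-ins for a real CAD simulation, and a local thread pool stands in for the 10,000 cloud CPUs:

```python
# Hypothetical sketch of "exploring thousands of possible solutions" at once:
# score every candidate design concurrently, then keep the best one.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def simulate(params):
    """Stand-in for an expensive design simulation; lower score is better."""
    motor_position, thickness_mm = params
    # Toy cost surface with its optimum at position 0.3, thickness 5.0 mm.
    return (motor_position - 0.3) ** 2 + (thickness_mm - 5.0) ** 2

# Thousands of candidate designs: motor positions x material thicknesses.
candidates = list(product(
    [i / 100 for i in range(100)],     # motor position along the chassis (0..0.99)
    [t / 10 for t in range(10, 100)],  # material thickness in mm (1.0..9.9)
))

# In Bass's vision this fans out across 10,000 cloud CPUs; here a local
# thread pool plays that role.
with ThreadPoolExecutor(max_workers=32) as pool:
    scores = list(pool.map(simulate, candidates))

best = candidates[scores.index(min(scores))]
print(f"Best of {len(candidates)} designs: {best}")
```

The structure is the point: once evaluations are independent, adding more machines shrinks the wall-clock time, which is exactly what “near infinite computing” buys you.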

So, what does all this additional computing power buy you?

For starters, an entirely new way to approach innovation. And the ability to train human-surpassing AI models.


There is little question that the field of AI has seen massive progress in the past 5 years.

One need only look at the incredible achievements of DeepMind’s AlphaZero and AlphaFold, and OpenAI’s GPT-3, to understand that something remarkable is happening.

So, why are we seeing this extraordinary progress in AI now, rather than in the decades since the field was founded in the 1950s?

One reason is the massive amount of data now available.

The second reason is the sheer amount of computational power now accessible via clouds run by Google, IBM, Microsoft, and Amazon. The amount of near-infinite computing that AI labs like Facebook and OpenAI can throw at a problem is staggering.

For example, to train GPT-3 (one of the largest and most advanced language models to date), the OpenAI team required 3.14 × 10²³ floating-point operations.

This would take roughly 355 years on a single NVIDIA Tesla V100, one of the highest-performing GPUs of its era.
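The 355-year figure follows from simple arithmetic, assuming the V100 sustains roughly 28 teraFLOPS on this workload (an assumed effective throughput for illustration, not an official benchmark):

```python
# Back-of-the-envelope: how long would GPT-3's training take on one GPU?
TOTAL_FLOPS = 3.14e23          # floating-point operations to train GPT-3
V100_FLOPS_PER_SEC = 28e12     # assumed sustained throughput (~28 TFLOPS)
SECONDS_PER_YEAR = 365 * 24 * 3600

years = TOTAL_FLOPS / (V100_FLOPS_PER_SEC * SECONDS_PER_YEAR)
print(f"~{years:.0f} years on a single V100")
```

Spread across thousands of GPUs in a cloud data center, that same workload compresses from centuries into weeks.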

That is the level of computing needed to train AIs that can generate code, write stories, animate, and even commentate cricket matches.

During my conversation with Bass, to further illustrate the power of brute-force computation, he told me a story about an electric go-kart he was building with his son:

“In the near future, with infinite computing, I could ask the cloud to run design simulations, experimenting with every possible location for the motor and a range of different materials and thicknesses, resulting in not just an adequate design, but the best design.”

Now couple this with advanced AI models trained on infinite computing. One can imagine a Jarvis-like software where you simply speak to the software and designs are developed and optimized in real-time.

In the future, Bass may only need to say: “Jarvis, show me 3 optimal designs for an electric go-kart sized for children.”

The rest will be carried out by GPT-3 (and eventually GPT-4), simulation software, and infinite computing.

But what happens when the way we compute itself fundamentally changes?


Quantum computing puts this whole paradigm on turbocharge.

It uses the inherent properties of the quantum world to radically speed up certain types of computation.

At the quantum scale, a quantum bit (or qubit) can exist in a combination of the states 0 and 1 at the same time, a concept called quantum superposition.

As a result, computers that exploit superposition (along with entanglement and interference) can explore the solution space of certain problems dramatically, and in some cases exponentially, faster.
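Superposition can be made concrete with a tiny single-qubit sketch (my own illustration, using only the standard library): a qubit’s state is a pair of amplitudes for 0 and 1, and the Hadamard gate turns a definite state into an equal superposition.

```python
# Minimal single-qubit model: a state is a pair of amplitudes (a, b) for
# |0> and |1>; measurement probabilities are |a|^2 and |b|^2.
import math

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)               # qubit prepared in the definite state |0>
superposed = hadamard(zero)     # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = tuple(abs(x) ** 2 for x in superposed)
print(probs)                    # a 50/50 chance of measuring 0 or 1
```

A real quantum computer manipulates many such amplitudes at once, across all its qubits, which is where the speedups come from; this sketch only shows the single-qubit bookkeeping.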

Many classes of problems, such as path optimization, network analysis, molecular simulation, and, of course, machine learning optimization, can be significantly sped up with quantum computing.

For example, in 2019 Google used a 53-qubit processor to solve a narrowly defined benchmark problem in 200 seconds, a task Google estimated would take the world’s most powerful supercomputer 10,000 years.
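Taking those two numbers from the claim at face value, the implied speedup is easy to compute:

```python
# How large is the claimed speedup? Compare 10,000 years to 200 seconds.
SECONDS_PER_YEAR = 365 * 24 * 3600
classical_seconds = 10_000 * SECONDS_PER_YEAR
quantum_seconds = 200

speedup = classical_seconds / quantum_seconds
print(f"~{speedup:.1e}x speedup")  # on the order of a billion-fold
```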

That’s the power of quantum computation.

And recent developments are rapidly pushing quantum computing out of specialized high-tech labs and toward everyday settings. For instance, Quantum Brilliance is developing a synthetic diamond-based, room-temperature quantum computer the size of a lunchbox. (Most quantum computers must be kept near absolute zero and are the size of server racks.)

With continued advances in quantum computing, we’ll see an explosion of information-processing capability.

For example, IBM expects to reach 1,121 qubits with its Condor processor by 2023, a more than 17-fold increase over its largest processor today. IBM sees 2023 as the inflection point for the commercialization of quantum technology.
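As a sanity check on that scale-up, assume IBM’s largest processor today is the 65-qubit Hummingbird (my assumption for the comparison, not a figure from this article):

```python
# Compare the planned Condor processor to an assumed 65-qubit baseline.
current_qubits = 65       # assumed: IBM Hummingbird
condor_qubits = 1_121     # IBM's planned Condor processor

fold_increase = condor_qubits / current_qubits
print(f"Condor would be a ~{fold_increase:.0f}x jump in qubit count")
```

Note that raw qubit count is only one axis of progress; error rates and connectivity matter just as much for useful computation.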

And once quantum computing merges with the infinite computing and machine learning infrastructure developed in the last decade, the sky is truly the limit.


Infinite computing unlocks endless creative possibilities for entrepreneurs.

Rather than needing to build the computation infrastructure from scratch, you can solve problems directly using the cloud.

Infinite computation has already paid dividends in one especially important class of problems: training AIs. And those AIs now match or exceed human performance in several narrow domains.

Imagine the world we can build when computation becomes quantum, AIs are the new interface to the computer, and computation is endless.

Infinite computing demonetizes error-making and democratizes experimentation. No longer will we have to immediately dismiss crazy ideas for the waste of time and resources they invariably incur.

Soon we’ll be able to try them all.

This is the world of abundant computation.

If you’re an entrepreneur excited about the developments we’ve made so far with advanced computation, there’s no better time to get involved.


Would you like to learn about and leverage the latest developments in computation to transform your business?

Then consider joining my year-round Abundance360 Mastermind and Executive Program and come to our in-person A360 Summit February 2–4, 2022.

My mission is to help A360 members obtain mastery in four specific mindsets: an Abundance Mindset; an Exponential Mindset; a Longevity Mindset; and a Moonshot Mindset. Together we will actively select and reinforce your preferred Mindsets.

To learn more and apply to A360, visit



Peter Diamandis

Passionate about innovation and creating a world of Abundance. Companies: XPRIZE, Singularity University, Planetary Resources, Human Longevity Inc.