Reasons for the Max Supply of 10 billion Algo?

Hello,

As a first-time poster, I wanted to start by saying how impressed I am with Algorand and its proposed solution to the blockchain trilemma.

However, since I wasn’t able to find this information on my own, I also wanted to ask: what was the thought process and reasoning behind the decision to limit the max supply of Algo to 10 billion (or to any fixed number, for that matter)?

Or was it due to some technical limitation of the Algorand protocol that doesn’t allow unlimited supply?

As someone who experimented with trading Bitcoin ~10 years ago (when its price was less than $5/BTC), I always thought that Bitcoin’s biggest flaw (economic, not technical) was its intentionally deflationary design, with the number of bitcoins capped at 21 million. If it is intended to be a currency of the future that people use for actual economic activity (rather than just for investment or short-term speculation), this is the wrong way to go, in my opinion.

Some other competing cryptocurrency/smart contract protocols, such as Ethereum, do not cap the maximum number of Ether that can exist.

So, while I prefer Algorand for its more advanced mathematical and technical solutions compared to other cryptocurrencies, I’m also betting that cryptocurrencies with inflationary (rather than deflationary) designs will prevail in the long run.

Is there any way to reconcile these two? A protocol with Algorand’s Pure Proof of Stake combined with an inflationary economic design?

Best,

Tebok

EDIT: Fixed typos.


Tebok,

The 10 billion number is not set in stone. On the Algorand blockchain, a protocol upgrade can increase that amount, or introduce a mechanism to “mint” additional coins.

I’m sure that you’re well aware of the reported individuals who lost their Bitcoin account keys… which makes those funds “burned”. That by itself makes any pre-minted blockchain deflationary.

On the Algorand blockchain implementation, Algo amounts are stored as uint64 values, which have a range of 0…2^64-1. Each Algo contains 1 million uAlgos, so the fractional part takes ~20 bits. At this time, only the first 54 bits are used (i.e. 10B Algos * 1M uAlgos = 10^16). So, from a technical perspective, there are 10 extra bits for growth… which leaves lots of room for future decision making.
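For anyone who wants to verify the arithmetic, here is a quick standalone Go sketch (an illustration only, not Algorand source code; the constant names are made up):

```go
package main

import (
	"fmt"
	"math"
	"math/bits"
)

func main() {
	const microAlgosPerAlgo = 1_000_000                            // 1 Algo = 10^6 uAlgos
	const maxSupplyAlgos = 10_000_000_000                          // 10 billion Algo minted at genesis
	const maxSupplyMicroAlgos = maxSupplyAlgos * microAlgosPerAlgo // 10^16 uAlgos

	fmt.Println(bits.Len64(microAlgosPerAlgo))        // 20 bits for the fractional part
	fmt.Println(bits.Len64(maxSupplyMicroAlgos))      // 54 bits to hold the full supply
	fmt.Println(64 - bits.Len64(maxSupplyMicroAlgos)) // 10 spare bits in a uint64
	fmt.Println(math.MaxUint64 / maxSupplyMicroAlgos) // supply could grow ~1844x before overflow
}
```

It prints 20, 54, 10 and 1844: the whole supply in uAlgos fits comfortably in a uint64, with room to grow roughly 1844x before overflow.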

Note that at the current time it wouldn’t make any sense, since only a (small) portion of the Algos has been released to the market. The Algorand Foundation has a clear roadmap, so I wouldn’t expect to see any change before then… but I could be wrong.


Following what @tsachi said:

I’m sure that you’re well aware of the reported individuals who lost their Bitcoin account keys… which makes those funds “burned”. That by itself makes any pre-minted blockchain deflationary.

I would like to add some purely personal, speculative thoughts on the general question of what the implications are of having a fixed total supply within a Proof of Stake consensus, which is very different from having a fixed supply in Proof of Work, whose consensus mechanism is completely orthogonal to users’ stake.

In my opinion, for fixed-supply Proof of Stake blockchains such as Algorand, a sort of “ecosystem usucaption” managed at the consensus layer may make sense.

Let me explain: the supply is fixed, which already makes the asset intrinsically deflationary. If we also take into account that, on a time scale of centuries, thousands of private keys could be lost, there will end up being an unnecessarily immobilized supply that at a certain point becomes an “opportunity cost” for the entire ecosystem, in the sense that if it were available for economic activity, the entire ecosystem would benefit.

I argue that, in a PoS with a fixed total supply, if the technology has to be robust even on the scale of centuries (anticipating that between now and then many stakes will be stuck due to lost private keys), stakes on dead public keys, with 0 transactions in 100 years, are a burden on the ecosystem as a whole.

So, in my opinion, rather than having an “uncapped supply”, it would make more sense for stakes immobilized on public keys with no economic activity in the ecosystem for more than a century to end up in the Rewards pool and re-enter circulation, according to whatever methods best maintain the liveness of the ecosystem. Otherwise, on a time scale of centuries, the risk is the “thermal death of the universe”, when entropy is at its maximum and everything is immobile.

Those stakes could be put back into circulation according to mechanisms that encourage the most virtuous behaviours for the whole ecosystem (more nodes, more decentralization, more votes, etc.), and in the end have a positive balance compared to the only role they could otherwise play: an additional deflationary thrust on an asset that is already deflationary by design.
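To make the idea concrete, here is a purely illustrative Go sketch of what such a dormancy rule could look like (entirely hypothetical; the `Account`, `DormancyWindow`, and `sweepDormant` names are my own, and nothing like this exists in the Algorand protocol):

```go
package main

import (
	"fmt"
	"time"
)

// Hypothetical dormancy rule, for illustration only: an account whose
// last transaction is older than the window has its stake swept into
// a rewards pool, from which it can re-enter circulation.
const DormancyWindow = 100 * 365 * 24 * time.Hour // roughly a century

type Account struct {
	Address    string
	MicroAlgos uint64
	LastTxn    time.Time // time of the account's last transaction
}

// sweepDormant moves the balance of every account that has been idle
// longer than DormancyWindow into the rewards pool.
func sweepDormant(accounts []*Account, rewardsPool *uint64, now time.Time) {
	for _, a := range accounts {
		if now.Sub(a.LastTxn) > DormancyWindow && a.MicroAlgos > 0 {
			*rewardsPool += a.MicroAlgos
			fmt.Printf("reclaiming %d uAlgos from %s\n", a.MicroAlgos, a.Address)
			a.MicroAlgos = 0
		}
	}
}

func main() {
	pool := uint64(0)
	accounts := []*Account{
		{Address: "DORMANT1", MicroAlgos: 5_000_000, LastTxn: time.Now().AddDate(-120, 0, 0)},
		{Address: "ACTIVE1", MicroAlgos: 1_000_000, LastTxn: time.Now()},
	}
	sweepDormant(accounts, &pool, time.Now())
	fmt.Println("rewards pool (uAlgos):", pool) // only DORMANT1's stake is swept
}
```

Of course, the hard part is not the sweep itself but doing it safely at the consensus layer, i.e. distinguishing truly lost keys from very patient long-term holders, which is exactly the governance question raised above.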

Thank you sir, for the very helpful reply. The technical details you shared are enough to convince me.

I had read the Algorand Foundation’s roadmap too, and my original post was inspired by the following statement by the Foundation (emphasis is theirs):

  • At genesis of the Algorand blockchain, 10Bn Algo was minted and this 10Bn Algo represents the fixed and immutable maximum supply of the Algo.

I was also worried when it mentioned that the original distribution schedule was going to distribute all 10 billion Algo by 2024. Even after being changed to 2030, that seemed like far too fast a schedule, if it had indeed meant that 100% of the Algo that could ever exist would be distributed by then.

To further clarify my concern in as simplified a way as I can (and this isn’t specific to Algorand, btw): if a vast majority of a specific cryptocurrency is put in the hands of a very few early adopters, and said crypto were to see mass adoption after most of the distribution schedule had run out (driving up the price in the process), it could create excessive wealth inequality between the early adopters and everyone else, which is not a healthy ecosystem to have.

In fact, I would argue that cryptocurrencies whose designs don’t consider the scenario above (or worse, ones that were intentionally designed that way) are probably not the ones that will see mass adoption when society is ready for mass-adopted cryptocurrencies in general.

But of course, if the maximum number of Algo is not set in stone as you said, I will trust that Algorand can adapt to future conditions as needed, and I will certainly continue to be an Algorand advocate based on the technical details alone.

EDIT: I just realized I called you a “sir” without having any knowledge of gender. How thoughtless of me to make such an assumption…

It’s all software, so it’s not set in stone, in the same way that Bitcoin’s 21 million (well, actually 2,100,000,000,000,000 satoshis) isn’t set in stone. It ‘could’ be changed, but would it, or will it? Highly unlikely. It would be a very significant forking event and would invalidate the entire purchasing/pricing/use model for the vast majority of its holders.
Algorand changing this would be a similar event in my mind.


It would be interesting to hear the Algorand Foundation’s comment on this too, because if in their minds it is already “set in stone” (and unlikely to be changed before or after 2030), then we’re back to my original question on, or a quest for, a cryptocurrency protocol with Algorand’s Pure Proof of Stake combined with inflationary economics…