Questions about smart contract application

I have run into the limit of at most 10 smart contract apps per account while testing on TestNet. The upper bound of 10 is too small for our project; we need many more smart contract apps than that.

I am wondering about the maximum number of smart contract apps that one Algorand account can create on MainNet. I hope the answer is greater than 10.

TestNet and MainNet have exactly the same consensus parameters.
You can see the current parameters here:

These parameters can be changed by a consensus upgrade if enough stake agrees to it.

Can you use multiple addresses in your application?

Note that you can easily create many addresses controlled by the same secret key: use a 1-out-of-2 multisig with one key being your real public key and the second being a random public key whose secret key nobody knows. You can alternatively use rekeying for that purpose.

I’d like to help and provide constructive advice, but I don’t have enough information to go by. Could you provide more information about what you’re trying to achieve, so that we can understand where the need originates?
( i.e. would 11 applications be enough ? )

Thanks for your reply.

First, could you please explain what ‘multiple addresses’ means in your solution? My understanding is that they are the public addresses of the account holding the smart contract application. Is that right?

Besides, I cannot find out how to generate multiple public account addresses using one secret key. Could you please explain this topic in more detail?

Thank you for the help. I will briefly explain our case here.

We are using the global state of the smart contract application to record some useful information, where the information is provided by the users opted into the smart contract. We may have thousands of opted-in users, but the maximum number of smart contract apps limits the capacity of our record.

I am wondering whether there are better ways to achieve our goal, or whether there is another option in smart contracts to meet the requirement of large-scale information storage.

For 1 smart contract, you can have thousands of people opt in and store local storage data on those users. You are limited to 16 key-value pairs (64 bytes) of data per user, and this data gets stored in the ledger under the individual user’s account. The 10 limit is how many contracts one account can create. Here is a description of multisig accounts: Algorand Developer Docs
You can also rekey any account to use another private key: Algorand Developer Docs
Algorand Rekeying - YouTube

Would it be possible for you to store the user-specific data in the local storage ( i.e. per user ), and only have the common data stored in the global storage ? ( i.e. as @JasonW suggested ).

Also, I believe that a single app call can access more than a single account data at a time, so you could “copy” data items between them.

Thanks tsachi and JasonW for the help.

I think my current situation is that I have some large common information to store in the smart contract, and meanwhile I am implementing search functionality over the global states of smart contract apps. Therefore, storing everything in local storage is probably not the best solution.

I am eager to hear your ideas on expanding the number of global states, or expanding the storage capacity of a smart contract without relying on local storage.

@yfmao, the storage limits that have been imposed were designed to ensure high transaction throughput while providing flexible smart contract functionality.

With that, I can see several approaches that could be a good fit for you ( based on the information you’ve provided so far ) -

  • Avoid storing the large common information in a global state. Instead, store the hash of that information, while keeping the actual information in third-party storage. There is a notary-like service that uses Algorand in a similar fashion.
  • Deploy your own co-chain. On your own chain, you’ll have no issue re-configuring the above parameters to your needs. Having a private network is a bigger commitment, though.
  • Find a creative way to avoid needing the large common information. I don’t have enough data about your application to guide you here. ;-(

As for increasing the number of global states, I’m unaware of any plan around that. But from what you’ve said, a minor increase ( like 16 -> 24 ) would not really be meaningful for you anyway. You are looking for 16 -> 32K, which is something that would definitely lower the transaction throughput.

Btw - given that TEAL doesn’t have any loops, how were you planning to iterate over keys ?

@tsachi Thanks for your reply. I have figured out this problem with your assistance.

I have another quick question about smart contract testing. I need around 100 accounts opted into the smart contract during my testing process. Do you have any ideas on how to create 100 accounts quickly? Or is it possible to get this number of testing accounts from you developers?

It is pretty trivial to use the JS SDK to generate any number of accounts in a hurry - this is some older code I used at one point:

const algosdk = require('algosdk');
const fs = require('fs');

// Generate 30 throwaway accounts; print each address and append
// "address,mnemonic" rows to a log file for later use.
for (let i = 0; i < 30; i++) {
    const account = algosdk.generateAccount();
    console.log(account.addr);
    const row = account.addr + ',' + algosdk.secretKeyToMnemonic(account.sk);
    fs.appendFile('new_accounts.log', row + '\n', (err) => {
        if (err) {
            console.error(err);
        }
    });
}
Thanks Tim for your reply. Your solution works well for constructing the testing accounts.

I am facing a new problem sending transactions to the PureStake TestNet endpoint. It always gives me the error message “Too many requests”. Although my code sends requests very slowly (around 5 - 7 requests to the server per minute), with lots of time.sleep between requests, I am still getting these rate-limit errors.

Any possible solution for this kind of error?

A free account has 5 requests per second to Algod and 1 request per second to Indexer - and it resets each second. There are also some daily caps. Can you DM me the email you used to register and I’ll look at the account and logs.
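One common way to stay under a per-second quota like this is to gate every outgoing call through a client-side sliding-window limiter instead of fixed sleeps. The sketch below is framework-agnostic and illustrative (the class and its parameters are my own names, not part of any SDK); it assumes the free tier's 5 requests per second to Algod:

```javascript
// Sliding-window limiter: allows at most `maxPerWindow` calls per `windowMs`.
class RateLimiter {
  constructor(maxPerWindow, windowMs = 1000) {
    this.maxPerWindow = maxPerWindow;
    this.windowMs = windowMs;
    this.timestamps = [];  // send times inside the current window
  }

  // Returns 0 if a request may be sent now, else the ms to wait first.
  // `nowMs` is injectable so the logic is easy to test deterministically.
  delayBeforeSend(nowMs = Date.now()) {
    // Drop timestamps that have fallen out of the window.
    this.timestamps = this.timestamps.filter((t) => nowMs - t < this.windowMs);
    if (this.timestamps.length < this.maxPerWindow) {
      this.timestamps.push(nowMs);
      return 0;
    }
    // Wait until the oldest in-window send expires.
    return this.timestamps[0] + this.windowMs - nowMs;
  }
}

// Gate every Algod call through a limiter matching the free tier's 5 req/s.
const algodLimiter = new RateLimiter(5);
console.log(algodLimiter.delayBeforeSend());
```

Before each request, call `delayBeforeSend()` and sleep for the returned number of milliseconds; this smooths out bursts that fixed sleeps can miss (e.g. several calls issued back-to-back inside the same second).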

Thanks Tim for your reply.

I have a quick question about account capacity. I am using a TestNet account, which has a limited request capacity per second. I am wondering, if I switch to MainNet, will the account’s request capacity be better than the TestNet account’s?

The throughput limits are the same for free accounts; MainNet is also capped on total requests per day.

See also:

Thanks @Tim and @fabrice.

I am interested in the Pro account and I would like to know its Indexer request limit. Does it have the same Indexer request capacity as free accounts, i.e. 1 request per second?

It’s 5 requests per second for Indexer under the Pro account.

Thank you @Tim .

In my current situation, my prospective users send Indexer requests to the PureStake server, but the limit of 5 Indexer requests per second is a problem.

I am thinking about giving each user a new PureStake API key, so that the pressure on each key would greatly decrease. I would like to know if PureStake can generate temporary API keys for users who are not registered with PureStake. Something that automatically registers users and sends them an API key would also be acceptable.

Also, is it possible to have an Indexer request limit higher than 10 requests per second?

I think it would be beneficial if you described the end goal rather than the intermediate needs.

Allocating multiple API keys sounds like a way to work around a limitation that was placed on purpose.

Could you describe what queries you would need and what the expected query rate is? You might find that your application would “justify” its own private indexer.