Thoughts on running a node and indexer using a Raspberry Pi 4 (8 GB RAM) and a 4 TB HDD

I was inspired by this guide explaining how to run a participation node on a Pi.

In addition to the node, I want to install the Indexer, which will take up the majority of the space on my hard drive.

As of the end of July 2021, storing all the raw blocks in MainNet is about 609 GB and the PostgreSQL database of transactions and accounts is about 495 GB. Much of that size difference is the Indexer ignoring cryptographic signature data; relying on algod to validate blocks. Dropping that, the Indexer can focus on the ‘what happened’ details of transactions and accounts. - Indexer Repo

Indexer Storage Requirements

Raw blocks in MainNet: ~700 GB
PostgreSQL database of transactions and accounts: ~500 GB
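
For reference, here is roughly how I plan to keep an eye on those numbers once things are running: a small Python sketch that reports the size of the algod data folder and the Indexer's Postgres database. The data directory path and database name below are guesses and would need to match the actual install.

```python
import os
import subprocess

# Assumed locations -- adjust for your own install.
ALGOD_DATA_DIR = "/var/lib/algorand"   # algod data folder (raw blocks live here)
INDEXER_DB = "indexer_db"              # PostgreSQL database used by the Indexer

def dir_size_gb(path: str) -> float:
    """Walk a directory tree and return its total size in GB."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total / 1e9

def postgres_db_size(dbname: str) -> str:
    """Ask Postgres for the pretty-printed size of the Indexer database."""
    out = subprocess.run(
        ["psql", "-At", "-d", dbname,
         "-c", f"SELECT pg_size_pretty(pg_database_size('{dbname}'));"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    print(f"algod data dir: {dir_size_gb(ALGOD_DATA_DIR):.1f} GB")
    print(f"indexer database: {postgres_db_size(INDEXER_DB)}")
```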

I figure a 2 TB portable SSD can handle the Indexer along with the node, Docker, Postgres, etc.
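
As a rough sanity check on the 2 TB figure, I would extrapolate from two size measurements taken some days apart rather than trust a single snapshot. The numbers in the example call are placeholders for illustration only:

```python
def days_until_full(size_now_gb: float, size_before_gb: float,
                    days_between: float, capacity_gb: float) -> float:
    """Linear extrapolation of storage growth until the drive fills up."""
    daily_growth = (size_now_gb - size_before_gb) / days_between
    if daily_growth <= 0:
        return float("inf")
    return (capacity_gb - size_now_gb) / daily_growth

# Placeholder figures only -- substitute your own measurements.
print(days_until_full(size_now_gb=1200, size_before_gb=1104,
                      days_between=30, capacity_gb=2000))
```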

Performance Concerns

I understand that it would take a month for a full archival node to catch up, which isn't a problem for my use case. However, once it is caught up, I do need the Indexer's data to be close to real-time.

Would a Raspberry Pi 4 with 8 GB RAM (Broadcom BCM2711, quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5 GHz) be able to write new transactions to the Indexer's database quickly enough?
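
Once it's caught up, I could at least measure the lag directly by polling algod and the Indexer and comparing rounds. A rough sketch follows; the host, ports, and token path assume a fairly default local install and may well differ from the actual deployment:

```python
import json
import time
import urllib.request

# Assumed local endpoints -- adjust host/port/token for your deployment.
ALGOD_URL = "http://localhost:8080"
ALGOD_TOKEN = open("/var/lib/algorand/algod.token").read().strip()
INDEXER_URL = "http://localhost:8980"

def get_json(url: str, headers: dict = None) -> dict:
    req = urllib.request.Request(url, headers=headers or {})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

while True:
    # algod reports the latest round it has committed.
    node_round = get_json(f"{ALGOD_URL}/v2/status",
                          {"X-Algo-API-Token": ALGOD_TOKEN})["last-round"]
    # The Indexer's /health endpoint reports the round it has ingested.
    indexer_round = get_json(f"{INDEXER_URL}/health")["round"]
    print(f"node at {node_round}, indexer at {indexer_round}, "
          f"lag = {node_round - indexer_round} rounds")
    time.sleep(30)
```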

Thank you

2 TB of SSD will most likely not be enough for long.
See the current sizes of the databases here:

Thanks for sharing @fabrice. Alternatively, I could use a 4 TB HDD. A 4-bay disk station would allow me to add drives as the storage requirements continue to rise.

Is this a viable option, or would read/write speed become an issue?
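
To get a feel for whether the HDD could keep up, I could crudely time random 4 KiB writes on the mounted drive, since the ledger and Postgres do lots of small writes rather than one big sequential stream. This is only a rough sketch; a real benchmarking tool like fio would be better, and the mount point is a placeholder:

```python
import os
import random
import time

MOUNT_POINT = "/mnt/external"      # assumed mount point of the drive under test
TEST_FILE = os.path.join(MOUNT_POINT, "io_test.bin")
FILE_SIZE = 256 * 1024 * 1024      # 256 MiB test file
BLOCK_SIZE = 4096                  # 4 KiB blocks, similar to database page writes
WRITES = 20000

# Pre-allocate the test file.
with open(TEST_FILE, "wb") as f:
    f.truncate(FILE_SIZE)

# Time random 4 KiB writes; fsync before stopping the clock so the flush
# to the drive is included in the measurement.
buf = os.urandom(BLOCK_SIZE)
fd = os.open(TEST_FILE, os.O_WRONLY)
start = time.time()
for _ in range(WRITES):
    os.lseek(fd, random.randrange(0, FILE_SIZE - BLOCK_SIZE), os.SEEK_SET)
    os.write(fd, buf)
os.fsync(fd)
elapsed = time.time() - start
os.close(fd)
os.remove(TEST_FILE)

print(f"{WRITES} random 4 KiB writes in {elapsed:.1f}s "
      f"({WRITES * BLOCK_SIZE / elapsed / 1e6:.1f} MB/s)")
```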

An HDD will definitely not work for the MainNet data folder.
It may or may not work for the Indexer, but I would not recommend it.