I'm trying to find a way to prevent performance degradation in my PostgreSQL database.
The table receives 100,000 - 200,000 INSERTs every minute. Every 5 minutes it also receives a similar volume of aggregating UPDATEs and a number of DELETEs. The aggregations and DELETEs are designed to keep the table at around 10 million records.
Apparently the updates are too frequent for AUTOVACUUM to keep up, so I started running VACUUM manually after each DELETE series. That improved behaviour a lot.
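For reference, the manual step I run is roughly the following (the table name `metrics` is a placeholder for my actual table):

```sql
-- Run after each DELETE series: reclaims dead tuples for reuse
-- and refreshes planner statistics in one pass.
-- "metrics" stands in for the real table name.
VACUUM (ANALYZE) metrics;
```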
I suspect periodic reindexing would also help, but I haven't verified it yet. The fact that PostgreSQL has a dedicated REINDEX command suggests it is needed in situations like this.
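If I do try reindexing, something like this is what I have in mind (`metrics` and `metrics_ts_idx` are placeholder names; the CONCURRENTLY option requires PostgreSQL 12 or later):

```sql
-- Plain REINDEX takes a lock that blocks writes to the table;
-- on PostgreSQL 12+ the CONCURRENTLY option rebuilds without
-- blocking concurrent INSERTs/UPDATEs/DELETEs.
REINDEX TABLE CONCURRENTLY metrics;

-- Or rebuild a single index instead of all of them:
REINDEX INDEX CONCURRENTLY metrics_ts_idx;
```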
I have played with different fillfactor settings for the indexes, with no visible difference.
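By "played with fillfactor" I mean changing the index storage parameter and rebuilding, along these lines (index name is again a placeholder):

```sql
-- Leave 30% free space on index pages so future entries can be
-- placed without page splits. The new setting only applies to
-- newly built pages, so a REINDEX is needed to apply it everywhere.
ALTER INDEX metrics_ts_idx SET (fillfactor = 70);
REINDEX INDEX metrics_ts_idx;
```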
What else should I consider to keep performance stable? Let's say that slow but stable over time is better than sometimes fast and sometimes very slow.
UPD: VACUUM ANALYZE after each DELETE series plus REINDEX every 6 hours solved the issue.
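The VACUUM ANALYZE runs from the loading job right after the deletes; the 6-hourly reindex is just a cron entry, roughly like this (database and table names are placeholders):

```
# Hypothetical crontab entry: rebuild indexes every 6 hours,
# offset a few minutes from the busiest moment of the cycle.
# Requires PostgreSQL 12+ for the CONCURRENTLY option.
5 */6 * * *  psql -d mydb -c 'REINDEX TABLE CONCURRENTLY metrics;'
```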