How to optimize Postgresql max_connections and node-postgres connection pool?

In order to support more than 5k requests while maintaining the same response rate, you'll need better hardware...

The simple math: 5,000 requests × 190 ms average = 950,000 ms of DB work, divided across 16 cores ≈ 59,000 ms per core, which basically means your system was heavily loaded.
(I'm guessing you had some spare CPU, as some time was lost on networking.)

Now, the really interesting part of your question comes from the scale-up attempt: m4.10xlarge (160 GB memory, 40 vCPUs).
The drop in CPU utilization indicates that the scale-up freed up DB time resources, so you need to push more requests!
Two suggestions:

  • Try increasing the connection pool to max: 70 and watch the network traffic (depending on the amount of data, you might be saturating the network).
  • Also, are your requests to the DB asynchronous on the application side? Make sure your app can actually push more requests concurrently.
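To illustrate the second point, here is a minimal sketch of issuing queries concurrently instead of one at a time. `simulateQuery` is a stand-in I've added so the example runs without a live database; with node-postgres you would call `pool.query(...)` the same way, and the pool's `max` setting caps how many queries actually hit the DB at once.

```javascript
// Stand-in for pool.query('SELECT ...') so the sketch runs standalone.
const simulateQuery = (id) =>
  new Promise((resolve) => setTimeout(() => resolve({ id }), 10));

// Serial: each query waits for the previous one — the pool sits idle.
async function serial(n) {
  const results = [];
  for (let i = 0; i < n; i++) {
    results.push(await simulateQuery(i)); // one request in flight at a time
  }
  return results;
}

// Concurrent: all n queries are in flight at once; with a real Pool,
// max (e.g. 70) bounds how many run against the database concurrently.
async function concurrent(n) {
  return Promise.all(Array.from({ length: n }, (_, i) => simulateQuery(i)));
}
```

If the application awaits each query before issuing the next, raising the pool size changes nothing — the concurrent pattern is what lets the bigger instance absorb more load.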

The best way is to use a separate Pool for each API call, based on the call's priority:

const { Pool } = require('pg'); // node-postgres

const highPriority = new Pool({ max: 20 }); // for high-priority API calls
const lowPriority = new Pool({ max: 5 });   // for low-priority API calls

Then you just use the right pool for each API call, for optimum service/connection availability.
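The routing itself can be as simple as a small lookup helper. The `poolFor` function below is a hypothetical name I've introduced for illustration, and the two pool objects are plain stubs so the sketch runs standalone; in the app they would be the real `highPriority` / `lowPriority` Pools from above.

```javascript
// Stubs standing in for the node-postgres Pools declared earlier.
const highPriority = { name: 'high', max: 20 };
const lowPriority = { name: 'low', max: 5 };

// Hypothetical helper: pick the Pool that matches a call's priority.
function poolFor(priority) {
  return priority === 'high' ? highPriority : lowPriority;
}

// With real Pools, a handler would then do e.g.:
//   const { rows } = await poolFor('high').query('SELECT ...');
```

This way a burst of low-priority traffic can exhaust at most 5 connections, leaving the high-priority pool's 20 connections free.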