Handling Huge Incoming Traffic


We are on v12 and dealing with a high traffic load on a daily basis, around hundreds of requests per second. These are webhooks from third-party sites writing Sales Orders etc. into our ERPNext instances.

The system slows down, times out, and throws errors when we switch the webhook on, and goes back to normal when we switch it off.

We have 8 vCPUs and 16 GB RAM, and upgraded to 32 GB RAM, but it doesn't help. We are also using background workers to process incoming requests.
After the server upgrade, there is still plenty of free RAM, but the system hangs under the incoming traffic load.

Could anyone give some ideas on configuring nginx, or anything else?


It seems you have eliminated RAM as an issue, so it is either:

  • A storage bottleneck: with the high number of requests you end up having too many read/write requests to storage. May I ask what type of storage you have: SSD or HDD?
  • CPU related: 8 vCPUs is a good allocation, but it still has limits on concurrency. Each core can only process one request at any moment, so you can handle at most 8 requests concurrently; actual throughput per second then depends on how long each request takes to process. You first need to find out how many requests you expect to receive per second and design accordingly.
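To make the CPU point above concrete, here is a back-of-envelope sketch; the 200 ms per-request latency is an assumed figure for illustration, not a measurement from this deployment:

```python
# With N synchronous workers, each handling one request at a time,
# sustained throughput is bounded by workers / average_request_time.

def max_requests_per_second(workers: int, avg_request_seconds: float) -> float:
    """Upper bound on sustained requests/second for synchronous workers."""
    return workers / avg_request_seconds

# Assumed example: 8 workers (one per core), 200 ms per Sales Order write.
print(max_requests_per_second(8, 0.2))  # -> 40.0, far below 1000 req/s
```

This is why reducing per-request processing time (or deferring work to background jobs) matters as much as adding cores.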

It may be advisable to segregate your database using a hosted database solution, so your server is dedicated to handling web requests and running the backend processes.

Redis caching helps a lot, so make sure you have it enabled and configured properly.
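In a standard bench setup, the Redis endpoints are declared in `common_site_config.json`; a typical fragment looks something like this (the ports shown are the bench defaults and may differ on your install):

```json
{
  "redis_cache": "redis://localhost:13000",
  "redis_queue": "redis://localhost:11000",
  "redis_socketio": "redis://localhost:12000"
}
```

If any of these point at a missing or overloaded Redis instance, caching and queueing silently degrade.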

May I ask what your typical RAM and CPU usage percentages are, both at idle and at peak time?

I’m using SSD storage with 16 GB RAM and 8 vCPUs on DigitalOcean. I upgraded to 32 GB of RAM, but it didn't help much; the system still slows down even though there is still a lot of available RAM.

My config: 2 gunicorn workers, 5 background workers.

I’m expecting to serve 1k incoming requests per second smoothly. Any recommendations for an ideal config?

1000 Sales Orders per second is a lot. You could do some profiling to locate bottlenecks, but I suspect you'll find dozens of different pain points that choke the deployment at this kind of scale. To make this work, you'll almost certainly need to write your own interfaces with direct SQL access.
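To illustrate the kind of direct-SQL interface this means, here is a minimal sketch using SQLite as a stand-in for MariaDB (the table name and columns are invented for illustration, and none of ERPNext's document validations run here): the idea is to buffer incoming webhook payloads and write them in one batched statement instead of one full ORM document insert per request.

```python
import sqlite3

# Stand-in schema; a real deployment would target MariaDB's
# `tabSales Order` table with its actual columns.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales_order (name TEXT PRIMARY KEY, customer TEXT, total REAL)"
)

def insert_orders_batched(conn, orders):
    """Insert many buffered orders in one round-trip instead of one per request."""
    conn.executemany(
        "INSERT INTO sales_order (name, customer, total) VALUES (?, ?, ?)",
        orders,
    )
    conn.commit()

# Example: 1000 buffered webhook payloads written in a single batch.
batch = [(f"SO-{i:05d}", "ACME", 100.0) for i in range(1000)]
insert_orders_batched(conn, batch)
print(conn.execute("SELECT COUNT(*) FROM sales_order").fetchone()[0])  # -> 1000
```

The trade-off is that you bypass Frappe's validation and hooks, so this only suits well-understood, high-volume write paths.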

I concur with @peterg. That's a challenging requirement. We (this forum) could provide various recommendations and ideas, but they would just be guesses. While some suggestions might help, I doubt they'll get you to 1k requests/second.

This would require a lot of investigation, logging, and analysis on the server, and (almost certainly) custom development work.
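One concrete protective measure worth testing on the server side is nginx request rate limiting in front of the webhook endpoint, so bursts get throttled instead of overwhelming gunicorn. A sketch (the zone name, rate, burst size, path, and upstream name are assumptions to tune for your traffic):

```nginx
# Shared-memory zone keyed by client IP: 10 MB of state, 100 requests/second.
limit_req_zone $binary_remote_addr zone=webhooks:10m rate=100r/s;

server {
    location /api/method/ {
        # Admit bursts of up to 200 extra requests immediately;
        # anything beyond that is rejected (503 by default).
        limit_req zone=webhooks burst=200 nodelay;
        proxy_pass http://frappe-bench-frappe;
    }
}
```

This does not increase capacity, but it turns an outage into controlled shedding while you work on the real bottlenecks.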

Did you see this guide?

I had a similar problem: 40 people working at the same time in different ERPNext modules, and 6 clients entering data from the web every 15 minutes. When I followed the steps in that guide, performance improved a lot.

I have 4 vCPUs and 16 GB of RAM, with 4 background workers and 9 gunicorn workers.
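Those numbers line up with the common gunicorn sizing rule of thumb for sync workers, (2 × cores) + 1; a quick sketch:

```python
def suggested_gunicorn_workers(cpu_cores: int) -> int:
    """Common gunicorn rule of thumb for sync workers: (2 x cores) + 1."""
    return 2 * cpu_cores + 1

print(suggested_gunicorn_workers(4))  # -> 9, matching the 9 workers above
print(suggested_gunicorn_workers(8))  # -> 17
```

It's a starting point, not a guarantee: the right count still depends on whether requests are CPU-bound or spend most of their time waiting on the database.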