Performance Issues (large data)

Hello All,

We have been using ERPNext’s Purchase and Selling modules for the last 2 months. Recently we planned to scale usage to multiple cities, so we created new customers and a lot of warehouses.

As a result, submitting a purchase receipt or delivery note now takes a long time and the transaction times out (timeout set to 120 seconds). During these 120 seconds, all other transactions (including browsing) are blocked, so every user has to wait for 120 seconds.

This is causing serious damage to our business. For the last 2 days, almost all ERP-related transactions have been stuck.

Any ideas on how to solve this?

@rmehta @umair @Gitika_parashar

Note: We are already reading the code to figure out what the issue is. No success yet.


Some suggestions

  1. Start logging slow queries
  2. Check your innodb_buffer_pool_size (keep it to 60% of your RAM) - make sure you have enough RAM!
  3. Check your RAM usage
  4. Check Redis cache size - keep it to 10% of the RAM

Try posting the slow queries here and we can figure out how to speed things up.

Edit: Having multiple cores would also be important if you have lots of parallel entries.
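To make points 1 and 2 concrete, here is a sketch of the relevant MySQL/MariaDB settings. The values are examples only; `innodb_buffer_pool_size` should be sized to your own machine (roughly 60% of RAM on a dedicated database server, as suggested above), and the log path is an assumption.

```ini
# Example [mysqld] settings — sketch only, size values to your server
[mysqld]
slow_query_log        = 1
slow_query_log_file   = /var/log/mysql/mysql-slow.log
long_query_time       = 1          # log any query slower than 1 second
innodb_buffer_pool_size = 4G       # ~60% of RAM on a dedicated 8 GB DB server
```

After changing these, restart MySQL/MariaDB and watch the slow query log while submitting a purchase receipt.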


Logs generated on submitting a purchase receipt record. The timeout is set to 2 minutes.

Frappe Log File

There is a method update_gl_entries_after, called inside make_gl_entries in StockController.

In this method, future stock vouchers are fetched, along with the corresponding GL entries for the given items and warehouses. The expected GL entries are then tallied against the already existing GL entries; if there is a mismatch, all GL entries of that voucher are deleted and recreated. This was happening in a loop.

This is why the server was getting stuck for the full 2 minutes (the specified timeout).

Though I am not sure why the GL entries mismatched repeatedly.
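For anyone trying to follow along, the repost flow described above can be sketched roughly like this. This is an illustrative simplification, not the actual Frappe code; all function names and parameters here are hypothetical stand-ins for what StockController does.

```python
# Hypothetical sketch of the update_gl_entries_after repost loop described
# above: for each future stock voucher, the expected GL entries are
# recomputed and compared with the stored ones; on a mismatch, the
# voucher's GL entries are deleted and recreated in full.

def update_gl_entries_after(future_vouchers, get_expected, get_existing,
                            delete_entries, create_entries):
    """Repost GL entries for every voucher whose stored entries disagree
    with the freshly computed ones. Returns the reposted voucher names."""
    reposted = []
    for voucher in future_vouchers:        # this loop is the hot spot
        expected = get_expected(voucher)   # recompute from stock ledger
        existing = get_existing(voucher)   # read what is stored
        if expected != existing:           # mismatch -> delete + recreate
            delete_entries(voucher)
            create_entries(voucher, expected)
            reposted.append(voucher)
    return reposted
```

With many warehouses and many future vouchers, each submit can trigger a large number of delete/recreate cycles, which matches the timeout behaviour seen above.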

Has anyone faced this problem before?

I have the same problem, but in the Purchase module. It takes too long to load. I am worried that as my data grows, performance will degrade further, and I don’t want to lose data either.

@neeraj_yadav have you found a solution yet?

@rmehta, I think it would be better to use a hash value instead of a naming series in tabGL Entry, because for each GL entry the system takes a row-level lock on the tabSeries table.

@sanjay you are right. We have already shifted to hash keys as the autoname for the GL Entry and Stock Entry tables in our setup.
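The trade-off can be sketched as follows: a series-based name needs an atomic read-and-increment on a shared counter row (which the database serializes with a row lock, so concurrent inserts queue up), while a hash name needs no shared state at all. This is a hypothetical illustration, not the Frappe implementation; the lock here stands in for the tabSeries row lock.

```python
import secrets
import threading

# Hypothetical sketch of the two naming strategies discussed above.

_series_lock = threading.Lock()   # stands in for the tabSeries row lock
_series_counter = {"GL-": 0}

def series_name(prefix="GL-"):
    # Every caller must serialize on the same counter row, so concurrent
    # GL Entry inserts wait on each other here.
    with _series_lock:
        _series_counter[prefix] += 1
        return f"{prefix}{_series_counter[prefix]:05d}"

def hash_name(length=10):
    # No shared state: concurrent inserts never contend with each other.
    return secrets.token_hex(length // 2 + 1)[:length]
```

The cost is losing human-readable sequential names, which is why this suits high-volume ledger tables like GL Entry rather than user-facing documents.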

Please read Frappe - Performance - Naming Series

Also read [Performance] 1 Sales Invoice Save makes ~700+ DB calls. ~1000+ DB calls on Submit

For other optimisations done recently:

Are we going to have this in V10 as well?

Naming series changes are not yet available in the develop branch. We have done it in our fork for the time being; we still need to nail down the right approach.

Other optimisations related to get_doc and get_value will be available in version 11. Not sure if they can be merged into v10 as well. @nabinhait can comment.

How much of a gain did you get from hashing GL names?

@rmehta that was done mainly to support concurrent transactions. We haven’t measured the gain yet; will post once done.