Hi Team,
Trust everyone’s doing great. I believe the Data Import process in ERPNext could handle large datasets better. I understand that restrictions are in place to avoid overloading the server and affecting concurrent users, but it’s still quite limited.
For example, I ran a test uploading opening inventory for 10,000 serialized items. I had to break the data into batches of 500 serials per Stock Entry before they would submit. After 10 Stock Entries (5,000 serials), the system stopped submitting and I had to scale down to 250 serials per Stock Entry. This is quite cumbersome when you’re dealing with large datasets (50,000 records or more), and it mostly happens with opening entries.
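For anyone hitting the same limit today, the manual batching I described can be sketched in a few lines of plain Python (the batch size, serial format, and function name are my own placeholders, not an official ERPNext API):

```python
def chunk_serials(serials, batch_size=250):
    """Split a list of serial numbers into batches small enough
    to submit as separate Stock Entries."""
    return [serials[i:i + batch_size]
            for i in range(0, len(serials), batch_size)]

# Hypothetical example: 10,000 serials split into batches of 250
serials = [f"SN-{n:05d}" for n in range(1, 10001)]
batches = chunk_serials(serials)
print(len(batches), len(batches[0]))  # 40 250
```

Each batch would then go into its own Stock Entry, which is exactly the kind of busywork the importer should ideally absorb for the user.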
I’ve seen suggestions about scripting around such issues, but I don’t think that’s an ideal solution for most users, who know very little programming and just need to get on board the ERPNext train with as little hassle as possible. A front-end approach is usually more efficient and almost always leads to better and cleaner databases.
Also, when the system can’t submit an entry due to a large number of serials, it doesn’t return an error message. Instead, it stays stuck on ‘Submitting’, which leads the user to believe something is still going on in the background. If there’s a timeout, the system should return some kind of error message.
Please help look into these issues.
Thanks plenty