I’m importing ~28K items (mandatory fields only) into ERPNext, and have divided the data into eight CSV files of at most 4,000 items each. I started importing the first CSV file 30 minutes ago, and the Data Import Tool still says “Performing hardcore import process…” (with a moving progress bar).
Is this reasonable? The server is a Dell OptiPlex 745, dual-core Intel with 4 GB of memory, running CentOS 7. I can open a separate browser session to ERPNext and all seems OK (but no stock items). What is a ballpark estimate for items/second when importing? Does anyone have recommendations for debugging? (If the import is still “stuck” when I get back in the morning, I’ll kill ERPNext and try importing a 5-item test file.)
Maybe 4,000 is a bit large; try importing around 500 at a time.
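If you need to break the files down further, a small script can do the splitting for you. Here is a minimal sketch using only the Python standard library; the chunk size and the `part_NNN.csv` output naming are arbitrary choices, not anything ERPNext requires:

```python
import csv
import os

def split_csv(path, rows_per_file=500, out_dir="."):
    """Split a CSV into files of at most rows_per_file data rows,
    repeating the header row in each output file so every chunk
    is a valid import file on its own."""
    written = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        chunk = []
        part = 0

        def flush(chunk, part):
            out = os.path.join(out_dir, f"part_{part:03d}.csv")
            with open(out, "w", newline="") as g:
                writer = csv.writer(g)
                writer.writerow(header)
                writer.writerows(chunk)
            return out

        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_file:
                part += 1
                written.append(flush(chunk, part))
                chunk = []
        if chunk:  # leftover rows that didn't fill a whole chunk
            part += 1
            written.append(flush(chunk, part))
    return written
```

Splitting your eight 4,000-row files this way would give you chunks small enough that a stalled import costs only a few minutes, not half an hour.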
Thanks, I’ll post results tomorrow.
Is there a recommended procedure to import a large number of items? Is it possible to load data directly into MariaDB?
Such a large import is bound to fail because there’s a default HTTP timeout of two minutes.
Never do that. It will not call the controllers, so it will not validate incorrect data, and any calculations in the business logic will not be executed.
I don’t mind taking responsibility for importing correct data only, but I do understand bypassing business logic and complex model behavior may cause problems.
The best way to import a very large number of items is to use the API: GitHub - frappe/frappe-client: Python library to use Frappe API
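To make that concrete, here is a rough sketch of how such an import could look. The server URL, credentials, and CSV column names below are placeholders; the mandatory Item fields shown (`item_code`, `item_name`, `item_group`, `stock_uom`) are typical but you should check them against your own import template:

```python
import csv

def rows_to_item_docs(csv_path):
    """Read a CSV of items and build Item documents suitable for
    inserting through the Frappe API. Column names are assumptions --
    match them to your own CSV header."""
    docs = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            docs.append({
                "doctype": "Item",
                "item_code": row["item_code"],
                "item_name": row["item_name"],
                "item_group": row["item_group"],
                "stock_uom": row["stock_uom"],
            })
    return docs

if __name__ == "__main__":
    # Hypothetical server and credentials -- replace with your own.
    from frappeclient import FrappeClient
    client = FrappeClient("https://erp.example.com", "user@example.com", "password")
    for doc in rows_to_item_docs("items_part_001.csv"):
        # Unlike writing to MariaDB directly, this goes through the
        # controllers, so validation and calculations still run.
        client.insert(doc)
```

Because each item is a separate API call, a failure partway through tells you exactly which row broke, and you avoid the two-minute HTTP timeout that kills one giant upload.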
Well Dale, that depends on how complicated you make it. I have found that limiting item fields to just a few beyond the required ones helps get large numbers of items imported quickly. By keeping it simple, one can go back later and enter things such as warehouse, item price, etc. For us the obstacle was just getting the items in initially.
So that I understand your context: how many parts were you importing? 100? 1,000?
For large data imports, it’s best to use the FrappeClient.