Excel sheet data import using the Data Import tool via API
Hi,
You can write a Python program to read the Excel file and insert the records into the DocType using the API.
Below is AI-generated code for the Sales Order:
import pandas as pd
import requests


def read_excel_file(file_path):
    # Read the Excel file into a DataFrame
    df = pd.read_excel(file_path)
    return df


def upload_sales_order(data):
    # API endpoint for creating Sales Orders in ERPNext
    url = "https://your-erpnext-instance/api/resource/Sales Order"

    # Headers for the API request.
    # With token-based auth, the standard Frappe form is:
    #   "Authorization": "token <api_key>:<api_secret>"
    headers = {
        "Authorization": "Bearer your_api_key",
        "Content-Type": "application/json"
    }

    # Iterate through each row in the DataFrame and upload it as a Sales Order
    for index, row in data.iterrows():
        payload = {
            "doctype": "Sales Order",
            "customer": row['Customer'],  # Assuming 'Customer' is a column in your Excel file
            # Add more fields as per your Excel file columns
            # Example: "date": row['Date'],
            # "total_amount": row['Total Amount']
        }

        # Make a POST request to the ERPNext API
        response = requests.post(url, json=payload, headers=headers)

        # Check the response status
        if response.status_code == 200:
            print("Sales order uploaded successfully.")
        else:
            print("Error uploading sales order. Status code:", response.status_code)
            print(response.text)


def main():
    # Path to the Excel file containing the sales order data
    excel_file_path = "path/to/your/excel_file.xlsx"

    # Read the Excel file
    sales_order_data = read_excel_file(excel_file_path)

    # Upload the sales orders via the API
    upload_sales_order(sales_order_data)


if __name__ == "__main__":
    main()
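For large workbooks you may want to throttle the uploads rather than send every request back to back. A minimal sketch (the batch size and pause values are arbitrary, and `post` stands in for whatever callable submits one payload, e.g. a small wrapper around `requests.post` from the script above):

```python
import time


def chunked(rows, size):
    """Yield successive batches of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]


def upload_in_batches(rows, post, batch_size=50, pause=1.0):
    """Send each payload via `post`, pausing between batches.

    `post` is any callable that submits a single payload dict.
    """
    for batch in chunked(rows, batch_size):
        for payload in batch:
            post(payload)
        time.sleep(pause)
```

Calling it with `upload_in_batches(payloads, lambda p: requests.post(url, json=p, headers=headers))` keeps the per-request code unchanged while spacing the load out over time.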
Thanks,
Divyesh Mangroliya
Do you think this is the right way? What if there are 1,000 sales orders? The system will crash if it is bombarded with that many API requests in a fraction of a second.
Hi,
I am currently using this approach on a production system to create Sales Invoices, and it creates three hundred invoices in one second smoothly.
If you need the system to handle more load, increase the number of Gunicorn workers according to the load.
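For reference, on a typical bench install the worker count lives in common_site_config.json. Assuming a recent bench CLI and a supervisor-managed setup, the change might look like this (the worker count of 9 is just an illustration):

```shell
# Set the Gunicorn worker count globally in common_site_config.json
# (a common rule of thumb is 2 * CPU cores + 1)
bench set-config -g gunicorn_workers 9

# Regenerate the process config and restart so the change takes effect
bench setup supervisor
bench restart
```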
Thanks,
Divyesh M.