mysqldump -u root -p MySQLpassword
minor edit needed
I will have to politely disagree.
The space actually prevents the password from being recognized (at least in my experience). I had to do a ton of research on this very point. You should try it for yourself to see. On Ubuntu 16.04 running ERPNext v10, the mariadb/mysql syntax requires that there be no space between -p and the actual password.
Here is the quote from THIS one resource on the internet…
"In your command, you can’t have a space between -p and the password. Also, mysqldump has to be run on the command line, not in a mysql shell.
Try this on your command line
mysqldump -u username -ppassword databasename > backup.sql "
And here is yet another one calling out the correct syntax… Click Here
"mysqldump --user root --password=myrootpassword db_test > db_test.sql
mysqldump -uroot -pmyrootpassword db_test > db_test.sql"
And still another example here… Click Here
"EDIT #1 : Your mysqldump command should now look like,
mysqldump -u root -proot --routines Data1 > Datafile.mysql "
So, is it possible I am wrong here? Well, possibly. But the scripts I posted are the actual scripts I copied from my running server and posted in this thread. I only edited the actual passwords, usernames, and database file names for security purposes.
Hey, but if doing it with the space works for you, then by all means do it! It just also means that the way I have posted also works.
Everyone else: I suggest you try it both ways if you have the time to test this, or just go with what I know already works.
Thank you for reading carefully and paying such close attention to the details. Even though I may disagree with the assessment, I couldn’t ask for a better compliment than to have my work checked and rechecked.
Thank you immensely!!
Er, um - you’re correct! From the manual itself:
The password to use when connecting to the server. If you use the short option form (-p), you cannot have a space between the option and the password. If you omit the password value following the --password or -p option on the command line, mysql prompts for one.
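To summarize the two correct forms side by side (with obviously fake credentials, since the thread redacted the real ones), here is a small sketch. It only builds and prints the command lines rather than running them, so it never touches a real database:

```shell
#!/bin/sh
# Placeholder credentials and database name, for illustration only.
DB_USER=root
DB_PASS=myrootpassword
DB_NAME=db_test

# Short option form: NO space between -p and the password.
SHORT_FORM="mysqldump -u ${DB_USER} -p${DB_PASS} ${DB_NAME} > backup.sql"

# Long option form: --password=VALUE sidesteps the spacing issue entirely.
LONG_FORM="mysqldump --user ${DB_USER} --password=${DB_PASS} ${DB_NAME} > backup.sql"

# Printed instead of executed so nothing touches a real server.
echo "$SHORT_FORM"
echo "$LONG_FORM"
```

Either form works; the long form is easier to read in a script, while the short form matches what was posted earlier in this thread.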
Thanks. I would be just as okay with going the other way
2 weeks ago, one of my clients had their network hacked. All servers were infected with ransomware, which encrypted their files. Every. Single. File.
They would have been okay… except they didn’t have a good backup strategy. Almost all data was permanently lost.
It’s a crazy world. If you care about your data, implement backups. Test them. Simulate a disaster and then test again. Don’t wait, and don’t assume that nothing bad can happen.
“Hope” is not a good backup strategy.
For those who have a similar plan in place, you will understand this next bit of advice:
Go to your chosen cloud VPS vendor today and buy/lease one of the smallest KVM-type VPS servers you can get (I just did this for $6/mo). Have Ubuntu 18.04 LTS installed on it, and follow that up with a fresh install of ERPNext v10 using the easy install method for a production server. However, do NOT log in to ERPNext as Administrator. Leave everything unconfigured. Then SSH into it and run the following command from ~/frappe-bench:
sudo supervisorctl stop all
Once completed, use the Image Backup or Snapshot service of your provider to make and store a copy of this tiny server. If you are really cheap, you can probably cancel the server in two weeks and still keep the image backup file for future use.
Afterward, you will have a sort of insurance policy: an unused but STABLE v10 edition of ERPNext. In the event you need to set up another client in a hurry, you can simply order another small server, have the service provider RESTORE the image to the new server, and then expand the server up to whatever size you need it to be.
Judging by how rocky the startup of v10 was, this is just a good business thing to do until you have had time to test the production release of v11 for a few months.
Just a thought. Planning ahead is always a good idea whether for security reasons like disaster planning, or implementation reasons like having a known stable version of your software to spin up when you need it.
Remember… v11 will be the official release as of tomorrow, so now is the time to set up any fallback plan you might need.
I have been updating the instructions at the top of this thread today. So, if you have started to use the instructions, please go back and re-read all of Step #7.
The section has been updated to further instruct you on how to configure the incron utility to allow certain users to access it.
*** NOTE: Other updates may also be added over the next 24 hours as I continue to thoroughly test the instructions. I don’t want anyone to ever get stuck when trying to use them.
As I make further updates, I will come back here and add another entry to notify you.
BTW… Here is an additional link to more detailed instructions for the incron utility
→ CLICK HERE ←
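For reference, incron’s per-user access control and its job format look roughly like this. The user name, watched directory, and script path below are placeholders for illustration, not values from the guide:

```
# /etc/incron.allow -- one user name per line may use incrontab.
frappe

# Example incrontab entry (edit with: incrontab -e):
# <watched dir>       <event mask>     <command>
/home/frappe/backups  IN_CLOSE_WRITE   /usr/local/bin/on_new_backup.sh $@/$#
```

In an incron rule, `$@` expands to the watched path and `$#` to the name of the file that triggered the event, so the command above receives the full path of each newly written backup file.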
Updated the instructions again at the top of this thread to cover using the ‘scp’ command from a script in a more reliable way. You can find more details about the issue HERE.
Still refining these instructions, but they are pretty complete now, I think.
Well, I found another few steps that I left out of the process in the first post. Fortunately, the post editing window had not yet expired, so I was able to fix it inline and in the correct order in the steps above.
The offending missing steps were related to making the script files executable and then adding the /bin directory to the path the interpreter uses to locate command scripts.
It’s fixed now. My apologies to anyone who may have been struggling to get those scripts working.
Glad to see so much interest in keeping backups.
Just curious: is there any particular reason for not using the bench backup command (which does basically the same thing in the background, I believe)? This would spare you the manual mysqldump handling. It would create the backup in ~/sites/site1.local/private/backups, but you could tweak the script to use that as the current folder, I guess.
Yup. When I originally tried to put the bench command in the script and run it from a crontab entry, there were occasions where running it in the background would time out, and since it had been run from a script, I would not know that the backup had failed.
If the backup fails in the script, then everything else in the script that depends on the backup to complete would also fail. Ultimately that meant that I had no valid backups to use in the event of an emergency.
Over time, I found that using mysql commands from the command line showed no such behavior and I never missed a backup.
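Whichever backup command a script uses, it can guard against this class of silent failure by checking the exit status of the dump step so that nothing downstream runs against a bad backup. A minimal sketch of the pattern (`false` stands in for the real mysqldump or bench command here, so the example is runnable anywhere):

```shell
#!/bin/sh
# Sketch: gate the rest of a backup script on the dump step's exit
# status, so compress/scp/rotate steps never run on a failed backup.
# 'false' stands in for the real backup command and always "fails".
STATUS=ok
if ! false; then
    STATUS=failed
    echo "backup step failed; skipping the rest" >&2
fi
# In a real script you would 'exit 1' inside the if-block instead of
# just recording the status, so later steps never execute at all.
echo "backup status: $STATUS"
```

Logging the failure to stderr (or mailing it via cron’s normal output handling) also means a missed backup no longer goes unnoticed.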
That led to a few additional discoveries. Running a restore from bench also had issues when the database approaches 1 GB in size; there would be errors reported that didn’t make much sense. So, using the command line to do restores turned out to be a better and faster approach as well.
There just seems to be something about how bench handles the database backups and restores that is prone to problems.
That has been my experience. Your mileage may vary.
thx for clarifying.
How do you go about attachments/files? Is it sufficient to restore the files in the
~/frappe-bench/sites/[mysite]/public/files folders in order to have a restored ERPNext instance to find them?
Funny you should bring that up just now.
Up to this point I was just always taking the files at the end of the week and using scp manually to move any that were added to the other servers.
However, during an offline conversation with another user this very topic came up last week and I began working on another method of running the backups that will also include the /public/files and the /private/files directories.
It is turning out to be a complete revamp of the process using tar and bundling everything together in a single file again so that moving it across the internet remains as simple as the current method published.
Look for that in the next few days. I have been working on it since Saturday and I am only about 25% finished with it. I want to make sure it is fully tested before I lay it out for everyone else to use. It will likely be called “Step by Step guide to the Poor Man’s Backup System ver2”
I want to keep it in the same line of actions so that anyone can do it using 2 different servers and not have to spend a ton of money maintaining it. I am currently using the first published method on all of my client servers and it has been wonderful. However, maintaining all of those servers every Sunday afternoon by manually using ‘scp’ to move all of the files got to be time consuming.
In the new instructions, the files will also be included with every database backup so that you can truly have a complete backup system. It still does not allow for automated restoration of everything, because that has its own set of issues that I have to figure out.
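The bundling step described above can be sketched with tar. The paths below are stand-ins created on the fly so the sketch runs anywhere; on a real server they would be the mysqldump output plus the site’s public/files and private/files directories under ~/frappe-bench/sites/[mysite]/:

```shell
#!/bin/sh
# Sketch: bundle a database dump and the attachment directories into
# one timestamped archive, so a single file can be moved with scp.
set -e                        # abort immediately if any step fails

STAMP=$(date +%Y%m%d-%H%M%S)
ARCHIVE="backup-${STAMP}.tgz"

# Stand-in content so the sketch is runnable anywhere:
mkdir -p demo/public/files demo/private/files
echo "-- dump --"  > demo/database.sql
echo "attachment"  > demo/public/files/a.txt

# tar runs in the foreground and returns only when the archive is
# complete, so a following scp never starts on a half-written file.
tar -czf "$ARCHIVE" -C demo database.sql public/files private/files

tar -tzf "$ARCHIVE"           # list the contents to verify
```

Using `-C demo` keeps the paths inside the archive relative, which makes restoring into a different site directory straightforward.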
Keep watching this space for the update!!
nice. looking forward to this. If you put that script on GitHub I may contribute if I can (e.g. using some VARIABLES here and there, which you haven’t done thus far)
You know, that may be a good idea. I am not a programmer or developer and really do not know how to use the GitHub functions aside from reading the entries and posting bug reports on the repos that I follow.
I will make it a point to figure out how to do that and have my own place for storing all of these helpful things. I tried to pull them all together in a single thread yesterday and the moderators deleted the entire thread last night. So maybe a GitHub location would work.
I have no idea how to get information into or out of github at this point though because I have never had the time to learn how to use it.
Thanks for the wonderful idea. I will work on learning more about that after I get the backup instructions published here.
I am sure once you get started with git and a platform like GitHub, Bitbucket, … you won’t stop, because it makes exactly such efforts much more fun and efficient. Here is one (of many) places to get started
And you are mistaken to think this is only for developers. It’s for (collaboratively) working on content in a structured manner, and is even very handy when you only collaborate with yourself. Can’t stress enough how helpful git can be.
Oh man, now that you have said it I cannot get it out of my head. That really is a great idea.
I am not enough of a script writer to be good with variables, so I will ultimately leave that up to you, but it created such a burn in me to get this all figured out that I can’t let it go.
Sometimes there are things that happen that spark me into a higher level of action. Usually it is the wisdom of people like @clarkej , @brian_pond and several others. Today it was your idea.
Thanks for the supportive ideas.
if you create a repo, I’ll provide the variables … promised.
or let me even take this a little further … send me your GitHub id (in PM if you want). I’ll create a basic repository for this backup tutorial and transfer the ownership to you. Then there are not many excuses left to not start with git. I am sure this will spark you even more than the VARIABLES suggestion.
LOL… Very cool. Thanks
I am working away at the revision for the Poor Man’s Backup v2 and have finally gotten over some of the major hurdles. I have been working on it steadily since this morning. By sometime late tonight I should have it working on one of my test servers to verify everything.
If I am unable to figure out the github thing, then I will certainly take you up on the offer tomorrow. I still want to try myself first.
So, I think I have worked out how to use GitHub. I created a place for the files once I get everything working on my test servers. Getting the new additions to the backup system to work has been a struggle: I keep seeing the script run past commands before they finish, and only partial files get copied, etc. Still working on that part. Using ‘tar’ seems to take longer than using gzip, but gzip doesn’t handle multiple files, so when executing tar commands the script blasts past the command once it is started and executes the next command before the tar function finishes.
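For what it’s worth, in my understanding tar itself runs in the foreground and only returns once the archive is complete; a script normally only "blasts past" a step that was launched in the background with `&`. If a long step does need to run in the background, `wait` will hold the script until it finishes. A minimal sketch of that pattern (`sleep` stands in for a long tar run):

```shell
#!/bin/sh
# A command launched with '&' runs in the background, and the script
# continues immediately; that is what lets a script run past a long
# step. 'wait' blocks until the job finishes, so anything after it
# never sees a half-written archive.
# 'sleep' stands in for a long-running tar command.
sleep 1 &
JOB=$!                     # PID of the background job
wait "$JOB"                # returns only after that job ends
RESULT="archive complete"  # safe to scp/move the archive from here on
echo "$RESULT"
```

If the tar command in the script is not backgrounded and later steps still see partial files, the copier (scp, rsync, etc.) may be picking up the archive while a separate process writes it, which is a different race to chase down.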
I will get it figured out eventually. Once I do it will get published. I will post a thread here, but on GitHub I will have the updateable version of the text doc as well as separate files for the bash scripts. That should make it easier. Oh yeah, the repo is called ERPNext_Guides
More to be added once I get past some of the new script errors.