  1. #1
    Senior Coder o0O0o.o0O0o
    Join Date: Jan 2008 | Location: C:\Windows\System32 | Posts: 1,018 | Thanks: 19 | Thanked 9 Times in 9 Posts

    Database backup automatically

    hi,

    Is it possible to save the last three backups of a MySQL database on the server automatically?

    I want to have a database backup for each of the last three days so that I can revert if a problem occurs, because usually by the time we know a problem has occurred, we are two or three days late.

    Any ideas?

  • #2
    God Emperor Fou-Lu
    Join Date: Sep 2002 | Location: Saskatoon, Saskatchewan | Posts: 16,978 | Thanks: 4 | Thanked 2,659 Times in 2,628 Posts
    Crontabs and a tracking table / tracking records.
    Consider how an OS chooses to do incremental backups, and build a similar structure for your information: either back up on a per-table-changed basis or on a per-record-changed basis. If you choose the table method, create a new table that tracks whether each table needs archiving, and have your code set that archive bit to 1 whenever anything changes. For records, add an extra attribute that tracks which records require archiving in the same fashion.
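    Something like this could drive the table-level approach (just a rough sketch; the backup_status table, its columns, and the credentials are examples only):

    PHP Code:
    <?php
    // Sketch of table-level change tracking. Assumes a tracking table:
    //   backup_status(table_name VARCHAR, needs_archive TINYINT)
    // with one row per table you want to archive.
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    // Call this after any INSERT/UPDATE/DELETE against $table so the
    // backup script knows the table needs archiving.
    function markDirty(mysqli $db, $table)
    {
        $stmt = $db->prepare(
            'UPDATE backup_status SET needs_archive = 1 WHERE table_name = ?');
        $stmt->bind_param('s', $table);
        $stmt->execute();
    }

    markDirty($db, 'orders');
    ?>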
    Then write a backup script (MySQL has dump commands if you just want to dump everything), and either have a real cron do the work for you, or use a 'poor man's' cron if cron isn't available (that is, a table tracking the last run, checked on each page load to decide whether a backup is due). I don't recommend 'poor man's' methods for archiving.
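    For completeness, a 'poor man's' check might look roughly like this on each page load (only a sketch; the last_backup table and script path are made up for illustration):

    PHP Code:
    <?php
    // 'Poor man's' cron: check a last-run timestamp on every page load
    // and fire the backup if more than a day has passed.
    $db  = new mysqli('localhost', 'user', 'pass', 'mydb');
    $res = $db->query('SELECT UNIX_TIMESTAMP(ran_at) AS ts FROM last_backup LIMIT 1');
    $row = $res->fetch_assoc();

    if (time() - (int)$row['ts'] > 86400) { // more than 24 hours ago
        exec('/usr/bin/php /path/to/backup.php'); // run the dump script
        $db->query('UPDATE last_backup SET ran_at = NOW()');
    }
    ?>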
    No matter which way you do it, you will need a way of determining how and what data requires backing up. For specific days in the past, you will be responsible for tracking creation dates, and that can eat into your resources quite quickly.

    To be quite blunt, it's probably a lot simpler to just have a cron execute a database dump script on all of the data than on specific data. The downside is that if you have a lot of data, this will take a lot of time. It's also a lot easier if you have access to the server itself (physically or remotely) and can log in; if you can, the task is 100x easier since you can do it from the command line.
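    For the full-dump route, a daily crontab entry such as 0 2 * * * /usr/bin/php /path/to/backup.php pointed at a script along these lines would keep exactly the last three days of dumps asked about above (paths, credentials and database name are placeholders):

    PHP Code:
    <?php
    // Daily full dump with three-day rotation. Assumes mysqldump and gzip
    // are available on the server.
    $dir  = '/home/backups';
    $file = $dir . '/mydb-' . date('Y-m-d') . '.sql.gz';

    // Dump everything and compress it.
    exec('mysqldump -u backupuser -psecret mydb | gzip > ' . escapeshellarg($file));

    // Keep only the three newest dumps (Y-m-d names sort chronologically).
    $dumps = glob($dir . '/mydb-*.sql.gz');
    rsort($dumps);
    foreach (array_slice($dumps, 3) as $old) {
        unlink($old);
    }
    ?>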
    Speaking of the command line, it's possible this is an easier task to perform in Perl than in PHP. Maybe check with the Perl guys as well to see if they think it would be easier. This assumes a Unix or Linux based hosting service.
    PHP Code:
    header('HTTP/1.1 420 Enhance Your Calm'); 

