UpdraftPlus WordPress Backup Plugin - Version 0.9.20

Version Description

  • 0.9.20 - 12/06/2012
  • Updated to latest S3.php library with chunked uploading patch
  • Implemented chunked uploading on Amazon S3 - much bigger sites can now be backed up with S3
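The chunk accounting behind this release is simple arithmetic: archives are cut into 5 MB (5242880-byte) pieces, and the remainder forms one final short chunk. A minimal sketch of the expression used in the new `s3_backup()` code (the helper name `updraft_chunk_count` is ours, not the plugin's):

```php
<?php
// Mirrors the plugin's floor(filesize($fullpath) / 5242880)+1 expression.
// Note the unconditional +1: an exact multiple of 5 MB still yields one
// extra (empty-remainder) chunk, matching the plugin's arithmetic.
function updraft_chunk_count($size) {
	return (int) (floor($size / 5242880) + 1);
}

printf("%d\n", updraft_chunk_count(3000000));   // under 5 MB: 1 chunk
printf("%d\n", updraft_chunk_count(52000000));  // ~50 MB: 10 chunks
```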

Release Info

Developer: DavidAnderson
Version: 0.9.20

Code changes from version 0.9.12 to 0.9.20

Files changed (2):
  1. readme.txt +8 -8
  2. updraftplus.php +83 -12
readme.txt CHANGED

@@ -8,7 +8,7 @@ Donate link: http://david.dw-perspective.org.uk/donate
 License: GPLv3 or later
 
 == Upgrade Notice ==
-Updated to latest S3 library with patch for chunked uploading (for future use)
+Chunked, resumable uploading with Amazon S3 - much bigger blogs can now be backed up
 
 == Description ==
 
@@ -18,10 +18,9 @@ UpdraftPlus simplifies backups (and restoration). Backup into the cloud (S3, Goo
 
 Standard WordPress plugin installation:
 
-1. Upload updraftplus/ into wp-content/plugins/ (or use the built-in installers)
-2. Activate the plugin via the 'Plugins' menu.
-3. Go to the 'UpdraftPlus' option under settings.
-4. Follow the instructions.
+1. Search for "UpdraftPlus" in your site's admin area plugin page
+2. Press 'Install'
+3. Go to the options page and go through the questions there
 
 == Frequently Asked Questions ==
 
@@ -54,7 +53,7 @@ Unless you disable any of these, it will back up your database (all tables which
 It does not back up WordPress core (since you can always get another copy of this from wordpress.org), and does not back up any extra files which you have added outside of the WordPress content directory (files which, by their nature, are unknown to WordPress). By default the WordPress content directory is "wp-content" in your WordPress root. It will not back up database tables which do not have the WordPress prefix (i.e. database tables from other applications but sharing a database with WordPress).
 
 = Any known bugs ? =
-The major one is that backups of very large sites (lots of uploaded media) can fail due to timing out. If your site is very large, then be doubly-sure to test when setting up that your backups are not empty. Since 0.9.0 there is a feature to re-try failed uploads on a separate scheduled run, which means UpdraftPlus should succeed for more sites than before (since we now only need enough time on each run to upload a single file, not all of them).
+Not a bug as such, but one major issue to be aware of is that backups of very large sites (lots of uploaded media) can fail due to timing out. This depends on how many seconds your web host allows a PHP process to run. With such sites, you need to use Amazon S3, which UpdraftPlus supports (since 0.9.20) with chunked, resumable uploads. All other backup methods have code (since 0.9.0) to retry failed uploads of an archive, but the upload cannot be chunked, so if an archive is enormous (i.e. cannot be completely uploaded in the time that PHP is allowed for running on your web host) it cannot work. Google Drive supports chunked, resumable uploads, but that code is not yet in UpdraftPlus (please send me a donation if you want me to hurry up!).
 
 = I encrypted my database - how do I decrypt it? =
 
@@ -70,8 +69,9 @@ Contact me! This is a complex plugin and the only way I can ensure it's robust i
 
 == Changelog ==
 
-= 0.9.12 - 12/06/2012 =
-* Updated to latest S3.php library with chunked uploading patch (for future improvements)
+= 0.9.20 - 12/06/2012 =
+* Updated to latest S3.php library with chunked uploading patch
+* Implemented chunked uploading on Amazon S3 - much bigger sites can now be backed up with S3
 
 = 0.9.10 - 11/22/2012 =
 * Completed basic Google Drive support (thanks to Sorin Iclanzan, code taken from "Backup" plugin under GPLv3+); now supporting uploading, purging and restoring - i.e. full UpdraftPlus functionality
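The "resumable" half of the new FAQ answer comes down to remembering, per chunk, which pieces have already reached S3, so a later scheduled run only retries what is missing. A standalone sketch of that bookkeeping, with a plain array standing in for WordPress transients (`get_transient`/`set_transient`) and a callback standing in for `$s3->uploadPart()`; the helper name `upload_chunks` is ours:

```php
<?php
// Resume bookkeeping: keys follow the plugin's "upd_<md5>_e<n>" pattern,
// kept short because set_transient limits names to 45 characters.
function upload_chunks($file, $chunks, &$store, $upload_part) {
	$hash = md5($file);
	$successes = 0;
	for ($i = 1; $i <= $chunks; $i++) {
		$key = "upd_{$hash}_e$i";
		if (isset($store[$key])) { $successes++; continue; } // done on a previous run
		$etag = $upload_part($i); // stand-in for uploadPart(); false on failure
		if (is_string($etag)) {
			$store[$key] = $etag; // remember, so a resumed run skips this chunk
			$successes++;
		}
	}
	return $successes;
}

// First run: chunk 2 fails; the second run only has to retry chunk 2.
$store = array();
$run1 = upload_chunks('backup.zip', 3, $store, function ($i) { return $i == 2 ? false : "etag$i"; });
$run2 = upload_chunks('backup.zip', 3, $store, function ($i) { return "etag$i"; });
printf("%d then %d\n", $run1, $run2); // 2 then 3
```

Each run only needs enough PHP execution time for the chunks it actually uploads, which is why chunking defeats host-imposed timeouts where whole-archive retries cannot.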
updraftplus.php CHANGED

@@ -4,7 +4,7 @@ Plugin Name: UpdraftPlus - Backup/Restore
 Plugin URI: http://wordpress.org/extend/plugins/updraftplus
 Description: Uploads, themes, plugins, and your DB can be automatically backed up to Amazon S3, Google Drive, FTP, or emailed, on separate schedules.
 Author: David Anderson.
-Version: 0.9.12
+Version: 0.9.20
 Donate link: http://david.dw-perspective.org.uk/donate
 License: GPL3
 Author URI: http://wordshell.net
@@ -62,7 +62,7 @@ define('UPDRAFT_DEFAULT_OTHERS_EXCLUDE','upgrade,cache,updraft,index.php');
 
 class UpdraftPlus {
 
-	var $version = '0.9.12';
+	var $version = '0.9.20';
 
 	var $dbhandle;
 	var $errors = array();
@@ -280,7 +280,7 @@ class UpdraftPlus {
 		$our_files=$backup_history[$btime];
 		$undone_files = array();
 		foreach ($our_files as $key => $file) {
-			$hash=md5($file);
+			$hash = md5($file);
 			$fullpath = trailingslashit(get_option('updraft_dir')).$file;
 			if (get_transient('updraft_'.$hash) === "yes") {
 				$this->log("$file: $key: This file has been successfully uploaded in the last 3 hours");
@@ -421,7 +421,7 @@ class UpdraftPlus {
 			delete_transient("updraftplus_backup_job_nonce");
 			delete_transient("updraftplus_backup_job_time");
 		} else {
-			$this->log("There were errors in the uploads, so the 'resume' event is remaining unscheduled");
+			$this->log("There were errors in the uploads, so the 'resume' event is remaining scheduled");
 		}
 
 		@fclose($this->logfile_handle);
@@ -465,7 +465,7 @@ class UpdraftPlus {
 	function uploaded_file($file, $id = false) {
 		# We take an MD5 hash because set_transient wants a name of 45 characters or less
 		$hash = md5($file);
-		set_transient("updraft_".$hash, "yes", 3600*3);
+		set_transient("updraft_".$hash, "yes", 3600*4);
 		if ($id) {
 			$ids = get_option('updraft_file_ids', array() );
 			$ids[$file] = $id;
@@ -623,27 +623,94 @@ class UpdraftPlus {
 		$this->log("Retain: saving new backup history (sets now: ".count($backup_history).") and finishing retain operation");
 		update_option('updraft_backup_history',$backup_history);
 	}
-
+
 	function s3_backup($backup_array) {
+
 		if(!class_exists('S3')) require_once(dirname(__FILE__).'/includes/S3.php');
 		$s3 = new S3(get_option('updraft_s3_login'), get_option('updraft_s3_pass'));
+
 		$bucket_name = untrailingslashit(get_option('updraft_s3_remote_path'));
 		$bucket_path = "";
 		$orig_bucket_name = $bucket_name;
+
 		if (preg_match("#^([^/]+)/(.*)$#",$bucket_name,$bmatches)) {
 			$bucket_name = $bmatches[1];
 			$bucket_path = $bmatches[2]."/";
 		}
+
+
 		if (@$s3->putBucket($bucket_name, S3::ACL_PRIVATE)) {
 			foreach($backup_array as $file) {
+
+				// We upload in 5Mb chunks to allow more efficient resuming and hence uploading of larger files
 				$fullpath = trailingslashit(get_option('updraft_dir')).$file;
-				$this->log("S3 upload: $fullpath -> s3://$bucket_name/$bucket_path$file");
-				if (!$s3->putObjectFile($fullpath, $bucket_name, $bucket_path.$file)) {
-					$this->log("S3 upload: failed");
-					$this->error("S3 Error: Failed to upload $fullpath. Error was ".$php_errormsg);
+				$chunks = floor(filesize($fullpath) / 5242880)+1;
+				$hash = md5($file);
+
+				$this->log("S3 upload: $fullpath (chunks: $chunks) -> s3://$bucket_name/$bucket_path$file");
+
+				$filepath = $bucket_path.$file;
+
+				// This is extra code for the 1-chunk case, but less overhead (no bothering with transients)
+				if ($chunks < 2) {
+					if (!$s3->putObjectFile($fullpath, $bucket_name, $filepath)) {
+						$this->log("S3 regular upload: failed");
+						$this->error("S3 Error: Failed to upload $fullpath. Error was ".$php_errormsg);
+					} else {
+						$this->log("S3 regular upload: success");
+						$this->uploaded_file($file);
+					}
 				} else {
-					$this->log("S3 upload: success");
-					$this->uploaded_file($file);
+
+					// Retrieve the upload ID
+					$uploadId = get_transient("updraft_${hash}_uid");
+					if (empty($uploadId)) {
+						$uploadId = $s3->initiateMultipartUpload($bucket_name, $filepath);
+						if (empty($uploadId)) {
+							$this->log("S3 upload: failed: could not get uploadId for multipart upload");
+							continue;
+						} else {
+							$this->log("S3 chunked upload: got multipart ID: $uploadId");
+							set_transient("updraft_${hash}_uid", $uploadId, 3600*3);
+						}
+					} else {
+						$this->log("S3 chunked upload: retrieved previously obtained multipart ID: $uploadId");
+					}
+
+					$successes = 0;
+					$etags = array();
+					for ($i = 1 ; $i <= $chunks; $i++) {
+						# Shortened to upd here to avoid hitting the 45-character limit
+						$etag = get_transient("upd_${hash}_e$i");
+						if (strlen($etag) > 0) {
+							$this->log("S3 chunk $i: was already completed (etag: $etag)");
+							$successes++;
+							array_push($etags, $etag);
+						} else {
+							$etag = $s3->uploadPart($bucket_name, $filepath, $uploadId, $fullpath, $i);
+							if (is_string($etag)) {
+								$this->log("S3 chunk $i: uploaded (etag: $etag)");
+								array_push($etags, $etag);
+								set_transient("upd_${hash}_e$i", $etag, 3600*3);
+								$successes++;
+							} else {
+								$this->error("S3 chunk $i: upload failed");
+								$this->log("S3 chunk $i: upload failed");
+							}
+						}
+					}
+					if ($successes >= $chunks) {
+						$this->log("S3 upload: all chunks uploaded; will now instruct S3 to re-assemble");
+						if ($s3->completeMultipartUpload($bucket_name, $filepath, $uploadId, $etags)) {
+							$this->log("S3 upload: re-assembly succeeded");
+							$this->uploaded_file($file);
+						} else {
+							$this->log("S3 upload: re-assembly failed");
+							$this->error("S3 upload: re-assembly failed");
+						}
+					} else {
+						$this->log("S3 upload: upload was not completely successful on this run");
+					}
 				}
 			}
 			$this->prune_retained_backups('s3',$s3,$orig_bucket_name);
@@ -2125,6 +2192,10 @@ ENDHERE;
 			</tr>
 
 			<!-- Amazon S3 -->
+			<tr class="s3" <?php echo $s3_display?>>
+				<td></td>
+				<td><em>Amazon S3 is a great choice, because UpdraftPlus supports chunked uploads - no matter how big your blog is, UpdraftPlus can upload it a little at a time, and not get thwarted by timeouts.</em></td>
+			</tr>
 			<tr class="s3" <?php echo $s3_display?>>
 				<th>S3 access key:</th>
 				<td><input type="text" autocomplete="off" style="width:292px" name="updraft_s3_login" value="<?php echo get_option('updraft_s3_login') ?>" /></td>
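Before any uploading starts, `s3_backup()` splits the user's "remote path" setting into a bucket name and a key prefix with a single regex. A standalone sketch of that split, using the same pattern; the helper names (`split_s3_path`, `untrailingslashit_sketch`, the latter a minimal stand-in for WordPress's `untrailingslashit()`) are ours:

```php
<?php
// Minimal stand-in for WordPress's untrailingslashit(): drop trailing slashes.
function untrailingslashit_sketch($s) {
	return rtrim($s, '/');
}

// The bucket/path split from the top of s3_backup(): "mybucket/backups"
// becomes bucket "mybucket" with key prefix "backups/"; a bare bucket
// name yields an empty prefix.
function split_s3_path($setting) {
	$bucket_name = untrailingslashit_sketch($setting);
	$bucket_path = "";
	if (preg_match("#^([^/]+)/(.*)$#", $bucket_name, $bmatches)) {
		$bucket_name = $bmatches[1];
		$bucket_path = $bmatches[2]."/";
	}
	return array($bucket_name, $bucket_path);
}

list($bucket, $prefix) = split_s3_path('mybucket/backups/site1/');
printf("%s | %s\n", $bucket, $prefix); // mybucket | backups/site1/
```

Each object key is then simply `$bucket_path.$file`, so everything the plugin uploads lands under the configured prefix inside the bucket.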