enupal / backup
Fully integrated Backup solution for Craft CMS
Home Page: https://enupal.com/craft-plugins/enupal-backup
License: Other
Like the title says: with the recent changes in Craft 3.1, all plugins storing sensitive information (S3 credentials in our case) should support project config and environment variables.
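Craft resolves settings whose value starts with `$` against environment variables at runtime, so only the variable name (not the secret) lands in project config. A minimal Python sketch of that lookup pattern (function and variable names here are illustrative, not the plugin's API):

```python
import os

def resolve_env(value, default=None):
    """Resolve a Craft-style '$VARNAME' setting against the environment.

    Plain values pass through unchanged; '$NAME' is looked up in the
    environment, falling back to `default` when the variable is unset.
    """
    if isinstance(value, str) and value.startswith("$"):
        return os.environ.get(value[1:], default)
    return value

os.environ["S3_SECRET"] = "s3cr3t"
print(resolve_env("$S3_SECRET"))    # prints: s3cr3t
print(resolve_env("literal-key"))   # prints: literal-key
```

With this pattern, project config can safely store `"$S3_SECRET"` while each environment supplies its own real credential.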
When I run a backup, only the database is backed up. I expected templates, assets, etc. to also be backed up.
Here are my settings:
Database: Yes
Templates: Yes
Assets: Yes
Config Files: Yes
Web Root Directory: No
Running a trial version locally.
Plugin install fails.
Installing normally from the CP, then the installation fails.
Installed Craft with Composer 1.8.0 on a Mac Pro running High Sierra (10.13.4).
After installation (all OK), went directly to Plugin Store / Enupal Backup / Install.
Installation failed.
This is the second error I get.
I've tried doing the same on another Mac running Yosemite.
Same error.
Database Exception: SQLSTATE[42S22]: Column not found: 1054 Unknown column 'settings' in 'field list'
The SQL being executed was: UPDATE plugins SET settings='{"compressWithBz2":1,"pluginNameOverride":null,"backupsAmount":5,"deleteLocalBackupAfterUpload":false,"enableDatabase":true,"excludeData":"assetindexdata, assettransformindex, cache, sessions, templatecaches, templatecachecriteria, templatecacheelements","enableTemplates":false,"excludeTemplates":"","enableConfigFiles":false,"excludeConfigFiles":"cpresources,","enableLogs":false,"excludeLogs":"enupalbackup,","enableWebFolder":false,"excludeWebFolder":"cpresources,","enableLocalVolumes":false,"volumes":null,"enableDropbox":false,"dropboxToken":null,"dropboxPath":"\/enupalbackup\/","enableGoogleDrive":false,"googleDriveClientId":null,"googleDriveClientSecret":null,"googleDriveFolder":"enupalbackup\/","enableAmazon":false,"amazonKey":null,"amazonSecret":null,"amazonBucket":null,"amazonRegion":null,"amazonPath":"\/enupalbackup\/","amazonUseMultiPartUpload":false,"enableFtp":false,"ftpType":"ftp","ftpHost":null,"ftpUser":null,"ftpPassword":null,"ftpPath":"enupalbackup\/","enableSos":false,"sosUser":null,"sosSecret":null,"sosHost":null,"sosContainer":null,"sosPath":"\/enupalbackup\/","enablePathToTar":false,"pathToTar":null,"enablePathToPhp":null,"pathToPhp":null,"enablePathToMysqldump":false,"pathToMysqldump":null,"enablePathToOpenssl":false,"pathToOpenssl":null,"enablePathToPgdump":false,"pathToPgdump":null,"enableWebhook":false,"webhookSecretKey":null,"enableOpenssl":false,"opensslPassword":null,"enableNotification":false,"emailTemplateOverride":null,"notificationSubject":null,"notificationRecipients":null,"notificationSenderName":null,"notificationSenderEmail":null,"notificationReplyToEmail":null,"maxExecutionTime":3600,"primarySiteUrl":"$DEFAULT_SITE_URL"}', dateUpdated='2019-01-22 20:05:45' WHERE handle='enupal-backup'
1. Install Craft with Composer
2. Plugin Store / Enupal Backup / Install
I'm developing on windows, while staging and production is on Linux.
I would like the option to enable/disable paths based on environment variables.
It could be done by saying: if the environment variable is empty, use the system default.
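The "empty means system default" behavior described above can be sketched in a few lines (Python, purely illustrative; the variable name `MYSQLDUMP_PATH` is a made-up example, not one of the plugin's settings):

```python
import os

def tool_path(env_var, default):
    """Return the executable path from the environment when set and
    non-empty; otherwise fall back to the system default."""
    return os.environ.get(env_var) or default

os.environ.pop("MYSQLDUMP_PATH", None)                    # Linux: variable unset
print(tool_path("MYSQLDUMP_PATH", "/usr/bin/mysqldump"))  # /usr/bin/mysqldump

os.environ["MYSQLDUMP_PATH"] = r"C:\tools\mysqldump.exe"  # Windows dev override
print(tool_path("MYSQLDUMP_PATH", "/usr/bin/mysqldump"))  # C:\tools\mysqldump.exe
```

The same settings file then works on Windows dev and on Linux staging/production, with each machine opting in via its own environment.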
The Yii 2 queue package has drivers for lots of different queue services.
https://github.com/yiisoft/yii2-queue
If you've configured Craft to use one of them (SQS for example), you'll get the following error:
ArgumentCountError
Too few arguments to function yii\queue\sqs\Queue::run(), 0 passed in /var/app/current/vendor/enupal/backup/src/services/Backups.php on line 98 and at least 1 expected
1. in /var/app/current/vendor/yiisoft/yii2-queue/src/drivers/sqs/Queue.php at line 88
/**
* Listens queue and runs each job.
*
* @param bool $repeat whether to continue listening when queue is empty.
* @param int $timeout number of seconds to sleep before next iteration.
* @return null|int exit code.
* @internal for worker command only
*/
public function run($repeat, $timeout = 0)
{
    return $this->runWorker(function (callable $canContinue) use ($repeat, $timeout) {
        while ($canContinue()) {
            if (($payload = $this->reserve($timeout)) !== null) {
                $id = $payload['MessageId'];
                $message = $payload['Body'];
                $ttr = (int) $payload['MessageAttributes']['TTR']['StringValue'];
                $attempt = (int) $payload['Attributes']['ApproximateReceiveCount'];
                if ($this->handleMessage($id, $message, $ttr, $attempt)) {
2. in /var/app/current/vendor/enupal/backup/src/services/Backups.php at line 98 – yii\queue\sqs\Queue::run()
            }
        } else {
            // if on Linux, try to call queue/run
            if ($settings->runJobInBackground) {
                $this->runQueueInBackground();
            } else {
                Craft::$app->getQueue()->run();
            }
            $response['message'] = 'running';
        }

        return $response;
    }
3. in /var/app/current/vendor/enupal/backup/src/controllers/BackupsController.php at line 182 – enupal\backup\services\Backups::executeEnupalBackup()
This comes from https://github.com/enupal/backup/blob/master/src/services/Backups.php#L98
The QueueInterface class that declares the run() method is Craft's own interface, which the Yii 2 queue drivers know nothing about.
I guess I'm not seeing why you wouldn't let the user/Craft decide when to process the queue.
Generally speaking, you don't want to manually trigger Craft's queue to run. On most installs, if someone triggers a manual backup from Craft's control panel, it will start getting processed by the web-based queue runner when the page refreshes.
And if someone has set up a daemon or cron job to process Craft's queue, it'll get triggered immediately or whenever the next cron job is scheduled to run.
Seems like this plugin should just be pushing the job to the queue and calling it a day?
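The design being suggested, push the job and let whatever worker the site has configured drain the queue, can be illustrated with a toy in-memory queue (Python; `ToyQueue` is a made-up stand-in, not Craft's or Yii's API):

```python
class ToyQueue:
    """Toy in-memory stand-in for a job queue.

    push() only enqueues and returns immediately; whatever worker the
    site has configured (web runner, daemon, cron) drains the queue
    on its own schedule.
    """
    def __init__(self):
        self.jobs = []

    def push(self, job):
        self.jobs.append(job)   # no synchronous run() here

    def run_worker(self):
        # Driven by the host environment, not by the code that pushed.
        results = []
        while self.jobs:
            results.append(self.jobs.pop(0)())
        return results

queue = ToyQueue()
queue.push(lambda: "backup created")
# The controller action returns here; later, the configured worker
# processes the job:
print(queue.run_worker())   # prints: ['backup created']
```

Because `push()` never assumes a particular driver's `run()` signature, this shape avoids the SQS `ArgumentCountError` above entirely: the driver-specific worker invocation stays with the environment that owns it.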
The database export makes it to Dropbox, but the asset files do not. Our userphotos folder (which is empty) is the only folder whose tar file makes it to Dropbox. The plugin says the backup is complete and does not report any errors; I've checked the logs and can't find anything.
Note: our web/uploaded_assets folder is a symlink: uploaded_assets -> ../../../shared/web/uploaded_assets. I think it's possible the plugin isn't able to follow symlinks?
Feature request
It would be great if you had an auto-prune setting so that backups wouldn't just accumulate in my S3 bucket indefinitely. If I was running my cron twice a day, I could set a number of backups to keep, and older ones would be pruned so that I always have one month of backups.
Also, if I delete a backup from the control panel, it does not get deleted from my bucket. Is this intentional?
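The requested retention behavior amounts to "keep the N most recent, delete the rest". A minimal Python sketch of that policy (illustrative only, not the plugin's behavior):

```python
from datetime import datetime, timedelta

def prune(backups, keep):
    """Split (name, created_at) backup records into (kept, pruned),
    keeping only the `keep` most recent."""
    ordered = sorted(backups, key=lambda b: b[1], reverse=True)
    return ordered[:keep], ordered[keep:]

now = datetime(2019, 6, 1)
# Twice-daily backups for 45 days:
backups = [("backup-%03d" % i, now - timedelta(hours=12 * i)) for i in range(90)]
kept, pruned = prune(backups, keep=60)   # 60 = twice a day for one month
print(len(kept), len(pruned))            # prints: 60 30
```

Run after each upload, and everything in `pruned` would be deleted from both the local folder and the remote bucket, which would also address the control-panel deletion question.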
I'm hosting my site at Hyperlane. The URL structure of a site in dev environment looks like this:
https://dev-sitename-[random-uid].hyperlane.co/
When I try to call the URL for the webhook from an Azure Logic App, it times out.
So my question is: Is this supposed to work in trial mode (I haven't purchased Enupal Backup yet)?
It is working great on my live website but not locally on Mac / MAMP.
I can see the backup files have been generated (in the backup folder) and they are listed in Craft's Enupal backups... but with no size, and when I click on them there is nothing.
Here is the error:
{"status":1,"timestamp":1556878177,"duration":0.27,"backupCount":11,"backupFailed":1,"errorCount":1,"errors":[{"class":"RuntimeException","message":"'mysqldump' was nowhere to be found please specify the correct path","file":"phar:///Users/migswd/Workspace/newmoowon/www/vendor/enupal/backup/src/resources/phpbu.phar/Util/Cli.php","line":110}],"backups":[{"name":"Database","status":1,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Asset2","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Asset3","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Asset4","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Asset5","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Asset6","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Config1","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Config2","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"
Templates","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Web Folder","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}},{"name":"Logs","status":0,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":0,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}}],"debug":["exception: 'mysqldump' was nowhere to be found please specify the correct path","/usr/bin/tar --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/assets/assets-general-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Applications/MAMP/htdocs/newmoowon/assets' 'general'","/usr/bin/tar --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/assets/assets-storiesImages-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Applications/MAMP/htdocs/newmoowon/assets' 'stories'","/usr/bin/tar --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/assets/assets-authorsImages-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Applications/MAMP/htdocs/newmoowon/assets' 'authors'","/usr/bin/tar --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/assets/assets-campaignsImages-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Applications/MAMP/htdocs/newmoowon/assets' 'campaigns'","/usr/bin/tar --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/assets/assets-usersImages-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Applications/MAMP/htdocs/newmoowon/assets' 'users'","/usr/bin/tar --ignore-failed-read --force-local -cf 
'/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/config/config-configFolder-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Users/migswd/Workspace/newmoowon/www' 'config'","/usr/bin/tar --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/config/config-composerFile-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Users/migswd/Workspace/newmoowon/www/storage/runtime/temp' 'enupal-backup-composer'","/usr/bin/tar --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/templates/templates-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Users/migswd/Workspace/newmoowon/www' 'templates'","/usr/bin/tar --exclude='cpresources' --exclude='' --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/web/web-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Applications/MAMP/htdocs' 'newmoowon'","/usr/bin/tar --exclude='enupalbackup' --exclude='' --ignore-failed-read --force-local -cf '/Users/migswd/Workspace/newmoowon/www/storage/enupalbackup/logs/logs-new_moowon_20190503120937_ojyz2zm15w.tar' -C '/Users/migswd/Workspace/newmoowon/www/storage' 'logs'"]}
jQuery is not defined (console errors)
When visiting the Enupal Backup (EB) settings for scheduling, the example cron job provided uses a server path instead of a webhook URL. When testing the curl command, it errors out, but changing the curl path to a public URL allows the backup job to queue properly.
When I visit:
https://website.ca/admin/enupal-backup/settings/schedule
The cron job example provided looks like this:
# Enupal Backup Webhook
10 3 * * * curl --request GET '/svr/www/website.ca/web/enupal-backup/schedule?key=sldhj4ksuidh'
11 3 * * * cd /svr/www/website.ca && php craft queue/run
However running that curl command errors out: curl: (3) <url> malformed
To solve this I have to change the URL path in the cronjob to a publicly accessible URL like this:
# Enupal Backup Webhook
10 3 * * * curl --request GET 'https://website.ca/enupal-backup/schedule?key=sldhj4ksuidh'
11 3 * * * cd /svr/www/website.ca && php craft queue/run
Not sure if this is a problem with my EB config, or if EB is showing a filesystem path where it should be showing the site URL in the cron job example.
I have a cron job set up to back up our database to an Amazon S3 bucket via a webhook.
As of about a week ago, our server has been stalling out as soon as the scheduled backup starts. The server resumes functioning as normal within a few minutes.
Our site currently has 24 sites set up for translation. DB backups are just over 40MB zipped.
Any idea what could be causing our issues?
I reached out on Discord and someone mentioned this:
As far as I can see there are no console controllers for the plugin which is a little bit strange for such tasks. I bet the request times out.
Is there a guide for setting up S3 bucket access and bucket policy?
Amazon makes it difficult to understand, and I'm not sure what needs to be set up to work with Enupal Backup.
Craft 4.4.16.1
Enupal Backups 2.1.0
I've got Enupal working so that it backs up everything. However, when I check the S3 bucket, the assets are not there, though I can see the database, logs, and config files in S3.
Also, when I ran the backup I did not receive an email, despite having added my email address and enabled the notifications option. The website does successfully send emails via the Freeform plugin.
Please advise.
The Amazon S3 path is very sensitive to whether or not you include leading and/or trailing forward slashes.
If you enter this path: /production/, the backups will end up in the following path: "blank folder"/production, the "blank folder" possibly being created because I assume the full path ends up looking something like this: https://amazon.something/bucketName//production/
If you enter this path: test/production, the backups will end up inside the test folder with production as part of the file name(s). Again, as above, I assume this has to do with how the path is composed.
The specification of paths in settings has the same issue. Some paths require trailing slashes, some don't. Some require the full path to the executable, some require just the path to the directory containing it.
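Both symptoms above follow from how S3 treats object keys: a leading slash produces an empty-named "folder" segment, and a missing trailing slash makes the last segment a file-name prefix rather than a folder. One way the plugin could normalize user input, sketched in Python (illustrative, not the plugin's code):

```python
def normalize_s3_prefix(path):
    """Normalize a user-entered S3 path to 'segment/segment/' form:
    strip leading/trailing slashes (a leading '/' creates an
    empty-named folder in the bucket key) and append exactly one
    trailing slash so the last segment acts as a folder."""
    path = path.strip().strip("/")
    return path + "/" if path else ""

print(normalize_s3_prefix("/production/"))     # prints: production/
print(normalize_s3_prefix("test/production"))  # prints: test/production/
```

With input normalized once at save time, `/production/`, `production`, and `production/` would all land backups in the same bucket folder.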
I set up a new client site, and when I run the backup it fails. I've attached the error message from the CP.
Error log:
02 - Could not create Enupal Backup: The shell command "cd /home/site/site.com/vendor/enupal/backup/src/resources && php phpbu.phar --configuration=/home/site/site.com/storage/enupalbackup/config.json --debug" failed with exit code 2: phpbu 6.0.20 by Sebastian Feldmann and contributors. Runtime: PHP 8.2.12 Configuration: /home/site/site.com/storage/enupalbackup/config.json backup: [mysqldump] ******************************************************* ("/usr/bin/mysqldump" --user='torquest' --password='******' --host='localhost' --port='3306' 'torquest' --no-data && "/usr/bin/mysqldump" --user='torquest' --password='******' --host='localhost' --port='3306' 'torquest' --ignore-table='torquest.assetindexdata' --ignore-table='torquest.assettransformindex' --ignore-table='torquest.cache' --ignore-table='torquest.sessions' --ignore-table='torquest.templatecaches' --ignore-table='torquest.templatecachecriteria' --ignore-table='torquest.templatecacheelements' --skip-add-drop-table --no-create-db --no-create-info) | "/usr/bin/bzip2" > /home/site/site.com/storage/enupalbackup/databases/database-english_20231121110656_n4yr2g2v8v.sql.bz2 ok sync: [amazons3] ********************************************************** Deprecated: strtolower(): Passing null to parameter #1 ($string) of type string is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/Util/Str.php on line 31 Deprecated: strtolower(): Passing null to parameter #1 ($string) of type string is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/Util/Str.php on line 33 Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/S3/RegionalEndpoint/ConfigurationProvider.php on line 83 Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/DefaultsMode/ConfigurationProvider.php on line 85 
Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/Endpoint/UseFipsEndpoint/ConfigurationProvider.php on line 82 Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/Endpoint/UseDualstackEndpoint/ConfigurationProvider.php on line 83 Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/EndpointDiscovery/ConfigurationProvider.php on line 86 Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/Retry/ConfigurationProvider.php on line 88 Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/ClientSideMonitoring/ConfigurationProvider.php on line 90 Deprecated: Use of "self" in callables is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/S3/UseArnRegion/ConfigurationProvider.php on line 83 Deprecated: Creation of dynamic property GuzzleHttp\Handler\CurlMultiHandler::$_mh is deprecated in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/guzzlehttp/guzzle/Handler/CurlMultiHandler.php on line 103 create s3 bucket exception: Error executing "CreateBucket" on "https://torquest-backup.s3.ca-central-1.amazonaws.com/"; AWS HTTP error: Client error: `PUT https://torquest-backup.s3.ca-central-1.amazonaws.com/` resulted in a `403 Forbidden` response: <?xml version="1.0" encoding="UTF-8"?> <Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>RCZ64S (truncated...) 
AccessDenied (client): Access Denied - <?xml version="1.0" encoding="UTF-8"?> <Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>RCZ64SDWRRMBDXXN</RequestId><HostId>FbTSaoewzUrJfsjTm9z6lYY2RLvWlGozku9Qt8de0s4l6oQPFXiGDJoW24LYVSZSlxaSWUyVW2Y=</HostId></Error> failed Warning: file_get_contents(http://torquest.com/enupal-backup/finished?backupId=english_20231121110656_n4yr2g2v8v&status=1×tamp=1700582816&duration=7.2131&err-cnt=1&bak-cnt=1&bak-fail=1): Failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/Log/Webhook.php on line 252 Time: 7 seconds, Memory: 18.43MB Exception 'Aws\S3\Exception\S3Exception' with message 'Error executing "CreateBucket" on "https://torquest-backup.s3.ca-central-1.amazonaws.com/"; AWS HTTP error: Client error: `PUT https://torquest-backup.s3.ca-central-1.amazonaws.com/` resulted in a `403 Forbidden` response: <?xml version="1.0" encoding="UTF-8"?> <Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>RCZ64S (truncated...) AccessDenied (client): Access Denied - <?xml version="1.0" encoding="UTF-8"?> <Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>RCZ64SDWRRMBDXXN</RequestId><HostId>FbTSaoewzUrJfsjTm9z6lYY2RLvWlGozku9Qt8de0s4l6oQPFXiGDJoW24LYVSZSlxaSWUyVW2Y=</HostId></Error>' in phar:///home/site/site.com/vendor/enupal/backup/src/resources/phpbu.phar/lib/aws-sdk/WrappedHttpHandler.php:195 backup Database: FAILED | executed | skipped | failed | ----------+----------+---------+--------+ checks | 0 | | 0 | crypts | 0 | 0 | 0 | syncs | 1 | 0 | 0 | cleanups | 0 | 0 | 0 | ----------+----------+---------+--------+ FAILURE! Backups: 1, failed Checks: 0, failed Crypts: 0, failed Syncs: 0, failed Cleanups: 0. 
--Trace: #0 /home/site/site.com/vendor/enupal/backup/src/services/Backups.php(279): craft\errors\ShellCommandException::createFromCommand() #1 /home/site/site.com/vendor/enupal/backup/src/jobs/CreateBackup.php(51): enupal\backup\services\Backups->enupalBackup() #2 /home/site/site.com/vendor/yiisoft/yii2-queue/src/Queue.php(243): enupal\backup\jobs\CreateBackup->execute() #3 /home/site/site.com/vendor/yiisoft/yii2-queue/src/cli/Queue.php(147): yii\queue\Queue->handleMessage() #4 /home/site/site.com/vendor/craftcms/cms/src/queue/Queue.php(190): yii\queue\cli\Queue->handleMessage() #5 /home/site/site.com/vendor/craftcms/cms/src/queue/Queue.php(165): craft\queue\Queue->executeJob() #6 [internal function]: craft\queue\Queue->craft\queue\{closure}() #7 /home/site/site.com/vendor/yiisoft/yii2-queue/src/cli/Queue.php(114): call_user_func() #8 /home/site/site.com/vendor/craftcms/cms/src/queue/Queue.php(163): yii\queue\cli\Queue->runWorker() #9 /home/site/site.com/vendor/craftcms/cms/src/controllers/QueueController.php(82): craft\queue\Queue->run() #10 [internal function]: craft\controllers\QueueController->actionRun() #11 /home/site/site.com/vendor/yiisoft/yii2/base/InlineAction.php(57): call_user_func_array() #12 /home/site/site.com/vendor/yiisoft/yii2/base/Controller.php(178): yii\base\InlineAction->runWithParams() #13 /home/site/site.com/vendor/yiisoft/yii2/base/Module.php(552): yii\base\Controller->runAction() #14 /home/site/site.com/vendor/craftcms/cms/src/web/Application.php(305): yii\base\Module->runAction() #15 /home/site/site.com/vendor/craftcms/cms/src/web/Application.php(606): craft\web\Application->runAction() #16 /home/site/site.com/vendor/craftcms/cms/src/web/Application.php(284): craft\web\Application->_processActionRequest() #17 /home/site/site.com/vendor/yiisoft/yii2/base/Application.php(384): craft\web\Application->handleRequest() #18 /home/site/site.com/public/index.php(21): yii\base\Application->run() #19 {main}
Log file:
{"status":1,"timestamp":1700582816,"duration":7.2142,"backupCount":1,"backupFailed":1,"errorCount":1,"errors":[{"class":"Aws\\S3\\Exception\\S3Exception","message":"Error executing \"CreateBucket\" on \"https:\/\/torquest-backup.s3.ca-central-1.amazonaws.com\/\"; AWS HTTP error: Client error: `PUT https:\/\/torquest-backup.s3.ca-central-1.amazonaws.com\/` resulted in a `403 Forbidden` response:\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AccessDenied<\/Code><Message>Access Denied<\/Message><RequestId>RCZ64S (truncated...)\n AccessDenied (client): Access Denied - <?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AccessDenied<\/Code><Message>Access Denied<\/Message><RequestId>RCZ64SDWRRMBDXXN<\/RequestId><HostId>FbTSaoewzUrJfsjTm9z6lYY2RLvWlGozku9Qt8de0s4l6oQPFXiGDJoW24LYVSZSlxaSWUyVW2Y=<\/HostId><\/Error>","file":"phar:\/\/\/home\/torquest\/torquest.com\/vendor\/enupal\/backup\/src\/resources\/phpbu.phar\/lib\/aws-sdk\/WrappedHttpHandler.php","line":195}],"backups":[{"name":"Database","status":1,"checks":{"executed":0,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":1,"skipped":0,"failed":0},"cleanup":{"executed":0,"skipped":0,"failed":0}}],"debug":["(\"\/usr\/bin\/mysqldump\" --user='torquest' --password='******' --host='localhost' --port='3306' 'torquest' --no-data && \"\/usr\/bin\/mysqldump\" --user='torquest' --password='******' --host='localhost' --port='3306' 'torquest' --ignore-table='torquest.assetindexdata' --ignore-table='torquest.assettransformindex' --ignore-table='torquest.cache' --ignore-table='torquest.sessions' --ignore-table='torquest.templatecaches' --ignore-table='torquest.templatecachecriteria' --ignore-table='torquest.templatecacheelements' --skip-add-drop-table --no-create-db --no-create-info) | \"\/usr\/bin\/bzip2\" > \/home\/torquest\/torquest.com\/storage\/enupalbackup\/databases\/database-english_20231121110656_n4yr2g2v8v.sql.bz2","create s3 bucket","exception: Error executing 
\"CreateBucket\" on \"https:\/\/torquest-backup.s3.ca-central-1.amazonaws.com\/\"; AWS HTTP error: Client error: `PUT https:\/\/torquest-backup.s3.ca-central-1.amazonaws.com\/` resulted in a `403 Forbidden` response:\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AccessDenied<\/Code><Message>Access Denied<\/Message><RequestId>RCZ64S (truncated...)\n AccessDenied (client): Access Denied - <?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AccessDenied<\/Code><Message>Access Denied<\/Message><RequestId>RCZ64SDWRRMBDXXN<\/RequestId><HostId>FbTSaoewzUrJfsjTm9z6lYY2RLvWlGozku9Qt8de0s4l6oQPFXiGDJoW24LYVSZSlxaSWUyVW2Y=<\/HostId><\/Error>"]}
Waiting info from: craftcms/cms#3055
Settings are done locally and update the project YAML files. I push those changes to production and apply them. All good.
However, I can still update the settings on production; this should be disabled, as it is in Craft and other plugins.
See the screenshot from the production site. If I update anything here, it will update the project YAML files and mess up the git state of the server.
Any settings that update the YAML files should be disabled in production.
It would be great to have a setting where the plugin will send an email if the backup fails. Getting an email if the backup is successful is less useful, since you start to ignore it after a while.
Could you explain some pieces of the example Cronjob given in the plugin?
10 3 * * * wget https://www.mysite.com/enupal-backup/schedule?key=12345 -O /dev/null
Specifically, what does -O do, and what is /dev/null for?
Something has changed since last version. Now receiving this error:
02 - Could not create Enupal Backup: The shell command "cd /chroot/home/ae765bea/brownbros.co.nz/vendor/enupal/backup/src/resources && /opt/remi/php73/root/usr/bin/php phpbu.phar --configuration=/chroot/home/ae765bea/brownbros.co.nz/storage/enupalbackup/config.json --debug" failed with exit code 255: phpbu 6.0.20 by Sebastian Feldmann and contributors. Runtime: PHP 7.3.33 Configuration: /chroot/home/ae765bea/brownbros.co.nz/storage/enupalbackup/config.json backup: [mysqldump] ******************************************************* --Trace: #0 /chroot/home/ae765bea/brownbros.co.nz/vendor/enupal/backup/src/services/Backups.php(279): craft\errors\ShellCommandException::createFromCommand(Object(mikehaertl\shellcommand\Command)) #1 /chroot/home/ae765bea/brownbros.co.nz/vendor/enupal/backup/src/jobs/CreateBackup.php(51): enupal\backup\services\Backups->enupalBackup(Object(enupal\backup\elements\Backup)) #2 /chroot/home/ae765bea/brownbros.co.nz/vendor/yiisoft/yii2-queue/src/Queue.php(243): enupal\backup\jobs\CreateBackup->execute(Object(craft\queue\Queue)) #3 /chroot/home/ae765bea/brownbros.co.nz/vendor/yiisoft/yii2-queue/src/cli/Queue.php(147): yii\queue\Queue->handleMessage('99956', 'O:31:"enupal\ba...', '300', 1) #4 /chroot/home/ae765bea/brownbros.co.nz/vendor/craftcms/cms/src/queue/Queue.php(131): yii\queue\cli\Queue->handleMessage('99956', 'O:31:"enupal\ba...', '300', 1) #5 [internal function]: craft\queue\Queue->craft\queue{closure}(Object(Closure)) #6 /chroot/home/ae765bea/brownbros.co.nz/vendor/yiisoft/yii2-queue/src/cli/Queue.php(117): call_user_func(Object(Closure), Object(Closure)) #7 /chroot/home/ae765bea/brownbros.co.nz/vendor/craftcms/cms/src/queue/Queue.php(140): yii\queue\cli\Queue->runWorker(Object(Closure)) #8 /chroot/home/ae765bea/brownbros.co.nz/vendor/craftcms/cms/src/controllers/QueueController.php(84): craft\queue\Queue->run() #9 [internal function]: craft\controllers\QueueController->actionRun() #10 
/chroot/home/ae765bea/brownbros.co.nz/vendor/yiisoft/yii2/base/InlineAction.php(57): call_user_func_array(Array, Array) #11 /chroot/home/ae765bea/brownbros.co.nz/vendor/yiisoft/yii2/base/Controller.php(178): yii\base\InlineAction->runWithParams(Array) #12 /chroot/home/ae765bea/brownbros.co.nz/vendor/yiisoft/yii2/base/Module.php(552): yii\base\Controller->runAction('run', Array) #13 /chroot/home/ae765bea/brownbros.co.nz/vendor/craftcms/cms/src/web/Application.php(295): yii\base\Module->runAction('queue/run', Array) #14 /chroot/home/ae765bea/brownbros.co.nz/vendor/craftcms/cms/src/web/Application.php(608): craft\web\Application->runAction('queue/run', Array) #15 /chroot/home/ae765bea/brownbros.co.nz/vendor/craftcms/cms/src/web/Application.php(274): craft\web\Application->_processActionRequest(Object(craft\web\Request)) #16 /chroot/home/ae765bea/brownbros.co.nz/vendor/yiisoft/yii2/base/Application.php(384): craft\web\Application->handleRequest(Object(craft\web\Request)) #17 /chroot/home/ae765bea/brownbros.co.nz/html/index.php(21): yii\base\Application->run() #18 {main}
Hi.
I am trying to set up the cron job to back up my website, but it does not work.
I tested the link and it is working (I activated the cron job config).
https://myapp.com/enupal-backup/schedule?key=OLmCV46wS9
I have this line in my anacrontab:
10 3 * * * wget https://myapp.com/enupal-backup/schedule?key=OLmCV46wS9 -O /dev/null
Where should I look to understand what is wrong?
Is there any log in the plugin? (I checked but did not see one.)
PHP version | 7.3.0
Linux 4.9.124-paas-2270098
MySQL 5.7.23
Craft Pro 3.1.22
Enupal Backup 1.2.10
This is somewhat two issues, but they are closely related. The first is that in environments where allowAdminChanges is set to false, the plugin still allows you to adjust settings that write to the project config files. The plugin also doesn't seem to respect these config files when they are pushed to the environment.
This also means there is no workaround on ephemeral filesystems where files cannot be updated. Configuration for the plugin can be set up locally, but it does not get applied to the production server and cannot be updated without changing the database, which is not really an option for production.
Hi,
I have a question about the documentation.
If I use Amazon S3 to store my backups, does it remove old backups in the bucket based on the amount setting, or only locally?
Thanks!
Add the option to back up all files under the webroot.
The backup doesn't work; I'm using the latest version of Craft (3.1.32.1). The backup is stuck on status 'running' and never completes. I've emailed support for help, but they haven't resolved the issue. I can't recommend this plugin.
I can no longer authenticate with Google Drive
I had been backing up successfully for weeks, and then it stopped working, when I went to check the credentials, I got the following error when I click on the Google Drive option:
{
"error": {
"code": 401,
"message": "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"errors": [
{
"message": "Invalid Credentials",
"domain": "global",
"reason": "authError",
"location": "Authorization",
"locationType": "header"
}
],
"status": "UNAUTHENTICATED"
}
}
There was no way to remove the credentials, so I had to uninstall the plugin and start over.
When I added the original credentials back, the error re-appeared.
I uninstalled again, reinstalled, created NEW credentials, and got the same error. The credentials are enabled and working in my Google project, and the Google Drive API is enabled.
The redirect URI is correct, as is the JavaScript origin.
After installing and activating, I went to create my first backup. I clicked Backup and the loading spinner started spinning in the browser tab. Nothing happened for about 5 minutes, so I closed the browser, reopened it, and was able to navigate around the admin panel again.
I'm sure it's a server issue, but what is happening is that the backup runs in two seconds and I get a couple of files with compressed sub-files inside. Remote destinations also don't work.
I use your plugin on several sites, but on one of them backups are not uploaded to the cloud.
On all sites I use the same credentials to connect to AWS S3.
My log: https://1drv.ms/u/s!AgmGxdO2gZN2gaNJ4sC_leXje_bIUg
I wanted to check on Craft 4 support for this plugin.
Automated backup to S3 bucket via cron job never finishes, always has status: Running.
Error video: https://drive.google.com/file/d/1bWTL8xDd_MmQOTgYeRTUW2FMuxuJ7Loy/view?usp=sharing
Error screenshot:
After installing, I hit the New Backup button and nothing updates in the main management menu. The bottom-left status indicator "Creating backup" just spins indefinitely.
I checked the Craft error logs but did not see anything that looked relevant. My local environment exhibited the same behavior.
Let's add the ability to create Backup File plans, set one as the default, and override the default plan via webhook by passing a Backup File plan handle.
I have a government client that wants us to run backups to their server. That requires specific FTP settings.
Standard FTP doesn't work; it requires FTP with TLS/SSL through port 21.
Is this a possible feature to add?
Add the web folder as an option to back up.
It would be super helpful for this to be an environment variable so we don't need to change the URL when we move from pre-production to a live site.