The `/path/to/backup/location` is a mount to a container volume that is accessible.

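For illustration only, such a mount could be declared in a hypothetical `docker-compose.yml` for the database container (the image, service name, and host path are placeholders):

```yaml
# hypothetical compose file; only the volume mapping matters here
services:
  db:
    image: postgres
    volumes:
      # host directory (left) mounted at /path/to/backup/location in the container
      - /srv/backups:/path/to/backup/location
```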
A more professional version that uses S3 is this script:

```bash
set -e

if [ $# -ne 4 ]; then
  echo "Usage: $0 <CONTAINER> <USER> <TARGETDIR> <PASSWORD>"
  exit 1
fi
# assign the script arguments
CONTAINER=$1
USER=$2       # database user
TARGETDIR=$3  # e.g. /path/to/backup/location
PW=$4         # password used to encrypt the zip
export NAME=${CONTAINER}_$(date +'%Y%m%d')
export DBFILE=$TARGETDIR/$NAME.db
export LOGFILE=$TARGETDIR/$NAME.log
export ZIPFILE=$TARGETDIR/$NAME.zip
AWS_OPTS=""
#AWS_OPTS="--endpoint-url https://eu2.contabostorage.com"

#docker ps > $TARGETDIR/containers.txt
docker exec $CONTAINER vacuumdb -U $USER --all >$LOGFILE
docker exec $CONTAINER pg_dump -U $USER -Fc >$DBFILE 2>>$LOGFILE

echo "Zipping into $ZIPFILE..."
zip -e -P $PW $ZIPFILE $LOGFILE $DBFILE
rm $LOGFILE $DBFILE

echo "Upload to S3..."
aws $AWS_OPTS s3 cp $ZIPFILE s3://xlrit-backups/$CONTAINER/$NAME.zip
```
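The `date +'%Y%m%d'` suffix gives each backup a per-day name, so running the script daily keeps one archive per container per day. A minimal sketch of the naming scheme (the container name `dewilde` is only an example):

```shell
# Demonstrate the backup naming scheme: <container>_<yyyymmdd>
CONTAINER=dewilde
NAME=${CONTAINER}_$(date +'%Y%m%d')
echo "$NAME"
```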
Note that S3 requires credentials, which can be stored in the file `~/.aws/credentials`. Here is example content for that file (with fake keys, of course):
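```
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```

(The keys shown are AWS's documented placeholder values; replace them with the real access key pair of the S3 account.)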

The restore can be done with the command `pg_restore`. For instance:

```
pg_restore dewilde_backup_Monday
```

A more professional version of this is the following script that will get the backup from S3 cloud storage based on a date.
But first, stop (just) the application before running this script, for instance with `docker stop <name of application container>`.
```bash
set -e
echo "Example usage: $0 acc test 20240122"
[ "$#" -ne 3 ] && echo "Usage: $0 <SOURCE_ENV> <TARGET_ENV> <DATE>" && exit 1

export MSYS_NO_PATHCONV=1 # in Git Bash, do not convert paths to Windows equivalents
SOURCE_ENV=$1 # e.g. acc
TARGET_ENV=$2 # e.g. tst
DATE=$3       # e.g. 20240122
AWS_OPTS=""
#AWS_OPTS="--endpoint-url https://eu2.contabostorage.com"

mkdir -p /tmp/backup-restore
cd /tmp/backup-restore
aws $AWS_OPTS s3 cp s3://xlrit-backups/wyatt_${SOURCE_ENV}_db/wyatt_${SOURCE_ENV}_db_${DATE}.zip .
unzip wyatt_${SOURCE_ENV}_db_${DATE}.zip # enter the zip password when prompted
docker cp backups/wyatt_${SOURCE_ENV}_db_${DATE}.db wyatt_${TARGET_ENV}_db:/tmp/wyatt.db
docker exec -t wyatt_${TARGET_ENV}_db psql -U wyatt -d wyatt -c "DROP SCHEMA IF EXISTS public CASCADE;"