Blog

Bash Shell Script to backup RDS/EC2 PostgreSQL DB and upload to S3 weekly

#!/bin/bash
# Run as sudo. Takes a weekly backup of the DB and uploads it to an S3 bucket.
DBHOME="/home/priyank/crontabs/dbbackups/"
BUCKETNAME="yourAWSbucket"
SCRIPTNAME="$(basename "$BASH_SOURCE")"
# Resolve our own absolute path so the copy below also works when cron runs us.
SCRIPTFULLPATH="$(cd "$(dirname "$BASH_SOURCE")" && pwd)/$SCRIPTNAME"
mkdir -p "$DBHOME"
chown -R postgres:postgres "$DBHOME"
[ "$SCRIPTFULLPATH" != "$DBHOME$SCRIPTNAME" ] && cp "$SCRIPTFULLPATH" "$DBHOME"
# One file per day of the week (0-6), so backups rotate out after seven days.
SCHEMA_BACKUP="$DBHOME$(date +%w).sql"
sudo -u postgres touch "$SCHEMA_BACKUP"
sudo -u postgres PGPASSWORD="yourPGpassword" pg_dump -h localhost -p 5432 -U postgres -F p -b -v --column-inserts --data-only -f "$SCHEMA_BACKUP" "yourDBname"
CRONPATH="$DBHOME$SCRIPTNAME"
chmod +x "$CRONPATH"
FLAGCHK=0
# Install the cron entry only once; 23:00 daily (use "00 23 * * 0" for a strictly weekly run).
crontab -l | grep -q "$SCRIPTNAME" && FLAGCHK=1 || (crontab -l; echo "00 23 * * * $CRONPATH") | crontab -
# First run only: install and configure s3cmd.
if [ "$FLAGCHK" -eq 0 ]
then
    apt-get install -y s3cmd
    s3cmd --configure
fi
s3cmd put "$SCHEMA_BACKUP" "s3://$BUCKETNAME/dbbackups/"
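The `date +%w` in the backup filename is what gives the rotation: it prints the day of the week as a single digit, so the script keeps at most seven files and silently overwrites each one a week later. A quick demonstration:

```shell
#!/bin/bash
# date +%w prints the day of the week as 0-6 (0 = Sunday), so naming each
# backup "$(date +%w).sql" yields a rolling seven-day window: next Monday's
# run simply overwrites last Monday's file.
d=$(date +%w)
echo "today's backup would be written to: ${d}.sql"
```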

Bash Script to backup RDS/EC2 MySQL DB and upload to S3 weekly

You may come across the task of writing a cron job that backs up a database every day/week/month and uploads it to AWS S3.
Here is a shell script to do that job. Make sure to replace the bucket name and credentials with yours.

#!/bin/bash
# Run as sudo. Takes a weekly backup of the DB and uploads it to an S3 bucket.
DBHOME="/home/ubuntu/priyank/crontabs/dbbackups/"
BUCKETNAME="yourAWSbucket"
SCRIPTNAME="$(basename "$BASH_SOURCE")"
# Resolve our own absolute path so the copy below also works when cron runs us.
SCRIPTFULLPATH="$(cd "$(dirname "$BASH_SOURCE")" && pwd)/$SCRIPTNAME"
mkdir -p "$DBHOME"
chown -R ubuntu:ubuntu "$DBHOME"
[ "$SCRIPTFULLPATH" != "$DBHOME$SCRIPTNAME" ] && cp "$SCRIPTFULLPATH" "$DBHOME"
# One file per day of the week (0-6), so backups rotate out after seven days.
SCHEMA_BACKUP="$DBHOME$(date +%w).gzip"
sudo -u ubuntu touch "$SCHEMA_BACKUP"
sudo -u ubuntu mysqldump -P <yourDBport> -h <yourDBHost> -u <yourDBUser> -p<yourDBpassword> --force --opt --databases <yourDBName> | gzip -c > "$SCHEMA_BACKUP"
CRONPATH="$DBHOME$SCRIPTNAME"
chmod +x "$CRONPATH"
FLAGCHK=0
# Install the cron entry only once; 23:00 daily (use "00 23 * * 0" for a strictly weekly run).
crontab -l | grep -q "$SCRIPTNAME" && FLAGCHK=1 || (crontab -l; echo "00 23 * * * $CRONPATH") | crontab -
# First run only: install and configure s3cmd.
if [ "$FLAGCHK" -eq 0 ]
then
    apt-get install -y s3cmd
    s3cmd --configure
fi
s3cmd put "$SCHEMA_BACKUP" "s3://$BUCKETNAME/dbbackups/"
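Restoring simply reverses the gzip pipe: fetch the file with `s3cmd get` and feed `gunzip -c` into `mysql`. The round trip itself can be demonstrated without a database; the SQL text below is only a stand-in for a real dump:

```shell
#!/bin/bash
# A restore reverses the backup pipeline, roughly:
#   s3cmd get "s3://yourAWSbucket/dbbackups/0.gzip" /tmp/0.gzip
#   gunzip -c /tmp/0.gzip | mysql -h <yourDBHost> -u <yourDBUser> -p<yourDBpassword>
# Round-trip demo with plain text standing in for the dump:
printf 'CREATE TABLE t (id INT);\n' | gzip -c > /tmp/demo.gzip
restored="$(gunzip -c /tmp/demo.gzip)"
echo "$restored"
```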

Aurora Triggers to Call AWS lambda function

Recently I needed to call my Lambda function whenever CRUD operations happened on an Aurora DB table. AWS Aurora supports accessing other AWS services.
So if you want to integrate such an architecture, you can follow this step-by-step guide to make it work.

1) Create an IAM role with RDS and Lambda full access, with "rds.amazonaws.com" as the trusted principal. ( arn:aws:iam::<account_id>:role/RDS-Lambda-Access )
2) Edit the Aurora parameter group and assign the ARN of the role from 1).
3) Edit the Aurora cluster and, under `Manage IAM Roles`, assign the role created in 1).
4) Reboot the Aurora instance.
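The console steps above could also be sketched with the AWS CLI. This is an assumption-laden sketch, not the exact setup: the role name matches the ARN from step 1, while the cluster, parameter-group and instance identifiers (and the choice of managed Lambda policy) are placeholders to replace with your own.

```shell
#!/bin/bash
# Placeholders -- substitute your own identifiers.
ROLE_NAME="RDS-Lambda-Access"
PARAM_GROUP="my-aurora-cluster-params"
CLUSTER_ID="my-aurora-cluster"
INSTANCE_ID="my-aurora-instance-1"

# 1) Role that rds.amazonaws.com can assume, with Lambda access attached.
aws iam create-role --role-name "$ROLE_NAME" \
  --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"rds.amazonaws.com"},"Action":"sts:AssumeRole"}]}'
aws iam attach-role-policy --role-name "$ROLE_NAME" \
  --policy-arn arn:aws:iam::aws:policy/AWSLambda_FullAccess
ROLE_ARN="$(aws iam get-role --role-name "$ROLE_NAME" --query Role.Arn --output text)"

# 2) Point the cluster parameter group's aws_default_lambda_role at the role.
aws rds modify-db-cluster-parameter-group \
  --db-cluster-parameter-group-name "$PARAM_GROUP" \
  --parameters "ParameterName=aws_default_lambda_role,ParameterValue=$ROLE_ARN,ApplyMethod=pending-reboot"

# 3) Associate the role with the cluster (the "Manage IAM Roles" console step).
aws rds add-role-to-db-cluster --db-cluster-identifier "$CLUSTER_ID" --role-arn "$ROLE_ARN"

# 4) Reboot the writer instance so the parameter change takes effect.
aws rds reboot-db-instance --db-instance-identifier "$INSTANCE_ID"
```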

AWS Lambda in a VPC with access to AWS resources as well as the internet, using an Internet Gateway (IGW), NAT and Route Tables

If your Lambda function doesn't need to call third-party services like Firebase or payment gateways, you can configure it to use the default AWS VPC, which doesn't give it internet access.

But if you also need internet access from within the VPC, you need to set up a NAT (network address translation) gateway, an IGW (internet gateway), and route tables with their subnets as attachments.

Note: some of these services are chargeable.

Here is a step-by-step guide to set up Lambda for the second case.

We are going to have three subnets in total: two private subnets and one public subnet.

In Lambda Function > Configuration > VPC > Subnets, only the two private subnets will be selected; the public subnet won't be selected here.

1) Let's say your default VPC has a private IP address range of 192.168.0.0/16, or create a new one.

2) Create Private Subnets:
    Go to AWS VPC > Subnets > Create two private subnets with CIDRs 192.168.20.0/24 and 192.168.30.0/24.
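The wiring described above (private subnets for Lambda, a public subnet for the NAT, routes out through the IGW) could be sketched with the AWS CLI as follows. Everything here is hypothetical: the VPC and resource IDs are placeholders, and the public subnet's CIDR (192.168.10.0/24) is an assumed choice within the 192.168.0.0/16 range.

```shell
#!/bin/bash
VPC_ID="vpc-0abc"   # placeholder: the 192.168.0.0/16 VPC from step 1

# Two private subnets (the ones Lambda attaches to) and one public subnet.
aws ec2 create-subnet --vpc-id "$VPC_ID" --cidr-block 192.168.20.0/24
aws ec2 create-subnet --vpc-id "$VPC_ID" --cidr-block 192.168.30.0/24
aws ec2 create-subnet --vpc-id "$VPC_ID" --cidr-block 192.168.10.0/24  # public (assumed CIDR)

# Internet gateway on the VPC; NAT gateway (needs an Elastic IP) in the public subnet.
aws ec2 create-internet-gateway
aws ec2 attach-internet-gateway --internet-gateway-id igw-0abc --vpc-id "$VPC_ID"
aws ec2 allocate-address --domain vpc
aws ec2 create-nat-gateway --subnet-id subnet-0public --allocation-id eipalloc-0abc

# Route tables: the public subnet's default route goes to the IGW,
# the private subnets' default route goes to the NAT gateway.
aws ec2 create-route --route-table-id rtb-0public \
  --destination-cidr-block 0.0.0.0/0 --gateway-id igw-0abc
aws ec2 create-route --route-table-id rtb-0private \
  --destination-cidr-block 0.0.0.0/0 --nat-gateway-id nat-0abc
```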

Docker based Python Project Deployment with NginX, Supervisor, uwsgi on Ubuntu

For Docker installation, refer to: https://docs.docker.com/installation/ubuntulinux/