Blog

Django Admin File Field (Widget) for AWS Identity-Based S3 Uploads

When you work with AWS serverless, you have probably hit the request body size limit of your Lambda function: it simply will not accept file uploads beyond the specified limit.
Here is a guide for when your website runs Django deployed on AWS Lambda + API Gateway (Zappa) and you want to allow file uploads of any size in the Django Admin, using AWS S3 identity-based uploads.

Write one API endpoint that returns JSON in the following shape:
     `/api/awsIdentity/` // any endpoint name you like; make sure it is authenticated and GET-only
     {

          "IdentityId": "",
          "Token": "",
          "bucket_name": "",
          "bucket_region": "",
          "auth_role_arn": ""
     }

Your widget for the file field and the admin form will look like this:

    from django import forms
    from django.contrib import admin

Copy files from one AWS S3 bucket to another with public permission

Here is what you may be looking for:
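With the AWS CLI this is essentially `aws s3 sync s3://source-bucket s3://dest-bucket --acl public-read`. If you want it in Python, a sketch with boto3 could look like the following (bucket names are placeholders, and the `s3` client is passed in so the logic stays testable; in real code it would be `boto3.client("s3")`):

```python
def copy_bucket_public(s3, src_bucket, dst_bucket, prefix=""):
    # `s3` is a boto3 S3 client. Paginate so buckets with more than
    # 1000 keys are fully copied.
    copied = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=src_bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            s3.copy_object(
                Bucket=dst_bucket,
                Key=obj["Key"],
                CopySource={"Bucket": src_bucket, "Key": obj["Key"]},
                ACL="public-read",  # make each copy publicly readable
            )
            copied.append(obj["Key"])
    return copied
```

Server-side copy like this never downloads the objects to your machine, which is why it is much faster than a download-and-reupload loop.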

AWS Cognito setup to work with AWS S3 identity-based uploads

1. In the AWS S3 console:
    Set the CORS configuration below on your bucket.
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>DELETE</AllowedMethod>
        <AllowedMethod>HEAD</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
2. In the Cognito console, set up a User Pool:
Manage User Pools > Custom settings > Name the pool "TestPool" > "sign themselves up?": only administrator > No verification (neither email nor phone) > App clients > Add an app client > "TestPoolApp" > check "generate client secret" > Note down the pool id and ARN, and the app client id and app client secret.
3. In the Cognito console, set up Federated Identities:
Click "Federated Identities" > Name the identity pool > Under "Authentication providers", on the "Cognito" tab, enter the details from the previous step > On the "Custom" tab, set a developer provider name > Create > Edit your `Auth_Role` > Set the following policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "mobileanalytics:PutEvents",
                "cognito-sync:*",
                "cognito-identity:*"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::<yourbucketname>/${cognito-identity.amazonaws.com:sub}/*"
            ]
        }
    ]
}
Note: with this policy, a user can upload files only under the directory key named after their identity id.
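Because of that Resource ARN, the client has to prefix every object key with the caller's identity id; a tiny helper (names here are illustrative) makes the rule explicit:

```python
def object_key(identity_id, filename):
    # The policy's Resource is
    # arn:aws:s3:::<bucket>/${cognito-identity.amazonaws.com:sub}/*
    # and the `sub` policy variable resolves to the caller's identity id,
    # so any key outside this prefix is denied.
    return "{}/{}".format(identity_id, filename)
```

For example, `object_key("us-east-1:1234-abcd", "report.pdf")` yields `us-east-1:1234-abcd/report.pdf`, which falls inside the allowed prefix.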

4. Get the ARN of the above auth role, and also note down the identity pool id.

Now you can create an API that returns the identity id and token, so the client SDK can upload to S3 directly.
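A sketch of that API's core, assuming your users are registered under a developer provider name. The `client` argument is a `boto3.client("cognito-identity")` instance (injected here so the logic stays testable), and the bucket/role values are placeholders you would read from settings:

```python
def aws_identity(client, identity_pool_id, provider_name, username):
    # Exchange the developer-authenticated user for a Cognito identity id
    # and an OpenID token (a real boto3 cognito-identity call).
    resp = client.get_open_id_token_for_developer_identity(
        IdentityPoolId=identity_pool_id,
        Logins={provider_name: username},
    )
    return {
        "IdentityId": resp["IdentityId"],
        "Token": resp["Token"],
        # Placeholder values; read these from your settings in real code.
        "bucket_name": "my-upload-bucket",
        "bucket_region": "us-east-1",
        "auth_role_arn": "arn:aws:iam::123456789012:role/my_auth_role",
    }
```

Returning this dict from your authenticated GET view gives the browser everything the AWS JS SDK needs to upload straight to the caller's own prefix in the bucket.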

AWS CodeDeploy and CodePipeline with GitHub integration guide

1. On your EC2 instance (Ubuntu 16.04):

    sudo apt-get update
    sudo apt-get install python-pip ruby wget
    cd /home/ubuntu
    wget https://aws-codedeploy-<<bucket-region>>.s3.amazonaws.com/latest/install
    chmod +x ./install
    sudo ./install auto
    sudo service codedeploy-agent start
    sudo systemctl enable codedeploy-agent
2. Create a tag on your EC2 instance.
Select Instance > Tags tab
3. IAM Roles
- Create role > From AWS Service Role > Select "AWS CodeDeploy" > Name it "codedeploy_service_role"
- Open "codedeploy_service_role" again and attach the "AWSCodePipelineFullAccess" policy
4. IAM Policy
- Create Policy > "Create Your Own Policy" > Name it "CodeDeployEC2" >
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}
- Save it.
5. IAM (For EC2)
- Create Role > From AWS Service Role > Select "Amazon EC2" > From policies, search for the policy created above and select it > Name it "EC2CodeDeployRole"
6. EC2
- Select Instance > Actions > Instance Settings > Attach/Replace IAM Role > Select "EC2CodeDeployRole"
7. Add an appspec.yml file to your repo root

- This file points to the bash scripts run before deploy (e.g. git clone or pull) and after deploy (running db migrations, restarting the server), so create those scripts and push them to your deploy branch.
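A minimal appspec.yml along those lines might look like this (the script paths, destination directory, and the `ubuntu` user are assumptions; adjust them to your layout):

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /home/ubuntu/app
hooks:
  BeforeInstall:
    - location: scripts/before_install.sh   # e.g. stop the server, clean up
      timeout: 300
      runas: ubuntu
  AfterInstall:
    - location: scripts/after_install.sh    # e.g. pip install, db migrations
      timeout: 300
      runas: ubuntu
  ApplicationStart:
    - location: scripts/start_server.sh     # restart the app server
      timeout: 300
      runas: ubuntu
```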


8. Code Deploy console
- Select "Custom deployment" > Provide an application name and deployment group name > Select "In-place deployment" > Select the EC2 tag you created earlier > Select the "OneAtATime" configuration > Select "codedeploy_service_role" as the Service Role > Create.
9.  Code pipeline console
- Name it > Source location > GitHub > Connect it (this needs org permission, and you must be an owner of the repo) > Select the repo and the deploy branch > For build provider, choose "No build" > For deployment provider, choose "AWS CodeDeploy" > Select the app name and deployment group name created in the previous step > For AWS Service Role, choose "Create New Role" > Create Pipeline


Hope this step-by-step guide helps!

Custom PostgreSQL version (9.6) pg_dump and restore from and to a remote host

Backup PostgreSQL Remote DB: