Some Security Practices to follow while using AWS Resources

Riya John
3 min read · Aug 1, 2020
Secure your system on the cloud

This post covers some important AWS Cloud security practices that we need to know and enforce on our AWS resources. It focuses on configurations we usually tend to skip, but that are really important for protecting our data and having a secure environment on the cloud.

S3 Bucket

Enable server-side encryption on the bucket by turning on default encryption in the bucket properties: choose AES-256 for keys managed by Amazon S3, or choose AWS-KMS to use keys created by you (remediation steps in detail).
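For reference, here is a minimal boto3 sketch of the same setting; the bucket name is a placeholder, and the commented-out rule shows the AWS-KMS variant.

import boto3

s3 = boto3.client("s3")

# Turn on default server-side encryption for the bucket (SSE-S3 / AES-256).
# "my-example-bucket" is a placeholder name.
s3.put_bucket_encryption(
    Bucket="my-example-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            # For a key you create and manage, use instead:
            # {"ApplyServerSideEncryptionByDefault": {
            #     "SSEAlgorithm": "aws:kms",
            #     "KMSMasterKeyID": "<your-kms-key-arn>"}}
        ]
    },
)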

Limit S3 traffic to HTTPS requests only, by explicitly denying HTTP requests in the bucket policy, as in the policy below:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyInsecureConnections",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::mybucket",
        "arn:aws:s3:::mybucket/*"
      ],
      "Condition": {
        "Bool": {
          "aws:SecureTransport": "false"
        }
      }
    }
  ]
}

EC2

EBS Snapshots should not be public.
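If a snapshot has already been shared publicly, one way to fix it is to remove the public create-volume permission. A minimal boto3 sketch, with a placeholder snapshot ID:

import boto3

ec2 = boto3.client("ec2")

# Remove the "all" (public) create-volume permission from the snapshot.
ec2.modify_snapshot_attribute(
    SnapshotId="snap-0123456789abcdef0",  # placeholder
    Attribute="createVolumePermission",
    OperationType="remove",
    GroupNames=["all"],
)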

Create EBS volumes with encryption enabled. Note that you cannot directly encrypt an existing unencrypted EBS volume or EBS snapshot; you have to create a new encrypted volume and move the data to it.
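A minimal boto3 sketch of both steps; the availability zone, size, and snapshot ID are placeholders.

import boto3

ec2 = boto3.client("ec2")

# Create a new volume with encryption enabled (uses the default EBS
# KMS key unless KmsKeyId is passed).
ec2.create_volume(
    AvailabilityZone="us-east-1a",  # placeholder
    Size=20,                        # GiB, placeholder
    VolumeType="gp3",
    Encrypted=True,
)

# To move existing unencrypted data, copy its snapshot with encryption
# enabled and create the new volume from that encrypted copy.
ec2.copy_snapshot(
    SourceSnapshotId="snap-0123456789abcdef0",  # placeholder
    SourceRegion="us-east-1",
    Encrypted=True,
)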

Prohibit inbound and outbound traffic on the default security group of a VPC. If someone creates an instance in the VPC without assigning a security group, it is automatically assigned the default security group, which could lead to public exposure of the associated resources.
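One way to strip all rules from the default security group, sketched with boto3; the group ID is a placeholder.

import boto3

ec2 = boto3.client("ec2")
sg_id = "sg-0123456789abcdef0"  # placeholder: the VPC's default security group

# Read the current rules, then revoke them all so the default group
# allows no inbound or outbound traffic.
sg = ec2.describe_security_groups(GroupIds=[sg_id])["SecurityGroups"][0]
if sg["IpPermissions"]:
    ec2.revoke_security_group_ingress(GroupId=sg_id, IpPermissions=sg["IpPermissions"])
if sg["IpPermissionsEgress"]:
    ec2.revoke_security_group_egress(GroupId=sg_id, IpPermissions=sg["IpPermissionsEgress"])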

Remember to create an instance profile when you create an IAM role for your EC2 instance via the AWS CLI or AWS API (steps to follow to attach it). When you use the console, the instance profile is created automatically with the same name as the role.
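A minimal boto3 sketch of what the console otherwise does for you; the role, profile, and instance identifiers are placeholders.

import boto3

iam = boto3.client("iam")
ec2 = boto3.client("ec2")

# Outside the console, the instance profile has to be created and
# linked to the role explicitly.
iam.create_instance_profile(InstanceProfileName="my-ec2-role")  # placeholder
iam.add_role_to_instance_profile(
    InstanceProfileName="my-ec2-role",
    RoleName="my-ec2-role",
)

# Attach the instance profile to a running instance.
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "my-ec2-role"},
    InstanceId="i-0123456789abcdef0",  # placeholder
)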

Kinesis

Enable server-side encryption for your Kinesis stream to start encrypting incoming data written to the stream. On the cost front, note that although SSE is a free Kinesis feature, AWS KMS usage costs will apply.
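A minimal boto3 sketch using the AWS-managed Kinesis key; the stream name is a placeholder, and a customer-managed KMS key ARN can be passed instead.

import boto3

kinesis = boto3.client("kinesis")

# Start encrypting new records written to the stream.
kinesis.start_stream_encryption(
    StreamName="my-example-stream",  # placeholder
    EncryptionType="KMS",
    KeyId="alias/aws/kinesis",       # AWS-managed key; a CMK ARN also works
)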

RDS

DB instances should prohibit public access.

Enable encryption of DB instances to encrypt your data, including all logs, backups, and snapshots of the instance.

Set snapshot visibility to private.
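A minimal boto3 sketch covering the three RDS points above; note that storage encryption can only be chosen when the instance is created, and all identifiers and credentials below are placeholders.

import boto3

rds = boto3.client("rds")

# Encryption is set at creation time; public access is disabled here too.
rds.create_db_instance(
    DBInstanceIdentifier="my-db",           # placeholder
    DBInstanceClass="db.t3.micro",
    Engine="postgres",
    MasterUsername="masteruser",            # placeholder
    MasterUserPassword="change-me-please",  # placeholder
    AllocatedStorage=20,
    StorageEncrypted=True,
    PubliclyAccessible=False,
)

# Turn off public access on an existing instance.
rds.modify_db_instance(
    DBInstanceIdentifier="my-db",
    PubliclyAccessible=False,
)

# Keep manual snapshots private ("restore" controls who may restore them).
rds.modify_db_snapshot_attribute(
    DBSnapshotIdentifier="my-db-snapshot",  # placeholder
    AttributeName="restore",
    ValuesToRemove=["all"],
)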

Redshift

Encrypt all data in your cluster by using AWS KMS or a hardware security module (HSM) for key management. Though infrequently used, an HSM device adds greater security, since you have direct control over key generation and key management is separated from the application and database layers. You can easily enable encryption using AWS KMS on an unencrypted cluster; Amazon Redshift will automatically migrate your data to a new encrypted cluster (more details).
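A minimal boto3 sketch of enabling KMS encryption on an existing cluster; the cluster identifier and key ARN are placeholders.

import boto3

redshift = boto3.client("redshift")

# Enable KMS encryption on an unencrypted cluster; Redshift migrates
# the data to a new encrypted cluster in the background.
redshift.modify_cluster(
    ClusterIdentifier="my-cluster",  # placeholder
    Encrypted=True,
    KmsKeyId="arn:aws:kms:us-east-1:111122223333:key/example",  # placeholder
)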

Elasticsearch Service

Create an Amazon ES domain with VPC access. Note that a domain created with public access cannot be placed in a VPC later; its data has to be migrated to a new domain (details to migrate from public access to VPC access).
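A minimal boto3 sketch of creating a domain with VPC access from the start; the domain name, version, subnet ID, and security group ID are placeholders.

import boto3

es = boto3.client("es")

# Create the domain inside a VPC; a public-access domain cannot be
# moved into a VPC later.
es.create_elasticsearch_domain(
    DomainName="my-search-domain",  # placeholder
    ElasticsearchVersion="7.4",     # placeholder
    VPCOptions={
        "SubnetIds": ["subnet-0123456789abcdef0"],
        "SecurityGroupIds": ["sg-0123456789abcdef0"],
    },
)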

CodeBuild

Connect using OAuth if you are using GitHub or Bitbucket as the source.
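To verify how CodeBuild is connected to your source provider, one option is to list the stored source credentials and check that the auth type is OAUTH; a minimal boto3 sketch:

import boto3

codebuild = boto3.client("codebuild")

# Print the server type and auth type of each stored source credential.
for cred in codebuild.list_source_credentials()["sourceCredentialsInfos"]:
    print(cred["serverType"], cred["authType"])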

DynamoDB

By default, all data in DynamoDB tables is encrypted at rest, so no action is required.

In case you also want to encrypt your data in transit, you can consider using Amazon’s DynamoDB Encryption Client, a client-side encryption library that encrypts data before sending it to DynamoDB, for full lifecycle protection.
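A minimal sketch of the Python version of the client (the dynamodb-encryption-sdk package), assuming a KMS key is used for the cryptographic materials; the table name, key ARN, and item attributes are placeholders.

import boto3
from dynamodb_encryption_sdk.encrypted.table import EncryptedTable
from dynamodb_encryption_sdk.material_providers.aws_kms import (
    AwsKmsCryptographicMaterialsProvider,
)

table = boto3.resource("dynamodb").Table("my-table")  # placeholder
materials_provider = AwsKmsCryptographicMaterialsProvider(
    key_id="arn:aws:kms:us-east-1:111122223333:key/example"  # placeholder
)

# Wraps the regular Table object: items are encrypted client-side before
# they are sent to DynamoDB and decrypted when they are read back.
encrypted_table = EncryptedTable(table=table, materials_provider=materials_provider)
encrypted_table.put_item(Item={"partition_key": "id-1", "secret": "value"})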

This is only a small list of best practices that we often do not know about or tend to ignore while using AWS resources, but that can make our systems on the cloud more secure. I hope this is useful.

Please let me know by clicking on the clap👏 button if this post was helpful.
