EC2 inventory from different AWS Accounts in a CSV file


1. Create a Lambda function as follows

    1. Go to Lambda from the Services menu of AWS
    2. Click the Create function button on the Lambda dashboard
    3. Choose the Author from scratch option while creating the function
    4. Give the function a name and select Python 3.7 as the runtime
    5. Assign the function a role with S3 permissions:
      1. Select Create a new role from AWS policy templates
      2. Define the role name
      3. Choose a policy template
    6. Assign S3 write permission (e.g., AmazonS3FullAccess) so the function can store the CSV file in an S3 bucket.
    7. Now click the Create function button
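Instead of the broad AmazonS3FullAccess policy, the role can be scoped to just the report bucket. A minimal policy sketch (the bucket name my-inventory-bucket is a placeholder; replace it with your own bucket):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:PutObject"],
          "Resource": "arn:aws:s3:::my-inventory-bucket/*"
        }
      ]
    }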

2. Next, configure the Lambda function

    1. Add a trigger (if required).
    2. Copy the function code below:

import boto3
from datetime import date

def lambda_handler(event, context):
    # account_details = [('<access_key_id>', '<secret_access_key>', '<account_name>'), ...]
    account_details = [('', '', ''), ('', '', '')]  # credentials of the accounts to check

    # regions = ['<region_code>', '<region_code>']
    regions = ['us-east-1', 'ap-south-1']  # regions to check for EC2 instances

    # tag_keys = ['<tag_key>', '<tag_key>', '<tag_key>']
    tag_keys = ['', '', '']  # tag keys you want in the report

    s3_bucket = ''  # S3 bucket to store the CSV file

    today = date.today()
    file_name = 'ec2_instance_report_' + today.strftime('%d-%m-%y') + '.csv'
    file_location = '/tmp/' + file_name

    with open(file_location, 'w') as file:
        file.write('account_id,account_name,region,instance_id,instance_name,instance_type,platform,state,vpc,subnet,private_ip,' + ','.join(tag_keys) + '\n')

    for acc_tuple in account_details:
        boto_session = boto3.session.Session(aws_access_key_id=acc_tuple[0], aws_secret_access_key=acc_tuple[1])
        sts = boto3.client('sts', aws_access_key_id=acc_tuple[0], aws_secret_access_key=acc_tuple[1])
        account_id = sts.get_caller_identity()['Account']
        for region in regions:
            ec2 = boto_session.resource('ec2', region_name=region)
            for instance in ec2.instances.all():
                instance_name = ''
                output_tags = {}
                for tag in instance.tags or []:  # tags is None when an instance has no tags
                    if tag['Key'] == 'Name':
                        instance_name = tag['Value']
                    elif tag['Key'] in tag_keys:
                        output_tags[tag['Key']] = tag['Value']

                output = ','.join([account_id, acc_tuple[2], region, instance.id,
                                   instance_name, instance.instance_type,
                                   str(instance.platform), instance.state['Name'],
                                   str(instance.vpc_id), str(instance.subnet_id),
                                   str(instance.private_ip_address)])

                for tag_given_key in tag_keys:
                    output += ',' + output_tags.get(tag_given_key, '')

                with open(file_location, 'a') as file:
                    file.write(output + '\n')

    s3 = boto3.client('s3')
    s3.upload_file(file_location, s3_bucket, file_name)

3. Provide your inputs in the code

    1. account_details: the access key ID, secret access key, and a display name for each account
    2. regions: the region codes to check for EC2 instances (comma-separated in a list)
    3. tag_keys: the tag keys you want included in the report
    4. s3_bucket: the S3 bucket where the CSV file will be stored
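Note that the function builds each row by joining values with commas, so a tag value that itself contains a comma would corrupt the report. A safer alternative is Python's csv module, which quotes such values automatically. A small standalone sketch with made-up instance data (the tag keys and values here are placeholders):

```python
import csv
import io

tag_keys = ["Environment", "Owner"]  # example tag keys (placeholders)

# A tags list shaped like the one boto3 returns for instance.tags
tags = [
    {"Key": "Name", "Value": "web-server"},
    {"Key": "Environment", "Value": "prod, eu"},  # value contains a comma
]

# Same tag-extraction logic as in the Lambda function
instance_name = ""
output_tags = {}
for tag in tags:
    if tag["Key"] == "Name":
        instance_name = tag["Value"]
    elif tag["Key"] in tag_keys:
        output_tags[tag["Key"]] = tag["Value"]

# csv.writer quotes the comma-containing value instead of splitting the column
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow([instance_name] + [output_tags.get(k, "") for k in tag_keys])
print(buffer.getvalue().strip())  # → web-server,"prod, eu",
```

The same writer could replace the manual string concatenation in the function body.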

4. Now click Save to store the function code and the configuration changes

5. Now trigger the Lambda function by invoking it (for example, with a test event).

6. After execution, you can find the generated CSV report in the configured S3 bucket.
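The first line of the report is the header row written by the function. For example, with the hypothetical tag keys Environment and Owner, the header would look like this:

```python
tag_keys = ["Environment", "Owner"]  # placeholder tag keys

# Header exactly as the Lambda function writes it
header = ("account_id,account_name,region,instance_id,instance_name,"
          "instance_type,platform,state,vpc,subnet,private_ip,"
          + ",".join(tag_keys))
print(header)
```

Each instance then appears as one row with its values in the same column order.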

7. Thank you.
