
List, Create And Delete S3 Buckets Using Python Boto3 Script

In this blog we are going to create a Python script to list, create and delete S3 buckets using boto3.

  • Prerequisites.
  • Create S3 Bucket And Attach Tags.
  • List All S3 Buckets.
  • Delete S3 Bucket If No Objects Exist.

You will need an AWS account and an IAM user with:

  • AWS Management Console access to verify that your S3 buckets are launched, listed and deleted.
  • The IAM permissions required to perform IAM, S3, and CloudWatch activities, including administrative access to S3. IAM policy creation and AWS Application Programming Interface (API) permissions are outside this article’s scope. Always adhere to the principle of least privilege when authorizing accounts to perform actions.
  • Install the AWS CLI using the official AWS documentation here
  • Install Python and boto3
  • Configure the AWS CLI using the official documentation here
  1. Let's import the boto3 module
    import boto3
    
  2. We will invoke the client for S3
    client = boto3.client('s3')
    
  3. Now we will use input() to take the bucket name to be created as user input and store it in the variable “bucket_name”.
    Note:- Make sure to check the bucket naming rules here
    bucket_name=str(input('Please input bucket name to be created: '))
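Before calling the API, it can help to sanity-check the name against the bucket naming rules linked above. The helper below is a minimal sketch of my own (the function name and the subset of rules it checks are assumptions, not part of the original script); it covers length, allowed characters and the first/last-character rule, but not every rule in the official list:

```python
import re

def looks_like_valid_bucket_name(name):
    """Check a subset of the S3 bucket naming rules:
    3-63 characters, lowercase letters, digits, dots and hyphens,
    starting and ending with a letter or digit."""
    if not 3 <= len(name) <= 63:
        return False
    return re.fullmatch(r'[a-z0-9][a-z0-9.-]*[a-z0-9]', name) is not None

print(looks_like_valid_bucket_name('my-demo-bucket'))   # True
print(looks_like_valid_bucket_name('My_Bucket'))        # False
```

You could call this on “bucket_name” right after the input() above and re-prompt the user if it returns False.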
    
  4. Check the create_bucket() documentation here, where you will find the full argument list. Based on your requirements you can use these arguments to create your S3 bucket. The document also mentions the datatype of each parameter.
    Note:- The Bucket argument is mandatory and the bucket name must be globally unique
    response1 = client.create_bucket(
        ACL='private',
        Bucket=bucket_name
    )
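One caveat with create_bucket(): outside us-east-1 the S3 API requires a CreateBucketConfiguration with a LocationConstraint, while passing one for us-east-1 is an error. A small sketch of handling this (the helper name `create_bucket_kwargs` is my own, not part of the original script):

```python
def create_bucket_kwargs(bucket_name, region):
    """Build the arguments for client.create_bucket(). Outside
    us-east-1 the API requires a LocationConstraint; passing one
    for us-east-1 is rejected, so it is omitted there."""
    kwargs = {'ACL': 'private', 'Bucket': bucket_name}
    if region != 'us-east-1':
        kwargs['CreateBucketConfiguration'] = {'LocationConstraint': region}
    return kwargs

# Usage against a real client would look like:
# client.create_bucket(**create_bucket_kwargs(bucket_name, 'ap-south-1'))
print(create_bucket_kwargs('demo-bucket', 'ap-south-1'))
```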
    
  5. Now we will use input() to confirm whether the user wants to go ahead with bucket tagging, and store the response in the variable “tag_resp”.
    tag_resp=str(input('Press "y" if you want to tag your bucket?: '))
    
  6. Now we will use an if condition and take user input for the tag that needs to be attached to the bucket.
    We will store the tag key in the variable “tag_key” and the tag value in “tag_value”. To add the tag to the bucket we use the put_bucket_tagging() method; make sure to check the official documentation here. In the method parameters we pass the variables “bucket_name”, “tag_key” and “tag_value”.
    if tag_resp == 'y':
        tag_key = str(input("Please enter key for the tag: "))
        tag_value = str(input("Please enter value for the tag: "))
        response2 = client.put_bucket_tagging(
            Bucket=bucket_name,
            Tagging={
                'TagSet': [
                    {
                        'Key': tag_key,
                        'Value': tag_value
                    }
                ]
            })
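The TagSet structure above can be fiddly to build by hand when you have several tags. A small helper of my own (the function name `make_tag_set` is an assumption, not part of the original script) converts a plain dict into the shape put_bucket_tagging() expects. One thing worth knowing: put_bucket_tagging() replaces the bucket's entire tag set, so include every tag you want to keep:

```python
def make_tag_set(tags):
    """Convert a plain dict into the Tagging structure that
    put_bucket_tagging() expects. put_bucket_tagging() replaces
    the bucket's whole tag set, so pass every tag you want kept."""
    return {'TagSet': [{'Key': k, 'Value': v} for k, v in tags.items()]}

# Usage against a real client would look like:
# client.put_bucket_tagging(Bucket=bucket_name,
#                           Tagging=make_tag_set({'env': 'dev', 'owner': 'dheeraj'}))
print(make_tag_set({'env': 'dev'}))
```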
    

To view the entire code on GitHub please click here

  1. Python code in one module gains access to the code in another module by importing it. The import statement combines two operations: it searches for the named module, then binds the result of that search to a name in the local scope.
    import boto3
    
  2. We will invoke the client for S3
    client = boto3.client('s3')
    
  3. Check the list_buckets() documentation here, where you will find the full argument list. Based on your requirements you can use these arguments to list your S3 buckets. The document also mentions the datatype of each parameter.
    response = client.list_buckets()
    
  4. Once the above method runs, the S3 information will be captured in the variable “response”. It returns the information as a dictionary, so “response” will be a dictionary.
  5. Now we will traverse the dictionary using a for loop to print the list of S3 buckets.
for bucket in response['Buckets']:
    print(bucket['Name'])
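To make the loop above concrete, here is a trimmed-down sketch of the dictionary shape list_buckets() returns (the bucket names and owner ID below are made up; a real response also carries extras such as a CreationDate per bucket):

```python
# A trimmed-down example of the dictionary list_buckets() returns
# (bucket names and owner ID here are made up):
sample_response = {
    'Buckets': [
        {'Name': 'demo-bucket-one'},
        {'Name': 'demo-bucket-two'},
    ],
    'Owner': {'ID': 'example-owner-id'},
}

def bucket_names(response):
    """Pull just the bucket names out of a list_buckets() response."""
    return [bucket['Name'] for bucket in response['Buckets']]

print(bucket_names(sample_response))  # ['demo-bucket-one', 'demo-bucket-two']
```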

To view the entire code on GitHub please click here

  1. Let's import the boto3 module
    import boto3
    
  2. We will invoke the client for S3
    client = boto3.client('s3')
    
  3. Now we will use input() to take the bucket name to be deleted as user input and store it in the variable “bucket_name”.
    bucket_name=str(input('Please input bucket name to be deleted: '))
    
  4. Now we will check whether any objects exist in this S3 bucket.
    Note:- If any object is present in the S3 bucket, the bucket won't be deleted.
    We use the list_objects_v2() method to get the count of objects present in the S3 bucket; see the documentation here
    print("Before deleting the bucket we need to check if it's empty. Checking ...")
    objects = client.list_objects_v2(Bucket=bucket_name)
    fileCount = objects['KeyCount']
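One caveat here: a single list_objects_v2() call returns at most 1,000 keys, so the KeyCount above is capped at 1,000. That is fine for the emptiness check (0 means empty), but counting all objects in a large bucket means paging through every result, e.g. with client.get_paginator('list_objects_v2'). A sketch of summing the count across pages (the helper name and the simulated page numbers below are my own, not part of the original script):

```python
def total_key_count(pages):
    """Sum KeyCount across list_objects_v2 result pages. A single
    call returns at most 1000 keys, so a full count needs every
    page, e.g. from client.get_paginator('list_objects_v2')."""
    return sum(page['KeyCount'] for page in pages)

# Simulated pages, as a paginator would yield them (made-up numbers):
pages = [{'KeyCount': 1000, 'IsTruncated': True},
         {'KeyCount': 237, 'IsTruncated': False}]
print(total_key_count(pages))  # 1237
```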
    
  5. Now we will use an if-else condition. If no object is present in the S3 bucket, we will use the delete_bucket() method with the bucket name as an argument to delete the S3 bucket.
    Check the official documentation for the delete_bucket() method here, where you will find the full argument list. Based on your requirements you can use these arguments to delete your S3 bucket. The document also mentions the datatype of each parameter.
    if fileCount == 0:
        response = client.delete_bucket(Bucket=bucket_name)
        print("{} has been deleted successfully !!!".format(bucket_name))
    
  6. If there are objects present in the S3 bucket, execution enters the else branch and prints the message below.
    if fileCount == 0:
        response = client.delete_bucket(Bucket=bucket_name)
        print("{} has been deleted successfully !!!".format(bucket_name))
    else:
        print("{} is not empty: {} objects present".format(bucket_name, fileCount))
        print("Please make sure S3 bucket is empty before deleting it !!!")
    

To view the entire code on GitHub please click here

🥁🥁 Conclusion 🥁🥁

Boto3 provides built-in methods for AWS resources, with which many tasks can be automated by writing a Python script. In this blog we have seen how to list S3 buckets, create S3 buckets with tags attached, and delete S3 buckets if they are empty, in a simplified manner.

Stay tuned for my next blog…..


So, did you find my content helpful? If you did, or if you like my other content, feel free to buy me a coffee. Thanks!


Author - Dheeraj Choudhary

I am an IT professional with 11+ years of experience specializing in DevOps, build and release engineering, and software configuration management, automating build, deploy and release. I blog about AWS and DevOps on my YouTube channel, which focuses on content such as AWS, DevOps, open source, AI/ML and AWS community activities.
