Manage S3 Buckets And Objects Using AWS CLI

In this article, we will learn how to manage S3 buckets and objects using the AWS Command Line Interface (AWS CLI).

I have already published an article on how to install and configure AWS CLI on Ubuntu.

Using AWS CLI, you can easily manage S3 buckets and the objects inside them.

Before proceeding with the tutorial below, make sure you have full S3 permissions.

1. Create S3 Bucket

  • We have to use the s3 mb option with the aws command to create a new S3 bucket.
  • The name of the S3 bucket must be globally unique.
  • mb stands for make bucket. The bucket will be created in the region specified during the AWS CLI configuration. You can check the configured region in the config file under the user's home directory (~/.aws/config), as shown below.
  • Bucket names can contain lowercase letters, numbers, hyphens and periods, but they must not end with a hyphen or a period.
aws s3 mb s3://bucketname
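For reference, a typical ~/.aws/config file looks like the snippet below; the region and output values here are only sample values.

[default]
region = ap-south-1
output = json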

If the bucket name already exists and is owned by another user, you will get the below error.

make_bucket failed: s3://testbucket An error occurred (BucketAlreadyExists) when calling the CreateBucket operation: The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again.

To create an S3 bucket in a specific AWS region, other than the region configured in AWS CLI, use the --region option.

aws s3 mb s3://testbucket0072543 --region ap-south-1

Example:

rahulk@RahulK:~$ aws s3 mb s3://testbucket0072543

make_bucket: testbucket0072543

2. Listing Buckets And Objects

Using the s3 ls command, we can list the buckets and the objects in a bucket.

The below command will list all the buckets.

aws s3 ls 

To list all the objects in a particular bucket, you can use the below command.

aws s3 ls s3://bucket-name

Let's say you want to list the objects in a folder within the S3 bucket. Use the below command.

In S3 terms, a folder is referred to as a prefix.

aws s3 ls s3://bucket-name/foldername/ 
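If you want to list every object under every prefix in the bucket at once, the s3 ls command also accepts the --recursive flag; the bucket name below is only a placeholder.

aws s3 ls s3://bucket-name --recursive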

3. Deleting S3 Buckets And Objects

If you want to delete an S3 bucket which is empty, you can use the below command.

You should use s3 rb with the aws command. rb stands for remove bucket.

aws s3 rb s3://bucket-name

If you try to delete an S3 bucket that contains objects, you will get an error.

remove_bucket failed: s3://testbucket0072543 An error occurred (BucketNotEmpty) when calling the DeleteBucket operation: The bucket you tried to delete is not empty

If you want to delete the S3 bucket along with the objects inside it, you should use --force with the s3 rb command as shown below.

aws s3 rb s3://bucket-name --force

Example:

rahulk@RahulK:~$ aws s3 rb s3://testbucket0072543 --force
delete: s3://testbucket0072543/tmt.sh
remove_bucket: testbucket0072543
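If you only want to delete objects without removing the bucket itself, the s3 rm command can be used; the object path and prefix below are only placeholders, and --recursive deletes every object under the given prefix.

aws s3 rm s3://bucket-name/filename
aws s3 rm s3://bucket-name/foldername/ --recursive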

4. Copy Files From Local To S3 Bucket

To copy files from the local machine to an S3 bucket, use the s3 cp command.

In the below example, we copy a file named sample.txt from the local machine to the S3 bucket.

aws s3 cp sample.txt s3://bucket-name/

If you want to copy the file but store it under a different file name in the S3 bucket, use the below command.

aws s3 cp sample.txt s3://bucket-name/filename

Examples:

aws s3 cp sample.txt s3://testbucketmumforawscli/

upload: ./sample.txt to s3://testbucketmumforawscli/sample.txt

From the below example, we can see that the contents of the local file sample.txt are copied to the S3 bucket under a different file name.

aws s3 cp sample.txt s3://testbucketmumforawscli/sample2.txt

upload: ./sample.txt to s3://testbucketmumforawscli/sample2.txt

Instead of changing into the exact folder of the files on the local machine, you can specify the full path of the file as shown below.

aws s3 cp /home/rahulk/testfiles/sample.txt s3://bucket-name/
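If you want to upload an entire local folder instead of a single file, the --recursive flag can be used with s3 cp as well; the local path and bucket name below are only placeholders.

aws s3 cp /home/rahulk/testfiles/ s3://bucket-name/testfiles/ --recursive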

5. Copy Files Between S3 Buckets

Let's say you have two S3 buckets and you want to copy files from Bucket 1 to Bucket 2.

This can also be done using the s3 cp command. The command below copies a single file; an example for copying all the files at once is shown at the end of this section.

aws s3 cp s3://bucket-name/filename s3://bucket-name/

Example:

aws s3 cp s3://testbucketmumforawscli/sample.txt s3://testbucketmumforawscli2/

copy: s3://testbucketmumforawscli/sample.txt to s3://testbucketmumforawscli2/sample.txt

In the above example, we have copied the file to the / (root level) of the S3 bucket.

Let's say you have a file inside a folder, for example s3://bucketname/foldername/filename. In this case, if you copy the file from the source to the destination bucket as shown below, the folder won't be created in the destination bucket; instead, the file will be copied to the / (root level) of the destination bucket.

aws s3 cp s3://testbucketmumforawscli/samplefolder/sample.txt s3://testbucketmumforawscli2/
copy: s3://testbucketmumforawscli/samplefolder/sample.txt to s3://testbucketmumforawscli2/sample.txt

If you want to copy the file along with its folder, you should specify the folder in the destination path as shown below.

aws s3 cp s3://testbucketmumforawscli/samplefolder/sample.txt s3://testbucketmumforawscli2/samplefolder/
copy: s3://testbucketmumforawscli/samplefolder/sample.txt to s3://testbucketmumforawscli2/samplefolder/sample.txt
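To copy all the files and folders from one bucket to another in a single command, --recursive can be used here as well; the bucket names below are only placeholders.

aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive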

6. Copy Files From S3 Bucket To Local

We should use the s3 cp command to copy files from the S3 bucket to the local machine.

aws s3 cp s3://bucket-name/filename .

The . (dot) represents the current directory on the local machine. Alternatively, you can mention the full destination path as shown below.

aws s3 cp s3://bucket-name/filename /home/rahulk/bucketfiles/

Example:

aws s3 cp s3://testbucketmumforawscli/samplefolder/sample.txt /home/rahulk/Downloads/

 download: s3://testbucketmumforawscli/samplefolder/sample.txt to Downloads/sample.txt

To download all the files, folders and sub-folders from the S3 bucket, use --recursive.

aws s3 cp s3://bucketname/ . --recursive
aws s3 cp s3://bucketname/ /home/rahulk/backup/ --recursive

While copying, if the destination folder is not present on the local system, it will be created automatically during the backup process and all the files will be backed up there.
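For repeated backups, the s3 sync command can be an alternative to cp --recursive, since it copies only the files that are new or have changed; the paths below are only placeholders.

aws s3 sync s3://bucket-name/ /home/rahulk/backup/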

Conclusion

In this article, we have learnt a few commands to manage S3 buckets and the objects in them, and also how to copy files between the local machine and an S3 bucket.

Hope you found it helpful. Thanks for reading this article.

Please do check out my other publications.