Project Name: AWS S3 Memory Monitoring
Description:
This project consists of two components: a Python script and a shell script. The Python script creates an Amazon S3 bucket using the Boto3 library and uploads a text file containing available-memory information. The shell script records the available memory before and after clearing the cache and stores the results in that text file, which is then uploaded to the S3 bucket. The project is automated with crontab to run daily at 12:00 AM.
Installation:
Prerequisites:
Python installed on your system
AWS credentials configured with appropriate permissions
Boto3 library installed:
pip install boto3
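If you want to confirm the installation, one quick optional check is to print the installed Boto3 version from Python:

import boto3

# Prints the installed Boto3 version; an ImportError here means the
# installation did not succeed.
print(boto3.__version__)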
Clone the Repository:
git clone <repository-url>
cd aws-memory-monitoring
Usage:
Set Up AWS Credentials: Ensure your AWS credentials are properly configured to allow the Boto3 library to interact with your AWS account.
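As a quick sanity check (a minimal sketch, not part of the project scripts), you can ask AWS which identity your configured credentials belong to; this call fails if Boto3 cannot find valid credentials:

import boto3

# STS get_caller_identity returns the account and ARN behind the
# currently configured credentials.
sts = boto3.client('sts')
identity = sts.get_caller_identity()
print("Authenticated as:", identity['Arn'])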
Creation of S3 Bucket:
Create a Python script to create the S3 bucket:
import boto3

s3 = boto3.client('s3')
bucket_name = 'my-aws-python-buckt123'

# Note: outside us-east-1, create_bucket also requires a
# CreateBucketConfiguration with a LocationConstraint.
s3.create_bucket(Bucket=bucket_name)
print("Bucket created successfully")
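To confirm the bucket was actually created (an optional check, not part of the original script), head_bucket succeeds only when the bucket exists and you have access to it:

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
bucket_name = 'my-aws-python-buckt123'

try:
    # head_bucket raises ClientError if the bucket is missing or inaccessible.
    s3.head_bucket(Bucket=bucket_name)
    print(f"Bucket '{bucket_name}' is reachable")
except ClientError as error:
    print(f"Bucket check failed: {error}")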
Creating a Shell Script to Free Cache Memory:
#!/bin/bash
# Log the current time, then the available memory before and after clearing the page cache.
echo "$(date)" >> available-memmory.txt
a=$(free -h | grep Mem | awk '{print $7}')
echo "available memory before clearing cache: $a" >> available-memmory.txt
sync
echo 3 | sudo tee /proc/sys/vm/drop_caches > /dev/null
b=$(free -h | grep Mem | awk '{print $7}')
echo "available memory after clearing cache: $b" >> available-memmory.txt
Create a Python script to upload the file to the S3 bucket:
import boto3

s3 = boto3.client('s3')
bucket_name = 'my-aws-python-buckt123'
file_path = '/home/kus/Documents/python-aws-script/available-memmory.txt'
object_name = 'available-memmory.txt'

# Upload the memory log produced by the shell script to the bucket.
s3.upload_file(file_path, bucket_name, object_name)
print("File uploaded successfully")
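Optionally, you can verify the upload by asking S3 for the object's metadata (again a small sketch, not part of the original script):

import boto3

s3 = boto3.client('s3')

# head_object returns metadata only if the object exists in the bucket.
response = s3.head_object(Bucket='my-aws-python-buckt123', Key='available-memmory.txt')
print("Uploaded size (bytes):", response['ContentLength'])
print("Last modified:", response['LastModified'])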
Run the Memory Monitoring Script:
./memory_monitoring.sh
This shell script records the available memory before and after clearing the cache and stores the results in available-memmory.txt.
Set Up Crontab: Edit your crontab using
crontab -e
and add the following lines to run the workflow daily at 12:00 AM. The first entry runs the shell script; the second uploads the resulting file to the S3 bucket:
0 0 * * * /path/to/aws-memory-monitoring/memory_monitoring.sh
0 0 * * * python3 /home/kus/Documents/shell-scripting-projects/aws-s3-function.py
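If you prefer a single cron entry, one possible approach (a sketch that assumes the file locations shown above; adjust the paths for your machine) is to run the shell script and the upload from one Python wrapper:

import subprocess
import boto3

# Assumed paths, based on the examples in this README; change them to match your setup.
SCRIPT = '/path/to/aws-memory-monitoring/memory_monitoring.sh'
FILE_PATH = '/home/kus/Documents/python-aws-script/available-memmory.txt'
BUCKET = 'my-aws-python-buckt123'
OBJECT = 'available-memmory.txt'

# Run the memory-monitoring shell script; check=True raises if it exits non-zero.
subprocess.run(['bash', SCRIPT], check=True)

# Upload the refreshed log file to the S3 bucket.
boto3.client('s3').upload_file(FILE_PATH, BUCKET, OBJECT)
print("Memory log uploaded")

With a wrapper like this, a single cron entry pointing at the wrapper script (the wrapper's name and location are up to you) replaces the two entries above.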