AWS, Cloud Computing

4 Mins Read

How to Back Up CCTV IP Camera Data in AWS S3?

Overview

CCTV is the abbreviation for closed-circuit television, also known as video surveillance. In contrast to "normal" television, which is broadcast to the public, "closed-circuit" television is broadcast to a small (closed) number of monitors. CCTV networks are widely employed to identify and discourage criminal activity and to record traffic violations, but they may also be used for other purposes.

Nowadays, CCTV cameras are used for security purposes at shopping malls, airports, roadways, public transportation, and homes, among other places. CCTV assists in capturing live videos and storing them on devices or, if necessary, sending data to the cloud.

Most CCTV camera data is saved locally, and storing the recorded footage on the device consumes a lot of space. As a result, most CCTV providers are forced to erase captured recordings regularly. The ideal approach is to back up the captured CCTV videos to the cloud. In this blog, we will look at how to back up the captured videos to AWS S3 with the help of OpenCV and other libraries.


Ensure that you have created an AWS account and logged in to it. Then go to IAM Service and create an IAM user with AmazonS3FullAccess permission. Note the Access Key ID and AWS Secret Access Key when creating a user.
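Hardcoding keys in a script is risky; a safer pattern is to read them from the standard AWS environment variables, which boto3 also picks up automatically. The sketch below is illustrative, and the helper name get_aws_credentials is introduced here for this example:

```python
import os

def get_aws_credentials():
    """Return the access keys from the standard AWS environment
    variables, or None if either one is unset."""
    access_key = os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if not access_key or not secret_key:
        return None
    # These keyword names match what boto3.client('s3', ...) expects.
    return {"aws_access_key_id": access_key,
            "aws_secret_access_key": secret_key}
```

If the variables are set, you can pass the returned dictionary straight into the boto3 client with `boto3.client('s3', **get_aws_credentials())`.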

Go to AWS S3 and select the Create Bucket option. Enter webcam-bucket as the bucket name, leave the remaining options at their defaults, then scroll down and click Create Bucket. You can see that a bucket was created with the name webcam-bucket. If you receive an error stating that a bucket with the same name already exists, change the bucket name and make a note of it.
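The bucket can also be created programmatically with boto3's create_bucket call. One S3 quirk worth knowing: for us-east-1 the CreateBucketConfiguration argument must be omitted entirely, while every other region requires a LocationConstraint. The helper below (create_bucket_kwargs is a name introduced for this sketch) builds the right arguments:

```python
def create_bucket_kwargs(bucket_name, region):
    """Build the keyword arguments for boto3's create_bucket call.

    For us-east-1, S3 rejects an explicit CreateBucketConfiguration,
    so it is included only for other regions."""
    kwargs = {"Bucket": bucket_name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# Usage with boto3 (assumes your credentials are already configured):
#   s3 = boto3.client("s3", region_name="ap-south-1")
#   s3.create_bucket(**create_bucket_kwargs("webcam-bucket", "ap-south-1"))
```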

How to Implement the Code Snippet?

First, import the necessary libraries as mentioned below.

# Python Code for uploading Webcam videos into the AWS S3
import cv2
from datetime import datetime
import pytz
import time
import boto3
import os
from threading import Thread

The Python code is divided into two sections. The first captures frames and produces a video file; the second uploads the recorded video file to AWS S3. The following function handles the upload. Replace aws_access_key_id and aws_secret_access_key with the values you noted earlier, and set the bucket name in the variable bucket_name.

# Function for uploading the recorded files to S3
def upload_to_s3():
    ct = boto3.client('s3',
                      aws_access_key_id='XXXXXXXXXXXXXXXXXXX',
                      aws_secret_access_key='XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX')
    sec1 = int(round(time.time()))  # start the timer for uploading the files to S3
    while True:
        if int(round(time.time())) >= sec1 + 5:
            sec1 = int(round(time.time()))  # reset the timer for the next upload cycle
            print("files going to upload")
            for file in os.listdir():
                # upload every finished .avi file, skipping the one still being written
                if '.avi' in file and file_name1 != file:
                    bucket_name = 'webcam-bucket'  # bucket name you created in S3
                    path = 'webcammer/'  # S3 prefix under which the files are uploaded
                    upload_file_key = path + str(file)
                    ct.upload_file(file, bucket_name, upload_file_key)
                    print("file uploaded successfully... " + str(file))
                    os.remove(file)  # delete the local copy after a successful upload
            print("file upload done")
        time.sleep(1)  # avoid a busy-wait loop
Thread(target=upload_to_s3).start()
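The check in the upload loop that skips the file still being written can be factored into a small, testable helper. This is a sketch; files_ready_for_upload is a name introduced here, not part of the original script:

```python
def files_ready_for_upload(listing, current_file):
    """Return the .avi files that are finished recording,
    i.e. every .avi in the directory listing except the one
    the VideoWriter is still writing to."""
    return [f for f in listing if f.endswith(".avi") and f != current_file]
```

Isolating this logic makes it easy to verify the uploader will never ship a half-written recording.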

The code below will generate live stream video files from the webcam, with the file name containing the date and time, for example, 2022-01-22 11.50.50.avi. This makes it much easier to locate the recordings.

# Capturing frames and making the video file
print("libraries are imported")
tz = pytz.timezone('Asia/Kolkata')  # reading the time zone
datetime_india_tz = datetime.now(tz)
datetime_india = datetime_india_tz.replace(microsecond=0).replace(tzinfo=None)
print(datetime_india)
file_name = str(datetime_india) + ".avi"  # naming the file based on the date and current time
file_name1 = file_name.replace(":", ".")  # ':' is not allowed in filenames on Windows
print(file_name1)
# This will return video from the first webcam on your computer.
cap = cv2.VideoCapture(0)

# Define the codec and create the VideoWriter object
fourcc = cv2.VideoWriter_fourcc(*'XVID')
out = cv2.VideoWriter(file_name1, fourcc, 10, (160, 120))
sec = int(round(time.time()))  # start the timer for recording a new video every 10 seconds

# loop runs if capturing has been initialized.
# The loop runs while capturing is active.
while True:
    # reads frames from the camera; ret checks the return at each frame
    ret, frame = cap.read()
    # Put the current date and time on each frame
    font = cv2.FONT_HERSHEY_SIMPLEX
    cv2.putText(frame, str(datetime.now()), (10, 30), font, 1, (255, 255, 255), 2, cv2.LINE_AA)
    # resizing the frame
    b = cv2.resize(frame, (160, 120))
    out.write(b)
    # The original input frame is shown in the window
    cv2.imshow('Original', frame)
    if int(round(time.time())) >= sec + 10:
        sec = int(round(time.time()))  # reset the timer for recording a new video every 10 seconds
        file_name = str(datetime.now().replace(microsecond=0)) + ".avi"  # reading the date and time
        file_name1 = file_name.replace(":", ".")  # naming the file based on the date and current time
        print(file_name1)
        # Close the finished file, then create a new VideoWriter object for the next clip
        out.release()
        fourcc = cv2.VideoWriter_fourcc(*'XVID')
        out = cv2.VideoWriter(file_name1, fourcc, 10, (160, 120))
        print("current file name " + str(file_name1))
    # Wait for the 'a' key to stop the program
    if cv2.waitKey(1) & 0xFF == ord('a'):
        break

#Close the window / Release webcam
cap.release()
#After we release our webcam, we also release the output
out.release()
#De-allocate any associated memory usage
cv2.destroyAllWindows()
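The timestamped naming used in the script above can be isolated into a small helper for clarity. This is a sketch; make_video_filename is a name introduced here, not part of the original script:

```python
from datetime import datetime

def make_video_filename(now=None):
    """Build a filesystem-safe .avi filename such as '2022-01-22 11.50.50.avi'."""
    now = now or datetime.now()
    stamp = str(now.replace(microsecond=0))   # 'YYYY-MM-DD HH:MM:SS'
    return stamp.replace(":", ".") + ".avi"   # ':' is not allowed in Windows filenames
```

Because the logic is pure, it is easy to confirm the format: `make_video_filename(datetime(2022, 1, 22, 11, 50, 50))` yields `'2022-01-22 11.50.50.avi'`.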


Conclusion

If you execute the code above, the video will be captured, and the files will be transferred to AWS S3. Navigate to the created bucket in the AWS S3 console and open the webcammer folder to examine the uploaded video files.
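Each local recording lands under the webcammer/ prefix in the bucket, so the object key is just the prefix plus the file name. A minimal sketch (s3_key_for is a name introduced here) makes the mapping explicit:

```python
def s3_key_for(file_name, prefix="webcammer/"):
    """Return the S3 object key a local recording is uploaded under."""
    return prefix + file_name

# A recording therefore appears in the console at a path like:
#   s3://webcam-bucket/webcammer/2022-01-22 11.50.50.avi
```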

About CloudThat

CloudThat is an official AWS (Amazon Web Services) Advanced Consulting and Training Partner and a Microsoft Gold Partner, helping people develop cloud expertise and helping their businesses aim for higher goals using best-in-industry cloud computing practices. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.

Drop a query if you have any questions regarding AWS S3 and I will get back to you quickly.

To get started, go through our Consultancy page and Managed Services Package to explore CloudThat's offerings.

Get your new hires billable within 1-60 days. Experience our Capability Development Framework today.


WRITTEN BY Vasanth Kumar R

Vasanth Kumar R works as a Sr. Research Associate at CloudThat. He is highly focused and passionate about learning new cutting-edge technologies including Cloud Computing, AI/ML & IoT/IIOT. He has experience with AWS and Azure Cloud Services, Embedded Software, and IoT/IIOT Development, and also worked with various sensors and actuators as well as electrical panels for Greenhouse Automation.

