Introduction
Modern development environments rely on DevOps best practices to integrate and deploy applications quickly and securely in the cloud.
This blog will guide you through deploying a static website using AWS CodePipeline. By following these steps, you will set up an AWS CodePipeline that retrieves code from GitHub, builds it using AWS CodeBuild, and stores the artifacts in an Amazon S3 bucket for deployment.
This process enables continuous integration and deployment of your static website.
Step-by-Step Guide
Step 1: Creating Amazon S3 Buckets
To begin, you need to create two Amazon S3 buckets. The first bucket will store the pipeline artifacts; it should not be publicly accessible and should be encrypted.
The second bucket will host your website. Enable versioning on this bucket so you can track changes to your website over time. Enable static website hosting and make the bucket public so that visitors can access it.
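As a sketch of the public-access setup, a bucket policy like the following grants read-only access to the objects in the website bucket (the bucket name `my-website-bucket` is a placeholder; replace it with your own):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-website-bucket/*"
    }
  ]
}
```

Note that you must also turn off the bucket's "Block public access" settings before this policy can take effect.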
Step 2: Connecting to GitHub
Connecting AWS with GitHub allows for seamless integration and automation of code deployment, enabling continuous integration and delivery pipelines to fetch code, trigger builds, and deploy applications easily.
Step 3: Creating AWS CodeBuild
Before creating the AWS CodePipeline, you will set up an AWS CodeBuild project. Although a static website requires no compilation, this step demonstrates how to configure a build stage.
AWS CodeBuild:
- Enter a name for the AWS CodeBuild project.
- Choose GitHub as the source provider. Other options include Amazon S3, AWS CodeCommit, Bitbucket, etc.
- Provide a name for the AWS CodeBuild role.
- In the build configuration, specify a buildspec.yml file as the build specification. This file defines the build steps and actions.
- Enable Amazon CloudWatch logs to track the build progress and logs.
- Create the build.
The buildspec.yml file will contain the instructions for AWS CodeBuild to perform the build steps. You can define the necessary commands and actions required to build your code.
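For a static website that needs no compilation, a minimal buildspec.yml might look like the following (a sketch; the echo command and artifact pattern are illustrative, and you would replace them with your project's actual build commands and output files):

```yaml
version: 0.2

phases:
  build:
    commands:
      # A static site has no compile step; run any checks or
      # preprocessing you need here.
      - echo "Building static website..."

artifacts:
  files:
    # Package every file in the repository as the build output.
    - '**/*'
```

The `artifacts` section tells AWS CodeBuild which files to package and hand off to the next pipeline stage.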
Step 4: Creating the AWS CodePipeline
In this step, you will create the AWS CodePipeline, which will automate fetching code from GitHub, building it, and deploying it to the destination Amazon S3 bucket. Here’s how to proceed:
- Give the AWS CodePipeline a suitable name that identifies its purpose.
- Create a new service role or choose an existing one with the necessary permissions to perform actions within the pipeline.
- Specify a custom location for artifacts. This is the Amazon S3 bucket you created in Step 1, where the AWS CodeBuild artifacts will be stored.
- Choose “GitHub” as the source provider and connect your GitHub account.
- Select the repository that contains your website’s code.
- Add a build stage to the pipeline and configure the build setup. This is where you will specify the AWS CodeBuild project you created in Step 3.
- Add a deploy stage to the pipeline and specify the destination Amazon S3 bucket where you want to deploy your website.
- Review the configuration of your AWS CodePipeline to ensure everything is set up correctly.
- Create the AWS CodePipeline.
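The stages configured above correspond to a pipeline definition roughly like the following (a simplified sketch; the pipeline name, role ARN, bucket names, connection ARN, and repository details are placeholders, and in practice the console generates this structure for you):

```json
{
  "pipeline": {
    "name": "static-website-pipeline",
    "roleArn": "arn:aws:iam::111122223333:role/my-codepipeline-role",
    "artifactStore": { "type": "S3", "location": "my-artifact-bucket" },
    "stages": [
      {
        "name": "Source",
        "actions": [
          {
            "name": "Source",
            "actionTypeId": {
              "category": "Source",
              "owner": "AWS",
              "provider": "CodeStarSourceConnection",
              "version": "1"
            },
            "configuration": {
              "ConnectionArn": "arn:aws:codestar-connections:...",
              "FullRepositoryId": "my-github-user/my-website-repo",
              "BranchName": "main"
            },
            "outputArtifacts": [{ "name": "SourceOutput" }]
          }
        ]
      },
      {
        "name": "Build",
        "actions": [
          {
            "name": "Build",
            "actionTypeId": {
              "category": "Build",
              "owner": "AWS",
              "provider": "CodeBuild",
              "version": "1"
            },
            "configuration": { "ProjectName": "my-codebuild-project" },
            "inputArtifacts": [{ "name": "SourceOutput" }],
            "outputArtifacts": [{ "name": "BuildOutput" }]
          }
        ]
      },
      {
        "name": "Deploy",
        "actions": [
          {
            "name": "Deploy",
            "actionTypeId": {
              "category": "Deploy",
              "owner": "AWS",
              "provider": "S3",
              "version": "1"
            },
            "configuration": {
              "BucketName": "my-website-bucket",
              "Extract": "true"
            },
            "inputArtifacts": [{ "name": "BuildOutput" }]
          }
        ]
      }
    ]
  }
}
```

Setting `"Extract": "true"` in the deploy action unzips the build artifact into the destination bucket rather than uploading it as a single archive.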
Once the AWS CodePipeline is created, it will start executing the pipeline stages automatically. The pipeline will fetch the code from your GitHub repository, initiate the AWS CodeBuild project to build the code, and upload the resulting artifacts to the Amazon S3 artifact bucket. Finally, the artifacts will be deployed to the specified destination Amazon S3 bucket.
This automated process allows for efficient and consistent deployment of your website whenever changes are made to the code.
Step 5: Reviewing the AWS CodePipeline flow
The Proof of Concept (POC) flow begins by creating two Amazon S3 buckets: one for encrypted artifacts and another for hosting the website with versioning enabled. Next, AWS is connected with GitHub to facilitate seamless integration and automation of code deployment. An AWS CodeBuild project specifies GitHub as the source provider and uses a buildspec.yml file to define the build steps. Following this, an AWS CodePipeline is set up with a custom artifacts bucket, connecting to the GitHub repository, enabling webhooks for automated triggering, and configuring build and deploy stages. The pipeline is reviewed and created. Once created, the AWS CodePipeline initiates the process by fetching code from GitHub, triggering AWS CodeBuild to build the code, and uploading resulting artifacts to the artifact bucket. Finally, the artifacts are deployed to the specified destination Amazon S3 bucket, ensuring efficient and consistent website deployment whenever there are code changes.
Conclusion
This blog provides a comprehensive guide on deploying a static website using AWS CodePipeline. The process involves creating Amazon S3 buckets for artifacts and hosting, connecting to GitHub for seamless integration, setting up AWS CodeBuild with a buildspec.yml file, creating the AWS CodePipeline with build and deploy stages, and reviewing the configuration. The AWS CodePipeline automates code fetching, building with AWS CodeBuild, and deploying artifacts to the specified Amazon S3 bucket, leveraging DevOps best practices for secure and rapid application delivery in the cloud.
Drop a query if you have any questions regarding AWS CodePipeline or Amazon S3, and we will get back to you quickly.
About CloudThat
CloudThat is a leading provider of Cloud Training and Consulting services with a global presence in India, the USA, Asia, Europe, and Africa. Specializing in AWS, Microsoft Azure, GCP, VMware, Databricks, and more, the company serves mid-market and enterprise clients, offering comprehensive expertise in Cloud Migration, Data Platforms, DevOps, IoT, AI/ML, and more.
CloudThat is the first Indian Company to win the prestigious Microsoft Partner 2024 Award and is recognized as a top-tier partner with AWS and Microsoft, including the prestigious ‘Think Big’ partner award from AWS and the Microsoft Superstars FY 2023 award in Asia & India. Having trained 850k+ professionals in 600+ cloud certifications and completed 500+ consulting projects globally, CloudThat is an official AWS Advanced Consulting Partner, Microsoft Gold Partner, AWS Training Partner, AWS Migration Partner, AWS Data and Analytics Partner, AWS DevOps Competency Partner, AWS GenAI Competency Partner, Amazon QuickSight Service Delivery Partner, Amazon EKS Service Delivery Partner, AWS Microsoft Workload Partners, Amazon EC2 Service Delivery Partner, Amazon ECS Service Delivery Partner, AWS Glue Service Delivery Partner, Amazon Redshift Service Delivery Partner, AWS Control Tower Service Delivery Partner, AWS WAF Service Delivery Partner, Amazon CloudFront Service Delivery Partner, Amazon OpenSearch Service Delivery Partner, AWS DMS Service Delivery Partner, AWS Systems Manager Service Delivery Partner, Amazon RDS Service Delivery Partner, AWS CloudFormation Service Delivery Partner, AWS Config, Amazon EMR and many more.
FAQs
1. Can we trigger the AWS code pipeline manually?
ANS: – Yes, you can trigger an AWS CodePipeline manually through the web-based console (the “Release change” button), the AWS CLI (for example, `aws codepipeline start-pipeline-execution --name <pipeline-name>`), or the API.
2. Can we encrypt artifacts stored in Amazon S3 buckets?
ANS: – Yes, artifacts stored in Amazon S3 can be encrypted using AWS KMS (Key Management Service) keys or the default Amazon S3-managed encryption.
3. What options can be used as a source for AWS CodeBuild?
ANS: – You can choose from several source providers, such as GitHub, GitHub Enterprise, AWS CodeCommit, Bitbucket, and Amazon S3.
WRITTEN BY Akshay Mishra