Automated Export of AWS Security Hub Findings to a Central Log Archive Using Terraform

Overview

In modern cloud security operations, it is important to archive and analyze AWS Security Hub findings for compliance, auditing, and threat hunting.
This blog explains how to automate the export of AWS Security Hub findings to an Amazon S3 bucket using serverless components (AWS Lambda, AWS Step Functions, Amazon EventBridge), fully deployed via Terraform.

Introduction

AWS Security Hub provides a comprehensive view of your security alerts and compliance status across AWS accounts. However, AWS Security Hub retains findings for only 90 days by default. Organizations need a scalable, automated export mechanism to ensure long-term storage, external analysis, or integration with SIEM tools.

In this solution, we use:

  • AWS Lambda (to fetch and upload findings),
  • AWS Step Functions (to orchestrate the workflow),
  • Amazon EventBridge Scheduler (to automate periodic execution), and
  • an Amazon S3 bucket in the destination account (to store the findings).

Why We Are Doing This

  • Retention: AWS Security Hub findings are kept for only 90 days; after that, you lose critical security data unless you export it.
  • Compliance and Audit: Organizations often need to store security records for 1–7 years for regulatory compliance (e.g., PCI-DSS, HIPAA, GDPR).
  • Cost Optimization: Amazon S3 offers inexpensive long-term storage compared to database or SIEM systems.
  • Custom Analysis: Exported findings can be queried using Amazon Athena or processed by external SIEMs (Splunk, QRadar, etc.).

Prerequisites

Before deploying this solution, you must have:

  • Terraform installed locally (version >= 1.0.0 recommended)
  • AWS CLI configured with appropriate access
  • An Amazon S3 bucket (destination for findings export)
  • AWS KMS Key (if you want to encrypt Amazon S3 bucket objects — recommended)
  • AWS Security Hub enabled in the target AWS Region
  • (Optional) Cross-account permissions if the Amazon S3 bucket is in another AWS account

Solution Overview

Architecture

Amazon EventBridge Scheduler -> AWS Step Functions -> AWS Lambda -> Amazon S3 bucket (destination account)

  • Amazon EventBridge Scheduler triggers the AWS Step Functions state machine every 24 hours (or at your chosen frequency).
  • The state machine invokes the AWS Lambda function to fetch findings from AWS Security Hub and store them in an Amazon S3 bucket.
  • Findings are stored in a structured CSV format, ready for further analysis.

Terraform Checklist

  1. Understand the Use Case
  • Define the objective (e.g., export findings from AWS Security Hub to a cross-account Amazon S3 bucket).
  • Identify services involved:
    • AWS Security Hub
    • AWS Lambda
    • Amazon S3 bucket (Destination)
    • AWS KMS (Destination)
    • AWS Step Functions
    • Amazon EventBridge Scheduler
    • AWS IAM
    • SSM Parameter Store
  2. Design the Architecture
  • Map out the workflow:
    • AWS Lambda -> Amazon S3 bucket + AWS KMS
    • AWS Step Functions for pagination
    • Scheduler for automation
  • Clarify cross-account boundaries:
    • Source: AWS Lambda, AWS Step Functions, Scheduler
    • Destination: Amazon S3 bucket, AWS KMS key
  3. Define AWS IAM Requirements
  • AWS Lambda Role:
    • securityhub:GetFindings
    • s3:PutObject, s3:GetObject
    • kms:Decrypt, kms:GenerateDataKey
    • ssm:GetParameter, ssm:PutParameter, ssm:DeleteParameter
  • AWS Step Functions Role:
    • lambda:InvokeFunction
  • Scheduler Role:
    • states:StartExecution
  • Cross-account permissions:
    • Destination Amazon S3 bucket policy
    • AWS KMS key policy
  4. Write Terraform Code
  • Providers
  • Source account
  • AWS IAM Roles and Policies
    • AWS Lambda execution role
    • AWS Step Function execution role
    • Scheduler trigger role
  • AWS Lambda Function
    • ZIP upload
    • Handler and runtime
    • Environment variables
  • AWS Step Function State Machine
    • Task → Choice → Loop → Success flow
  • Scheduler Schedule
    • Expression (e.g., rate(1 hour) or rate(2 hours), as required)
    • Target: Step Function
  5. Test and Validate
  • Manually invoke AWS Lambda to check output
  • Execute AWS Step Function and trace pagination
  • Verify the Scheduler triggers
  • Confirm the Amazon S3 bucket’s object is written and encrypted
  6. Optimize and Organize Code
  • Use variables and locals for flexibility
  • Organize code into:
    • main.tf
    • variables.tf
    • outputs.tf
  • Use modules for AWS IAM or AWS Lambda (optional)
  • Add tags and logging for monitoring

Infrastructure Setup

Automate with Terraform scripts for the audit account (source) and the log archive account (destination).

Step 1: Terraform Configuration for Log Archive Account (Destination):

  • Create the necessary Terraform scripts for the AWS KMS key and the AWS IAM roles and policies for cross-account access.
  • Write the Terraform code and policy for the AWS KMS key.

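The KMS piece could look like the following sketch. The variable names (var.destination_account_id, var.source_account_id) and the Lambda role name (securityhub-export-lambda-role) are placeholder assumptions, not from the original code; substitute your own.

```hcl
# Sketch only: account-ID variables and the Lambda role name are placeholders.
resource "aws_kms_key" "findings" {
  description             = "Encrypts Security Hub findings in the log archive bucket"
  deletion_window_in_days = 30
  enable_key_rotation     = true

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Sid       = "EnableRootPermissions"
        Effect    = "Allow"
        Principal = { AWS = "arn:aws:iam::${var.destination_account_id}:root" }
        Action    = "kms:*"
        Resource  = "*"
      },
      {
        # Cross-account grant so the source account's Lambda can encrypt objects.
        Sid       = "AllowSourceLambdaRole"
        Effect    = "Allow"
        Principal = { AWS = "arn:aws:iam::${var.source_account_id}:role/securityhub-export-lambda-role" }
        Action    = ["kms:Decrypt", "kms:GenerateDataKey"]
        Resource  = "*"
      }
    ]
  })
}

resource "aws_kms_alias" "findings" {
  name          = "alias/securityhub-findings"
  target_key_id = aws_kms_key.findings.key_id
}
```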

  • Write the Terraform configuration for the log archive account, creating an Amazon S3 bucket and bucket policy to store the findings:

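A sketch of the bucket, its default encryption, and the cross-account bucket policy. The variables (var.findings_bucket_name, var.findings_kms_key_arn, var.source_account_id) and the role name are assumptions for illustration:

```hcl
# Sketch only: bucket, key, and role names are placeholders.
resource "aws_s3_bucket" "findings" {
  bucket = var.findings_bucket_name # e.g. "org-securityhub-findings-archive"
}

# Default-encrypt objects with the destination account's KMS key.
resource "aws_s3_bucket_server_side_encryption_configuration" "findings" {
  bucket = aws_s3_bucket.findings.id
  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm     = "aws:kms"
      kms_master_key_id = var.findings_kms_key_arn
    }
  }
}

# Cross-account bucket policy allowing the source account's Lambda role to write.
resource "aws_s3_bucket_policy" "findings" {
  bucket = aws_s3_bucket.findings.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Sid       = "AllowSourceLambdaWrite"
      Effect    = "Allow"
      Principal = { AWS = "arn:aws:iam::${var.source_account_id}:role/securityhub-export-lambda-role" }
      Action    = ["s3:PutObject", "s3:GetObject"]
      Resource  = "${aws_s3_bucket.findings.arn}/*"
    }]
  })
}
```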

Step 2: Create the AWS IAM policy and cross-account role that allow AWS Lambda to access AWS Security Hub and the destination account's Amazon S3 bucket, and write the Python code for the AWS Lambda function.

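The permissions listed in the checklist above might translate into Terraform roughly as follows. The role name, bucket variable, KMS key variable, and the /securityhub-export/ parameter path are hypothetical:

```hcl
# Sketch only: role, bucket, key, and parameter-path names are placeholders.
resource "aws_iam_role" "lambda_exec" {
  name = "securityhub-export-lambda-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

resource "aws_iam_role_policy" "lambda_exec" {
  name = "securityhub-export-permissions"
  role = aws_iam_role.lambda_exec.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      { Effect = "Allow", Action = ["securityhub:GetFindings"], Resource = "*" },
      { Effect = "Allow", Action = ["s3:PutObject", "s3:GetObject"],
        Resource = "arn:aws:s3:::${var.findings_bucket_name}/*" },
      { Effect = "Allow", Action = ["kms:Decrypt", "kms:GenerateDataKey"],
        Resource = var.findings_kms_key_arn },
      { Effect = "Allow",
        Action   = ["ssm:GetParameter", "ssm:PutParameter", "ssm:DeleteParameter"],
        Resource = "arn:aws:ssm:*:*:parameter/securityhub-export/*" },
      { Effect = "Allow",
        Action   = ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
        Resource = "*" }
    ]
  })
}
```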

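A minimal sketch of what the Lambda code could look like. The DESTINATION_BUCKET environment variable, the CSV column subset, and the security-hub/ key prefix are assumptions; the function emits NextToken in its result only while more pages remain, so the state machine can loop:

```python
import csv
import io
import os
from datetime import datetime, timezone


def findings_to_csv(findings):
    """Flatten Security Hub finding dicts into CSV text (subset of fields)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Id", "Title", "Severity", "ResourceId", "CreatedAt"])
    for f in findings:
        writer.writerow([
            f.get("Id", ""),
            f.get("Title", ""),
            f.get("Severity", {}).get("Label", ""),
            (f.get("Resources") or [{}])[0].get("Id", ""),
            f.get("CreatedAt", ""),
        ])
    return buf.getvalue()


def lambda_handler(event, context):
    # boto3 ships with the Lambda runtime; imported here so the CSV
    # helper above can be reused and tested without AWS dependencies.
    import boto3

    securityhub = boto3.client("securityhub")
    s3 = boto3.client("s3")

    # Resume pagination when Step Functions passes NextToken back in.
    kwargs = {"MaxResults": 100}
    if event.get("NextToken"):
        kwargs["NextToken"] = event["NextToken"]

    page = securityhub.get_findings(**kwargs)
    key = "security-hub/{}.csv".format(
        datetime.now(timezone.utc).strftime("%Y-%m-%dT%H%M%S%f"))
    s3.put_object(
        Bucket=os.environ["DESTINATION_BUCKET"],
        Key=key,
        Body=findings_to_csv(page.get("Findings", [])).encode("utf-8"),
    )

    # Omit NextToken when done so the Choice state's IsPresent check exits.
    result = {"ObjectKey": key}
    if page.get("NextToken"):
        result["NextToken"] = page["NextToken"]
    return result
```

Zip this file as lambda_function.zip for the Terraform deployment.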

Step 3: Write the Terraform block for the AWS Step Functions state machine and its execution role.


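A sketch of the state machine implementing the Task → Choice → Loop → Success flow from the checklist. The aws_lambda_function.export and aws_iam_role.sfn_exec references are hypothetical resource names, and the definition assumes the Lambda returns a NextToken field only while more pages remain:

```hcl
# Sketch only: the Lambda and role references are hypothetical resource names.
resource "aws_sfn_state_machine" "export" {
  name     = "securityhub-findings-export"
  role_arn = aws_iam_role.sfn_exec.arn # needs lambda:InvokeFunction on the function

  definition = jsonencode({
    Comment = "Export Security Hub findings page by page"
    StartAt = "ExportFindings"
    States = {
      ExportFindings = {
        Type     = "Task"
        Resource = aws_lambda_function.export.arn
        Next     = "MorePages"
      }
      MorePages = {
        # Loop back while the Lambda's output still carries a NextToken.
        Type = "Choice"
        Choices = [{
          Variable  = "$.NextToken"
          IsPresent = true
          Next      = "ExportFindings"
        }]
        Default = "Done"
      }
      Done = { Type = "Succeed" }
    }
  })
}
```

Because each Task's output becomes the next iteration's input, the Lambda receives the previous page's NextToken automatically on every loop.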

Step 4: Write the Terraform code for the Amazon EventBridge Scheduler schedule and the role that triggers the AWS Step Functions state machine.

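The schedule could be sketched as follows. The aws_sfn_state_machine.export and aws_iam_role.scheduler references are hypothetical names for the state machine and the trigger role:

```hcl
# Sketch only: the state machine and role references are hypothetical names.
resource "aws_scheduler_schedule" "daily_export" {
  name                = "securityhub-export-daily"
  schedule_expression = "rate(24 hours)" # adjust the frequency as needed

  flexible_time_window {
    mode = "OFF"
  }

  target {
    arn      = aws_sfn_state_machine.export.arn
    role_arn = aws_iam_role.scheduler.arn # needs states:StartExecution
  }
}
```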

Step 5: Testing and Verification

  1. Trigger the Amazon EventBridge schedule, which starts the AWS Step Functions state machine; the state machine then invokes the AWS Lambda function.


2. Verify the findings in the log archive Amazon S3 bucket:

  • Go to the Amazon S3 bucket in the log archive account and ensure the AWS Security Hub findings are stored correctly.


Notes

  • AWS Lambda Payload: lambda_function.zip must contain your Python script; keep it in the same folder as the Terraform code or provide the path to it.
  • State Machine: Retries automatically when nextToken is present (for pagination).
  • Amazon EventBridge Scheduler: Runs once every 24 hours to invoke the AWS Step Functions state machine; change the frequency according to your requirement.
  • Cross-account Amazon S3 bucket: A proper bucket policy and AWS KMS key access must be configured in the destination account.

Key Benefits

  1. Serverless: No infrastructure to manage.
  2. Scalable: Handles large volumes of findings.
  3. Secure: AWS KMS encryption and fine-grained AWS IAM.
  4. Automated: Fully scheduled export with error handling.
  5. Flexible: Supports cross-region, filtering, and future customization.

Conclusion

This Terraform-powered, serverless solution provides a scalable way to export AWS Security Hub findings to Amazon S3. By leveraging AWS Lambda, AWS Step Functions, Amazon EventBridge, and AWS KMS, organizations can retain long-term security data for compliance, audit readiness, and analytics. Furthermore, it empowers teams with automation, reduces manual effort, and allows seamless integration with SIEM tools and data lakes.

This approach ensures that security posture management remains proactive, auditable, and cost-effective, all while embracing Infrastructure as Code (IaC) best practices.

Drop a query if you have any questions regarding AWS Security Hub and we will get back to you quickly.


About CloudThat

CloudThat is a leading provider of Cloud Training and Consulting services with a global presence in India, the USA, Asia, Europe, and Africa. Specializing in AWS, Microsoft Azure, GCP, VMware, Databricks, and more, the company serves mid-market and enterprise clients, offering comprehensive expertise in Cloud Migration, Data Platforms, DevOps, IoT, AI/ML, and more.

CloudThat is the first Indian Company to win the prestigious Microsoft Partner 2024 Award and is recognized as a top-tier partner with AWS and Microsoft, including the prestigious ‘Think Big’ partner award from AWS and the Microsoft Superstars FY 2023 award in Asia & India. Having trained 650k+ professionals in 500+ cloud certifications and completed 300+ consulting projects globally, CloudThat is an official AWS Advanced Consulting Partner, Microsoft Gold Partner, AWS Training Partner, AWS Migration Partner, AWS Data and Analytics Partner, AWS DevOps Competency Partner, AWS GenAI Competency Partner, Amazon QuickSight Service Delivery Partner, Amazon EKS Service Delivery Partner, AWS Microsoft Workload Partner, Amazon EC2 Service Delivery Partner, Amazon ECS Service Delivery Partner, AWS Glue Service Delivery Partner, Amazon Redshift Service Delivery Partner, AWS Control Tower Service Delivery Partner, AWS WAF Service Delivery Partner, Amazon CloudFront Service Delivery Partner, Amazon OpenSearch Service Delivery Partner, AWS DMS Service Delivery Partner, AWS Systems Manager Service Delivery Partner, Amazon RDS Service Delivery Partner, AWS CloudFormation Service Delivery Partner, and many more.

FAQs

1. Can I modify the export frequency from 24 hours to something else?

ANS: – Yes, you can change the Amazon EventBridge schedule expression to any valid cron or rate expression (e.g., rate(1 hour) or cron(0 12 * * ? *)), depending on your operational needs.
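As a sketch, assuming the trigger is managed by a Terraform schedule resource, only one attribute needs to change:

```hcl
# Config fragment (attribute name per the Terraform AWS provider):
schedule_expression = "rate(1 hour)"
# schedule_expression = "cron(0 12 * * ? *)" # daily at 12:00 UTC
```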

2. What if the size of the findings exceeds AWS Lambda's response limit?

ANS: – The solution uses AWS Step Functions to handle pagination via the NextToken mechanism. Findings are split and written in chunks to Amazon S3. You can also enhance it by saving intermediate states in AWS Systems Manager Parameter Store to resume across retries or executions.

WRITTEN BY Pradeep Naik
