The Amazon Web Services (AWS) service log connector is available in two versions: the legacy connector for CloudTrail management and data logs, and the new version that can ingest logs from the following AWS services by pulling them from an S3 bucket (links are to AWS documentation):
- Amazon Virtual Private Cloud (VPC) - VPC Flow Logs
- Amazon GuardDuty - Findings
- AWS CloudTrail - Management and data events
- AWS CloudWatch - CloudWatch logs
This article explains how to configure the AWS S3 connector using one of two methods:
- Automatic setup (Recommended)
- Manual setup
Prerequisites
- You must have write permission on the Microsoft Sentinel workspace.
- Install the Amazon Web Services solution from the Content Hub in Microsoft Sentinel. For more information, see Discover and manage Microsoft Sentinel out-of-the-box content.
- Install PowerShell and the AWS CLI on your machine (for automatic setup only):
  - Installation instructions for PowerShell
  - Installation instructions for the AWS CLI (from AWS documentation)
- Make sure that the logs from your selected AWS service use the format accepted by Microsoft Sentinel:
  - Amazon VPC: .csv file in GZIP format with headers; delimiter: space.
  - Amazon GuardDuty: JSON Lines format with GZIP compression.
  - AWS CloudTrail: .json file in GZIP format.
  - CloudWatch: .csv file in GZIP format without a header. If you need to convert your logs to this format, you can use the Lambda function described in the Send formatted CloudWatch events to S3 using a Lambda function section later in this article.
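If you're not sure whether the logs already in your S3 bucket match these formats, you can pull one object and inspect it. The following is only an illustrative sketch: it assumes Python with boto3 installed, and the bucket and object key names are placeholders.

```python
import gzip
import boto3

# Placeholder values - replace with your own bucket and object key.
BUCKET = "my-aws-logs-bucket"
KEY = "AWSLogs/123456789012/vpcflowlogs/sample.log.gz"

s3 = boto3.client("s3")
body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()

# The connector expects GZIP-compressed objects; gzip.decompress fails fast otherwise.
text = gzip.decompress(body).decode("utf-8")

# Print the first two lines to check for a header row and the field delimiter
# (for example, space-delimited with headers for VPC Flow Logs, headerless CSV for CloudWatch).
for line in text.splitlines()[:2]:
    print(line)
```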
Automatic setup
To simplify the onboarding process, Microsoft Sentinel provides a PowerShell script that automates the setup of the AWS side of the connector: the required AWS resources, credentials, and permissions.
The script:
- Creates an OIDC web identity provider, to authenticate Microsoft Entra ID users to AWS. If a web identity provider already exists, the script adds Microsoft Sentinel as an audience to the existing provider.
- Creates an IAM assumed role with the minimal necessary permissions, to grant OIDC-authenticated users access to your logs in a given S3 bucket and SQS queue.
- Enables the specified AWS services to send logs to that S3 bucket, and notification messages to that SQS queue.
- If necessary, creates that S3 bucket and that SQS queue for this purpose.
- Configures any necessary IAM permissions policies and applies them to the IAM role created above.
For Azure Government clouds, a specialized script creates a different OIDC web identity provider, to which it assigns the IAM assumed role.
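For reference, the sketch below outlines roughly what the script automates, expressed with boto3. It's a simplified illustration rather than the script itself: the provider URL, audience, thumbprint, role name, and resource names are placeholders that the real script derives from your tenant and workspace.

```python
import json
import boto3

# Placeholder values - the setup script derives the real values for you.
OIDC_URL = "https://sts.windows.net/<your-tenant-id>/"
AUDIENCE = "<sentinel-audience-id>"
ROLE_NAME = "MicrosoftSentinelRole"

iam = boto3.client("iam")

# 1. OIDC web identity provider, so Microsoft Entra ID tokens can authenticate to AWS.
provider = iam.create_open_id_connect_provider(
    Url=OIDC_URL,
    ClientIDList=[AUDIENCE],
    ThumbprintList=["<certificate-thumbprint>"],
)

# 2. IAM role that can be assumed only with a web identity token from that provider.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Federated": provider["OpenIDConnectProviderArn"]},
        "Action": "sts:AssumeRoleWithWebIdentity",
        "Condition": {"StringEquals": {f"{OIDC_URL.removeprefix('https://')}:aud": AUDIENCE}},
    }],
}
iam.create_role(RoleName=ROLE_NAME, AssumeRolePolicyDocument=json.dumps(trust_policy))

# 3. S3 bucket and SQS queue, created only if they don't already exist
#    (outside us-east-1, create_bucket also needs a CreateBucketConfiguration).
boto3.client("s3").create_bucket(Bucket="<your-log-bucket>")
boto3.client("sqs").create_queue(QueueName="<your-notification-queue>")
```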
Instructions
To run the script to set up the connector, use the following steps:
From the Microsoft Sentinel navigation menu, select Data connectors.
Select Amazon Web Services S3 from the data connectors gallery.
If you don't see the connector, install the Amazon Web Services solution from the Content Hub in Microsoft Sentinel.
In the details pane for the connector, select Open connector page.
In the Configuration section, under 1. Set up your AWS environment, expand Setup with PowerShell script (recommended).
Follow the on-screen instructions to download and extract the AWS S3 Setup Script (link downloads a zip file containing the main setup script and helper scripts) from the connector page.
Note
For ingesting AWS logs into an Azure Government cloud, download and extract this specialized AWS S3 Gov Setup Script instead.
Before running the script, run the aws configure command from your PowerShell command line and enter the relevant information as prompted. For details, see AWS Command Line Interface | Configuration basics (from AWS documentation). An optional way to double-check the configured credentials is sketched after these steps.
Now run the script: copy the command from the connector page (under "Run script to set up the environment") and paste it in your command line.
The script prompts you to enter your Workspace ID. This ID appears on the connector page. Copy it and paste it at the prompt of the script.
When the script finishes running, copy the Role ARN and the SQS URL from the script's output and paste them into their respective fields in the connector page, under 2. Add connection.
Select a data type from the Destination table drop-down list. This tells the connector which AWS service's logs this connection is being established to collect, and into which Log Analytics table it stores the ingested data. Then select Add connection.
Note
The script may take up to 30 minutes to finish running.
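If you want to confirm that the credentials you entered with aws configure resolve to the AWS account you intend to modify before running the script, a quick check is sketched below. This is optional and assumes Python with boto3 installed; the script itself only requires the AWS CLI.

```python
import boto3

# Uses the same credentials and default region that `aws configure` wrote
# to your AWS CLI profile (~/.aws/credentials and ~/.aws/config).
session = boto3.Session()
identity = boto3.client("sts").get_caller_identity()

print("Account:", identity["Account"])   # confirm this is the account that holds your log bucket
print("Caller ARN:", identity["Arn"])
print("Default region:", session.region_name)
```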
Manual setup
We recommend using the automatic setup script to deploy this connector. If for whatever reason you don't want to take advantage of this convenience, follow the steps below to set up the connector manually.
Set up your AWS environment as described in Set up your Amazon Web Services environment to collect AWS logs to Microsoft Sentinel.
In the AWS console:
Enter the Identity and Access Management (IAM) service and navigate to the list of Roles. Select the role you created above.
Copy the ARN to your clipboard.
Enter the Simple Queue Service, select the SQS queue you created, and copy the URL of the queue to your clipboard.
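If you prefer not to click through the console, you can read the same two values programmatically. The following is an optional sketch assuming Python with boto3; the role and queue names are placeholders for the resources you created when you set up your AWS environment.

```python
import boto3

# Placeholder names of the resources you created in your AWS environment.
ROLE_NAME = "MicrosoftSentinelRole"
QUEUE_NAME = "sentinel-log-notifications"

# The role ARN goes into the "Role to add" field on the connector page.
role_arn = boto3.client("iam").get_role(RoleName=ROLE_NAME)["Role"]["Arn"]

# The queue URL goes into the "SQS URL" field on the connector page.
queue_url = boto3.client("sqs").get_queue_url(QueueName=QUEUE_NAME)["QueueUrl"]

print(role_arn)
print(queue_url)
```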
In Microsoft Sentinel, select Data connectors from the navigation menu.
Select Amazon Web Services S3 from the data connectors gallery.
If you don't see the connector, install the Amazon Web Services solution from the Content Hub in Microsoft Sentinel. For more information, see Discover and manage Microsoft Sentinel out-of-the-box content.
In the details pane for the connector, select Open connector page.
Under 2. Add connection:
- Paste the IAM role ARN you copied two steps ago into the Role to add field.
- Paste the URL of the SQS queue you copied in the last step into the SQS URL field.
- Select a data type from the Destination table drop-down list. This tells the connector which AWS service's logs this connection is being established to collect, and into which Log Analytics table it stores the ingested data.
- Select Add connection.
Known issues and troubleshooting
Known issues
- Different types of logs can be stored in the same S3 bucket, but shouldn't be stored in the same path.
- Each SQS queue should point to one type of message. If you want to ingest GuardDuty findings and VPC flow logs, set up separate queues for each type.
- A single SQS queue can serve only one path in an S3 bucket. If you're storing logs in multiple paths, each path requires its own dedicated SQS queue.
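In practice, this means each log type gets its own S3 prefix, and the bucket's event notifications map each prefix to its own queue. The sketch below shows one way to express that with boto3; the bucket name, prefixes, and queue ARNs are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# One notification rule per log type: each S3 prefix (path) publishes
# object-created events to its own dedicated SQS queue.
s3.put_bucket_notification_configuration(
    Bucket="my-aws-logs-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:vpc-flow-logs-queue",
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {"Key": {"FilterRules": [
                    {"Name": "prefix", "Value": "AWSLogs/123456789012/vpcflowlogs/"},
                ]}},
            },
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:guardduty-findings-queue",
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {"Key": {"FilterRules": [
                    {"Name": "prefix", "Value": "guardduty/findings/"},
                ]}},
            },
        ]
    },
)
```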
Troubleshooting
Learn how to troubleshoot Amazon Web Services S3 connector issues.
Send formatted CloudWatch events to S3 using a Lambda function (optional)
If your CloudWatch logs aren't in the format accepted by Microsoft Sentinel - a .csv file in GZIP format without a header - use a Lambda function (you can view the source code within AWS) to send CloudWatch events to an S3 bucket in the accepted format.
The Lambda function uses the Python 3.9 runtime and the x86_64 architecture.
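For orientation, a minimal handler along these lines is sketched below. It is not the function linked above, only an illustration of the general shape: it assumes the function is subscribed to a CloudWatch Logs log group, that the target bucket name comes from a hypothetical TARGET_BUCKET environment variable, and that a two-column layout (timestamp, message) is acceptable for your destination table. Refer to the source code in AWS for the actual parameters.

```python
import base64
import csv
import gzip
import io
import json
import os
import time

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # CloudWatch Logs subscriptions deliver a base64-encoded, GZIP-compressed JSON payload.
    payload = json.loads(gzip.decompress(base64.b64decode(event["awslogs"]["data"])))

    # Write the log events as headerless CSV rows (assumed columns: timestamp, message).
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    for log_event in payload["logEvents"]:
        writer.writerow([log_event["timestamp"], log_event["message"]])

    # Re-compress with GZIP and drop the object into the bucket the connector watches.
    key = f"{payload['logGroup'].strip('/')}/{int(time.time() * 1000)}.csv.gz"
    s3.put_object(
        Bucket=os.environ["TARGET_BUCKET"],  # assumed environment variable
        Key=key,
        Body=gzip.compress(buffer.getvalue().encode("utf-8")),
    )
```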
To deploy the lambda function:
In the AWS Management Console, select the Lambda service.
Select Create function.
Type a name for the function and select Python 3.9 as the runtime and x86_64 as the architecture.
Select Create function.
Under Choose a layer, select a layer and select Add.
Select Permissions, and under Execution role, select Role name.
Under Permissions policies, select Add permissions > Attach policies.
Search for the AmazonS3FullAccess and CloudWatchLogsReadOnlyAccess policies and attach them. (A programmatic equivalent of this step is sketched after these instructions.)
Return to the function, select Code, and paste the function code under Code source.
The default values for the parameters are set using environment variables. If necessary, you can manually adjust these values directly in the code.
Select Deploy, and then select Test.
Create an event by filling in the required fields.
Select Test to see how the event appears in the S3 bucket.
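As an alternative to attaching the two managed policies in the console, the same attachment can be done programmatically. The following is a minimal sketch, assuming Python with boto3; the execution role name is a placeholder for the role shown on your function's Permissions tab.

```python
import boto3

iam = boto3.client("iam")

# Placeholder name of the Lambda function's execution role.
EXECUTION_ROLE = "my-cloudwatch-to-s3-lambda-role"

# Attach the two AWS managed policies the function needs:
# read access to CloudWatch Logs and write access to S3.
for policy_arn in [
    "arn:aws:iam::aws:policy/AmazonS3FullAccess",
    "arn:aws:iam::aws:policy/CloudWatchLogsReadOnlyAccess",
]:
    iam.attach_role_policy(RoleName=EXECUTION_ROLE, PolicyArn=policy_arn)
```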
Next steps
In this document, you learned how to connect to AWS resources to ingest their logs into Microsoft Sentinel. To learn more about Microsoft Sentinel, see the following articles:
- Learn how to get visibility into your data, and potential threats.
- Get started detecting threats with Microsoft Sentinel.
- Use workbooks to monitor your data.