Streaming data to tables with Amazon Data Firehose
Amazon Data Firehose is a fully managed service for delivering real-time streaming data to destinations such as tables in S3 table buckets.
Complete these steps to set up Firehose streaming to tables in S3 table buckets:

1. Configure Firehose to deliver data into your S3 tables. To do so, you create an AWS Identity and Access Management (IAM) service role that allows Firehose to access your tables.

2. Grant the Firehose service role explicit permissions to your table or your table's namespace. For more information, see Grant Lake Formation permissions on your table resources.
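If you prefer to script the Lake Formation grant, the following boto3 (AWS SDK for Python) sketch shows one possible form. The role ARN, account ID, table bucket name, namespace, and table name are placeholders; see the linked topic for the authoritative steps and resource format.

# Hypothetical example: grant the Firehose service role access to one table.
# Replace the role ARN, account ID, table bucket, namespace, and table name
# with your own values.
import boto3

lakeformation = boto3.client("lakeformation")

lakeformation.grant_permissions(
    Principal={
        "DataLakePrincipalIdentifier": "arn:aws:iam::111122223333:role/FirehoseS3TablesRole"
    },
    Resource={
        "Table": {
            # S3 table buckets surface in Lake Formation through the
            # s3tablescatalog federated catalog.
            "CatalogId": "111122223333:s3tablescatalog/amzn-s3-demo-table-bucket",
            "DatabaseName": "my_namespace",
            "Name": "my_table",
        }
    },
    Permissions=["ALL"],
)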
Creating a role for Firehose to use S3 tables as a destination
Firehose needs an IAM service role with specific permissions to access AWS Glue tables and write data to S3 tables. You need to provide this IAM role when you create a Firehose stream.
1. Open the IAM console at https://console.aws.amazon.com/iam/.

2. In the left navigation pane, choose Policies.

3. Choose Create policy, and choose JSON in the policy editor.

4. Add the following policy, which grants permissions to all databases and tables in your data catalog. If you want, you can grant permissions only to specific tables and databases. To use this policy, replace the user input placeholders with your own information.
{ "Version": "2012-10-17", "Statement": [ { "Sid": "S3TableAccessViaGlueFederation", "Effect": "Allow", "Action": [ "glue:GetTable", "glue:GetDatabase", "glue:UpdateTable" ], "Resource": [ "arn:aws:glue:
region
:account-id
:catalog/s3tablescatalog/*", "arn:aws:glue:region
:account-id
:catalog/s3tablescatalog", "arn:aws:glue:region
:account-id
:catalog", "arn:aws:glue:region
:account-id
:database/*", "arn:aws:glue:region
:account-id
:table/*/*" ] }, { "Sid": "S3DeliveryErrorBucketPermission", "Effect": "Allow", "Action": [ "s3:AbortMultipartUpload", "s3:GetBucketLocation", "s3:GetObject", "s3:ListBucket", "s3:ListBucketMultipartUploads", "s3:PutObject" ], "Resource": [ "arn:aws:s3:::error delivery bucket
", "arn:aws:s3:::error delivery bucket
/*" ] }, { "Sid": "RequiredWhenUsingKinesisDataStreamsAsSource", "Effect": "Allow", "Action": [ "kinesis:DescribeStream", "kinesis:GetShardIterator", "kinesis:GetRecords", "kinesis:ListShards" ], "Resource": "arn:aws:kinesis:region
:account-id
:stream/stream-name
" }, { "Sid": "RequiredWhenDoingMetadataReadsANDDataAndMetadataWriteViaLakeformation", "Effect": "Allow", "Action": [ "lakeformation:GetDataAccess" ], "Resource": "*" }, { "Sid": "RequiredWhenUsingKMSEncryptionForS3ErrorBucketDelivery", "Effect": "Allow", "Action": [ "kms:Decrypt", "kms:GenerateDataKey" ], "Resource": [ "arn:aws:kms:region
:account-id
:key/KMS-key-id
" ], "Condition": { "StringEquals": { "kms:ViaService": "s3.region
.amazonaws.com" }, "StringLike": { "kms:EncryptionContext:aws:s3:arn": "arn:aws:s3:::error delivery bucket
/prefix*" } } }, { "Sid": "LoggingInCloudWatch", "Effect": "Allow", "Action": [ "logs:PutLogEvents" ], "Resource": [ "arn:aws:logs:region
:account-id
:log-group:log-group-name
:log-stream:log-stream-name
" ] }, { "Sid": "RequiredWhenAttachingLambdaToFirehose", "Effect": "Allow", "Action": [ "lambda:InvokeFunction", "lambda:GetFunctionConfiguration" ], "Resource": [ "arn:aws:lambda:region
:account-id
:function:function-name
:function-version
" ] } ] }This policy has a statements that allow access to Kinesis Data Streams, invoking Lambda functions and access to AWS KMS keys. If you don't use any of these resources, you can remove the respective statements.
If error logging is enabled, Firehose also sends data delivery errors to your CloudWatch log group and log streams. To use this, you must configure log group and log stream names. For information about log group and log stream names, see Monitor Amazon Data Firehose Using CloudWatch Logs.
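If you want to create the log group and log stream ahead of time, the following is a minimal boto3 sketch. The names are placeholders and should match the values you reference in the LoggingInCloudWatch statement of the policy above.

import boto3

logs = boto3.client("logs")

# Placeholder names; align these with the log group and log stream ARN
# in the LoggingInCloudWatch policy statement.
log_group = "/aws/kinesisfirehose/my-stream"
logs.create_log_group(logGroupName=log_group)
logs.create_log_stream(logGroupName=log_group, logStreamName="DestinationDelivery")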
5. After you create the policy, create an IAM role with AWS service as the Trusted entity type.

6. For Service or use case, choose Kinesis. For Use case, choose Kinesis Firehose.

7. Choose Next, and then select the policy you created earlier.

8. Give your role a name. Review your role details, and choose Create role. The role will have the following trust policy.
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "sts:AssumeRole" ], "Principal": { "Service": [ "firehose.amazonaws.com" ] } } ] }
Creating a Firehose stream to S3 tables
The following procedure shows how to create a Firehose stream that delivers data to S3 tables using the console. Before you set up the stream, complete the following prerequisites.
Prerequisites
- Create the role for Firehose to access S3 tables.

- Grant Lake Formation permissions to the Firehose service role you created so that it can access your tables.
To provide routing information to Firehose when you configure a stream, you use your namespace as the database name and the name of a table in that namespace as the table name. You can use these values in the Unique key section of a Firehose stream configuration to route data to a single table. You can also use these values to route to a table using JSON Query expressions. For more information, see Route incoming records to a single Iceberg table.
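To illustrate how the namespace and table name map to stream configuration outside the console, the sketch below calls the CreateDeliveryStream API through boto3 to route all records to a single table. The ARNs and names are placeholders, and the field names reflect the Iceberg destination configuration as we understand it at the time of writing; verify them against the Amazon Data Firehose API Reference.

import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="my-s3tables-stream",  # placeholder
    DeliveryStreamType="DirectPut",
    IcebergDestinationConfiguration={
        # The service role you created for Firehose.
        "RoleARN": "arn:aws:iam::111122223333:role/FirehoseS3TablesRole",
        # The S3 Tables catalog that Firehose writes through.
        "CatalogConfiguration": {
            "CatalogARN": "arn:aws:glue:us-east-1:111122223333:catalog/s3tablescatalog/amzn-s3-demo-table-bucket"
        },
        # Route all records to a single table: the namespace is the
        # database name, and the table lives in that namespace.
        "DestinationTableConfigurationList": [
            {
                "DestinationDatabaseName": "my_namespace",
                "DestinationTableName": "my_table",
            }
        ],
        # Records that fail delivery land in this S3 error bucket.
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::111122223333:role/FirehoseS3TablesRole",
            "BucketARN": "arn:aws:s3:::amzn-s3-demo-error-bucket",
        },
    },
)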
To set up a Firehose stream to S3 tables (Console)
1. Open the Firehose console at https://console.aws.amazon.com/firehose/.

2. Choose Create Firehose stream.

3. For Source, choose one of the following sources:

   - Amazon Kinesis Data Streams
   - Amazon MSK
   - Direct PUT

4. For Destination, choose Apache Iceberg Tables.

5. Enter a Firehose stream name.

6. Configure your Source settings.

7. For Destination settings, choose Current account to stream to tables in your account, or Cross-account to stream to tables in another account.

   For tables in the current account, select your S3 Tables catalog from the Catalog dropdown.

   For tables in another account, enter the Catalog ARN of the catalog you want to stream to.

8. Configure database and table names using Unique key configuration, JSONQuery expressions, or a Lambda function. For more information, refer to Route incoming records to a single Iceberg table and Route incoming records to different Iceberg tables in the Amazon Data Firehose Developer Guide.

9. Under Backup settings, specify an S3 backup bucket.

10. For Existing IAM roles under Advanced settings, select the IAM role you created for Firehose.

11. Choose Create Firehose stream.
For more information about the other settings that you can configure for a stream, see Set up the Firehose stream in the Amazon Data Firehose Developer Guide.
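After the stream becomes active, you can verify delivery by sending a test record with the PutRecord API. A minimal sketch follows; the stream name and payload are placeholders.

import json
import boto3

firehose = boto3.client("firehose")

# The record body is JSON so that Firehose can route it and map its
# fields to the destination table's columns.
firehose.put_record(
    DeliveryStreamName="my-s3tables-stream",  # placeholder
    Record={"Data": json.dumps({"id": 1, "status": "ok"}).encode("utf-8")},
)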