AWS Certified Developer Associate (#152)

A company needs to ingest terabytes of data each hour from thousands of sources that are delivered almost continually throughout the day. The volume of messages generated varies over the course of the day. Messages must be delivered in real time for fraud detection and live operational dashboards. Which approach will meet these requirements?

Send the messages to an Amazon SQS queue, then process the messages by using a fleet of Amazon EC2 instances
Use the Amazon S3 API to write messages to an S3 bucket, then process the messages by using Amazon Redshift
Use AWS Data Pipeline to automate the movement and transformation of data
Use Amazon Kinesis Data Streams with the Kinesis Client Library (KCL) to ingest and deliver messages
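A key reason Kinesis Data Streams suits this workload is how it parallelizes ingest: each record carries a partition key, which is MD5-hashed to a 128-bit integer, and each shard owns a contiguous range of that hash space. Records from the same source stay ordered on one shard while thousands of sources spread across all shards. The sketch below simulates that mapping locally; the shard count and the `sensor-*` key names are hypothetical, and real Kinesis shard ranges come from the service, not an even split.

```python
import hashlib

NUM_SHARDS = 4  # hypothetical shard count for illustration


def shard_for_key(partition_key: str, num_shards: int = NUM_SHARDS) -> int:
    """Map a partition key to a shard index the way Kinesis does:
    MD5-hash the key to a 128-bit integer, then find which shard's
    contiguous hash-key range contains it (here, 2**128 split evenly)."""
    hash_value = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    range_size = 2**128 // num_shards
    return min(hash_value // range_size, num_shards - 1)


# Thousands of sources spread across shards; each source always
# hashes to the same shard, so per-source ordering is preserved.
sources = [f"sensor-{i}" for i in range(1000)]
counts = [0] * NUM_SHARDS
for s in sources:
    counts[shard_for_key(s)] += 1
print(counts)  # roughly even spread across the 4 simulated shards
```

Because throughput scales with shard count, a stream can be resharded as the daily message volume rises and falls, while KCL consumers deliver records to the fraud-detection and dashboard applications in near real time.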