Questions and Answers

Question 2xBHrM8mcQSpoeX2vlOi

Question

A company has a data warehouse in Amazon Redshift. To comply with security regulations, the company needs to log and store all user activities and connection activities for the data warehouse.

Which solution will meet these requirements?

Choices

  • A: Create an Amazon S3 bucket. Enable logging for the Amazon Redshift cluster. Specify the S3 bucket in the logging configuration to store the logs.
  • B: Create an Amazon Elastic File System (Amazon EFS) file system. Enable logging for the Amazon Redshift cluster. Write logs to the EFS file system.
  • C: Create an Amazon Aurora MySQL database. Enable logging for the Amazon Redshift cluster. Write the logs to a table in the Aurora MySQL database.
  • D: Create an Amazon Elastic Block Store (Amazon EBS) volume. Enable logging for the Amazon Redshift cluster. Write the logs to the EBS volume.
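The scenario hinges on Amazon Redshift's audit-logging feature, which writes connection and user-activity logs to an S3 bucket. A minimal sketch of the configuration follows; the cluster identifier, bucket name, and key prefix are placeholders, not values from the question, and the boto3 call is shown only in a comment so the snippet runs without AWS credentials:

```python
import json

# Parameters for Redshift audit logging to S3.
# All names below are hypothetical placeholders.
logging_config = {
    "ClusterIdentifier": "analytics-cluster",     # placeholder cluster name
    "BucketName": "redshift-audit-logs-example",  # placeholder S3 bucket
    "S3KeyPrefix": "audit/",                      # folder for log objects
}

# With boto3, this configuration would be applied as:
#   boto3.client("redshift").enable_logging(**logging_config)
print(json.dumps(logging_config, indent=2))
```

Note that the target bucket must grant the Redshift service permission to write objects before logging is enabled.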

Question YkZVeKNgXgYbakqx5Ogr

Question

A company wants to migrate a data warehouse from Teradata to Amazon Redshift.

Which solution will meet this requirement with the LEAST operational effort?

Choices

  • A: Use AWS Database Migration Service (AWS DMS) Schema Conversion to migrate the schema. Use AWS DMS to migrate the data.
  • B: Use the AWS Schema Conversion Tool (AWS SCT) to migrate the schema. Use AWS Database Migration Service (AWS DMS) to migrate the data.
  • C: Use AWS Database Migration Service (AWS DMS) to migrate the data. Use automatic schema conversion.
  • D: Manually export the schema definition from Teradata. Apply the schema to the Amazon Redshift database. Use AWS Database Migration Service (AWS DMS) to migrate the data.
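After a schema has been converted and applied to Redshift (for example with AWS SCT), the data move itself is driven by a DMS replication task. The sketch below, with placeholder ARNs and an assumed pre-created replication instance and endpoints, shows the shape of such a task definition; the boto3 call is commented out so the block runs standalone:

```python
import json

# Table-mapping rule that selects every table in every schema.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-all",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "include",
    }]
}

# Task definition; all ARNs are hypothetical placeholders.
task = {
    "ReplicationTaskIdentifier": "teradata-to-redshift",
    "SourceEndpointArn": "arn:aws:dms:region:acct:endpoint/source",  # placeholder
    "TargetEndpointArn": "arn:aws:dms:region:acct:endpoint/target",  # placeholder
    "ReplicationInstanceArn": "arn:aws:dms:region:acct:rep/instance",  # placeholder
    "MigrationType": "full-load",
    "TableMappings": json.dumps(table_mappings),
}

# With boto3:
#   boto3.client("dms").create_replication_task(**task)
```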

Question pAnjHJOVQ4tQX43oiNhS

Question

A company uses a variety of AWS and third-party data stores. The company wants to consolidate all the data into a central data warehouse to perform analytics. Users need fast response times for analytics queries.

The company uses Amazon QuickSight in direct query mode to visualize the data. Users typically run queries for only a few hours each day, with unpredictable spikes in query volume.

Which solution will meet these requirements with the LEAST operational overhead?

Choices

  • A: Use Amazon Redshift Serverless to load all the data into Amazon Redshift managed storage (RMS).
  • B: Use Amazon Athena to load all the data into Amazon S3 in Apache Parquet format.
  • C: Use Amazon Redshift provisioned clusters to load all the data into Amazon Redshift managed storage (RMS).
  • D: Use Amazon Aurora PostgreSQL to load all the data into Aurora.
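For context on the serverless option: Amazon Redshift Serverless is provisioned as a namespace (databases and storage) plus a workgroup (compute that scales automatically with query load). A minimal sketch follows; all names and the base capacity are hypothetical, and the boto3 calls are shown only in comments so the block runs without AWS access:

```python
# Placeholder namespace (storage/databases) and workgroup (compute).
namespace = {
    "namespaceName": "analytics-ns",  # placeholder name
    "dbName": "dev",                  # initial database
}
workgroup = {
    "workgroupName": "analytics-wg",  # placeholder name
    "namespaceName": "analytics-ns",  # must match the namespace above
    "baseCapacity": 32,               # base RPUs; compute scales with demand
}

# With boto3:
#   rs = boto3.client("redshift-serverless")
#   rs.create_namespace(**namespace)
#   rs.create_workgroup(**workgroup)
```

Because compute scales up and down with demand, there is no cluster to resize for the daily usage spikes described in the question.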

Question habnnfQqkZHi1CGZn1D2

Question

A data engineer uses Amazon Kinesis Data Streams to ingest and process daily records that contain user behavior data from an application.

The data engineer notices that the data stream is experiencing throttling because hot shards receive much more data than other shards in the data stream.

How should the data engineer resolve the throttling issue?

Choices

  • A: Use a random partition key to distribute the ingested records.
  • B: Increase the number of shards in the data stream. Distribute the records across the shards.
  • C: Limit the number of records that are sent each second by the producer to match the capacity of the stream.
  • D: Decrease the size of the records that the producer sends to match the capacity of the stream.
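For reference, Kinesis resharding is performed with the UpdateShardCount API, which redistributes the hash-key space across the new shard count. A sketch with placeholder values (the stream name and shard counts are assumptions, not from the question); the boto3 call is commented out so the block runs standalone:

```python
# Request to scale a stream from its current shard count to a larger one.
# Stream name and counts are hypothetical placeholders.
reshard_request = {
    "StreamName": "user-behavior-stream",  # placeholder stream name
    "TargetShardCount": 8,                 # e.g. doubling from 4 shards
    "ScalingType": "UNIFORM_SCALING",      # evenly split the hash-key range
}

# With boto3:
#   boto3.client("kinesis").update_shard_count(**reshard_request)
```

Resharding alone does not help if a low-cardinality partition key keeps routing most records to one shard, so the producer's partition key should also spread records evenly.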

Question 9jePS18rrXAiyxlA4QxR

Question

A company has a data processing pipeline that includes several dozen steps. The data processing pipeline needs to send alerts in real time when a step fails or succeeds. The data processing pipeline uses a combination of Amazon S3 buckets, AWS Lambda functions, and AWS Step Functions state machines.

A data engineer needs to create a solution to monitor the entire pipeline.

Which solution will meet these requirements?

Choices

  • A: Configure the Step Functions state machines to store notifications in an Amazon S3 bucket when the state machines finish running. Enable S3 event notifications on the S3 bucket.
  • B: Configure the AWS Lambda functions to store notifications in an Amazon S3 bucket when the state machines finish running. Enable S3 event notifications on the S3 bucket.
  • C: Use AWS CloudTrail to send a message to an Amazon Simple Notification Service (Amazon SNS) topic that sends notifications when a state machine run fails or succeeds.
  • D: Configure an Amazon EventBridge rule to react when the execution status of a state machine changes. Configure the rule to send a message to an Amazon Simple Notification Service (Amazon SNS) topic that sends notifications.
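For background on the EventBridge approach: Step Functions emits "Step Functions Execution Status Change" events to EventBridge, which a rule can match and forward to an SNS topic. A sketch of the event pattern follows; the rule name and SNS ARN are placeholders, and the boto3 calls are shown only in comments so the block runs without AWS access:

```python
import json

# Event pattern matching Step Functions execution status changes.
event_pattern = {
    "source": ["aws.states"],
    "detail-type": ["Step Functions Execution Status Change"],
    "detail": {"status": ["SUCCEEDED", "FAILED", "TIMED_OUT", "ABORTED"]},
}

# With boto3 (rule name and topic ARN are hypothetical):
#   events = boto3.client("events")
#   events.put_rule(Name="pipeline-status",
#                   EventPattern=json.dumps(event_pattern))
#   events.put_targets(Rule="pipeline-status",
#                      Targets=[{"Id": "sns",
#                                "Arn": "arn:aws:sns:region:acct:pipeline-alerts"}])
print(json.dumps(event_pattern))
```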