Questions and Answers

Question l31aqioTkRWQAivtyPGS

Question

A company wants to migrate data from an Amazon RDS for PostgreSQL DB instance in the us-east-1 Region of an AWS account named Account_A. The company will migrate the data to an Amazon Redshift cluster in the eu-west-1 Region of an AWS account named Account_B.

Which solution will give AWS Database Migration Service (AWS DMS) the ability to replicate data between the two data stores?

Choices

  • A: Set up an AWS DMS replication instance in Account_B in eu-west-1.
  • B: Set up an AWS DMS replication instance in Account_B in us-east-1.
  • C: Set up an AWS DMS replication instance in a new AWS account in eu-west-1.
  • D: Set up an AWS DMS replication instance in Account_A in us-east-1.
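
For reference: AWS DMS requires an Amazon Redshift target to be in the same account and Region as the replication instance, which is why the instance belongs in Account_B in eu-west-1. A minimal boto3 sketch of provisioning such an instance, with all identifiers hypothetical:

```python
import boto3

# DMS requires a Redshift target in the same account and Region as the
# replication instance, so the client runs with Account_B credentials
# pinned to eu-west-1. All identifiers below are hypothetical.
dms = boto3.client("dms", region_name="eu-west-1")

response = dms.create_replication_instance(
    ReplicationInstanceIdentifier="rds-to-redshift-migration",
    ReplicationInstanceClass="dms.t3.medium",
    AllocatedStorage=50,
    PubliclyAccessible=False,
)
print(response["ReplicationInstance"]["ReplicationInstanceArn"])
```

The source and target endpoints would then be defined against this instance, with DMS connecting back to the RDS instance in us-east-1 over VPC peering or an equivalent network path.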

Question 7uOuU4R8Vo5senzokcIJ

Question

A company uses Amazon S3 as a data lake. The company sets up a data warehouse by using a multi-node Amazon Redshift cluster. The company organizes the data files in the data lake based on the data source of each data file.

The company loads all the data files into one table in the Redshift cluster by using a separate COPY command for each data file location. This approach takes a long time to load all the data files into the table. The company must increase the speed of the data ingestion. The company does not want to increase the cost of the process.

Which solution will meet these requirements?

Choices

  • A: Use a provisioned Amazon EMR cluster to copy all the data files into one folder. Use a COPY command to load the data into Amazon Redshift.
  • B: Load all the data files in parallel into Amazon Aurora. Run an AWS Glue job to load the data into Amazon Redshift.
  • C: Use an AWS Glue job to copy all the data files into one folder. Use a COPY command to load the data into Amazon Redshift.
  • D: Create a manifest file that contains the data file locations. Use a COPY command to load the data into Amazon Redshift.
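
For reference, the manifest approach in choice D lets a single COPY ingest every file location in one parallel load without any new infrastructure. A minimal sketch, assuming hypothetical bucket, table, and IAM role names:

```python
import json
import boto3

# A manifest lists every data file location once, so one COPY command
# can load all of them in parallel. Bucket, key, table, and role names
# below are hypothetical.
manifest = {
    "entries": [
        {"url": "s3://example-data-lake/source-a/part-0000.csv", "mandatory": True},
        {"url": "s3://example-data-lake/source-b/part-0000.csv", "mandatory": True},
    ]
}

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-data-lake",
    Key="manifests/load.manifest",
    Body=json.dumps(manifest),
)

# Run once against the cluster (for example via the Redshift Data API):
copy_sql = """
COPY sales
FROM 's3://example-data-lake/manifests/load.manifest'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
MANIFEST
CSV;
"""
```

Because COPY distributes the listed files across the cluster's slices, one command replaces the slow per-location loads at no additional cost.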

Question 0sdQYGEysOQtxIPpwl2p

Question

A company plans to use Amazon Kinesis Data Firehose to store data in Amazon S3. The source data consists of 2 MB .csv files. The company must convert the .csv files to JSON format. The company must store the files in Apache Parquet format.

Which solution will meet these requirements with the LEAST development effort?

Choices

  • A: Use Kinesis Data Firehose to convert the .csv files to JSON. Use an AWS Lambda function to store the files in Parquet format.
  • B: Use Kinesis Data Firehose to convert the .csv files to JSON and to store the files in Parquet format.
  • C: Use Kinesis Data Firehose to invoke an AWS Lambda function that transforms the .csv files to JSON and stores the files in Parquet format.
  • D: Use Kinesis Data Firehose to invoke an AWS Lambda function that transforms the .csv files to JSON. Use Kinesis Data Firehose to store the files in Parquet format.
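
For reference: Firehose's built-in record format conversion can write incoming JSON out as Parquet, but it cannot parse CSV input, so a transformation Lambda handles the .csv-to-JSON step in front of it (the pattern in choice D). A minimal sketch of such a Lambda, with hypothetical column names:

```python
import base64
import csv
import io
import json

# Sketch of a Firehose transformation Lambda that turns each incoming
# .csv record into newline-delimited JSON. Firehose's record format
# conversion then writes the JSON output as Parquet. The column names
# below are hypothetical.
COLUMNS = ["id", "timestamp", "value"]

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        text = base64.b64decode(record["data"]).decode("utf-8")
        rows = csv.DictReader(io.StringIO(text), fieldnames=COLUMNS)
        payload = "".join(json.dumps(row) + "\n" for row in rows)
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(payload.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}
```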

Question JyFfEYxpglpOUutU8ppS

Question

A company is using an AWS Transfer Family server to migrate data from an on-premises environment to AWS. Company policy mandates the use of TLS 1.2 or above to encrypt the data in transit.

Which solution will meet these requirements?

Choices

  • A: Generate new SSH keys for the Transfer Family server. Make the old keys and the new keys available for use.
  • B: Update the security group rules for the on-premises network to allow only connections that use TLS 1.2 or above.
  • C: Update the security policy of the Transfer Family server to specify a minimum protocol version of TLS 1.2.
  • D: Install an SSL certificate on the Transfer Family server to encrypt data transfers by using TLS 1.2.
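
For reference: the minimum TLS version a Transfer Family server accepts is controlled by its attached security policy, which can be swapped with a single API call. A minimal boto3 sketch, assuming a hypothetical server ID (confirm the exact policy name against the current list in the Transfer Family documentation):

```python
import boto3

transfer = boto3.client("transfer")

# Hypothetical server ID. The security policy name should be one whose
# minimum TLS version is 1.2, per the published Transfer Family policy list.
transfer.update_server(
    ServerId="s-1234567890abcdef0",
    SecurityPolicyName="TransferSecurityPolicy-2020-06",
)
```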

Question emo1vlLPmP006nGuf541

Question

A company wants to migrate an application and an on-premises Apache Kafka server to AWS. The application processes incremental updates that an on-premises Oracle database sends to the Kafka server. The company wants to use the replatform migration strategy instead of the refactor strategy.

Which solution will meet these requirements with the LEAST management overhead?

Choices

  • A: Amazon Kinesis Data Streams
  • B: Amazon Managed Streaming for Apache Kafka (Amazon MSK) provisioned cluster
  • C: Amazon Kinesis Data Firehose
  • D: Amazon Managed Streaming for Apache Kafka (Amazon MSK) Serverless
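
For reference: Amazon MSK keeps the native Kafka API, so existing producers and consumers move over with little more than a bootstrap-address change (a replatform rather than a refactor), and the serverless option in choice D removes broker sizing, scaling, and patching. A minimal boto3 sketch, with hypothetical network identifiers:

```python
import boto3

kafka = boto3.client("kafka")

# Subnet and security group IDs below are hypothetical.
response = kafka.create_cluster_v2(
    ClusterName="migrated-kafka",
    Serverless={
        "VpcConfigs": [
            {
                "SubnetIds": [
                    "subnet-0123456789abcdef0",
                    "subnet-0fedcba9876543210",
                ],
                "SecurityGroupIds": ["sg-0123456789abcdef0"],
            }
        ]
    },
)
print(response["ClusterArn"])
```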