Questions and Answers
Question aCHRlgy876Fl9fQ6qhZ7
Question
A data engineer is building a data orchestration workflow. The data engineer plans to use a hybrid model that includes some on-premises resources and some resources that are in the cloud. The data engineer wants to prioritize portability and open source resources.
Which service should the data engineer use in both the on-premises environment and the cloud-based environment?
Choices
- A: AWS Data Exchange
- B: Amazon Simple Workflow Service (Amazon SWF)
- C: Amazon Managed Workflows for Apache Airflow (Amazon MWAA)
- D: AWS Glue
Answer: C | Answer_ET: C | Community answer: C (100%)
Discussion
Comment 1250090 by andrologin
- Upvotes: 1
Selected Answer: C Amazon MWAA is just Apache Airflow managed by AWS. Using it means you can run the same DAGs in your on-premises Airflow deployment that you use in the cloud.
Comment 1241464 by Ja13
- Upvotes: 2
Selected Answer: C C. Amazon Managed Workflows for Apache Airflow (Amazon MWAA)
Amazon MWAA is a managed service for Apache Airflow, which is an open-source workflow automation tool. Apache Airflow can be used both on-premises and in the cloud, making it ideal for hybrid environments. Using Amazon MWAA allows the data engineer to leverage the managed service in the cloud while maintaining the ability to use the same open-source Airflow setup on-premises, ensuring portability and consistency across environments.
Comment 1235877 by sdas1
- Upvotes: 3
Answer is C
Comment 1230864 by tgv
- Upvotes: 2
Selected Answer: C Airflow can orchestrate workflows that involve both on-premises and cloud resources, making it ideal for hybrid models.
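The portability argument above comes down to the DAG file itself. A minimal, hypothetical DAG sketch is shown below; the same file can be deployed to a self-managed on-premises Airflow instance or uploaded to the DAGs folder of an MWAA environment's S3 bucket without changes (the DAG ID and commands are placeholders):

```python
# Hypothetical DAG: deployable unchanged to self-managed Airflow or to MWAA.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hybrid_etl",          # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    load = BashOperator(task_id="load", bash_command="echo load")
    extract >> load                # run extract, then load
```

Because the workflow definition is plain Python against the open-source Airflow API, nothing in it is tied to AWS, which is what makes the hybrid model work.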
Question WZ5t3IBLuj6gsXfVfRMq
Question
A gaming company uses a NoSQL database to store customer information. The company is planning to migrate to AWS.
The company needs a fully managed AWS solution that will handle a high online transaction processing (OLTP) workload, provide single-digit millisecond performance, and provide high availability around the world.
Which solution will meet these requirements with the LEAST operational overhead?
Choices
- A: Amazon Keyspaces (for Apache Cassandra)
- B: Amazon DocumentDB (with MongoDB compatibility)
- C: Amazon DynamoDB
- D: Amazon Timestream
Answer: C | Answer_ET: C | Community answer: C (100%)
Discussion
Comment 1230629 by JUNGAWS
- Upvotes: 7
Selected Answer: C provide single-digit millisecond performance ⇒ DynamoDB
Comment 1246122 by GustonMari
- Upvotes: 1
Also, both RDS and DynamoDB are OLTP services.
Comment 1233701 by HunkyBunky
- Upvotes: 1
Selected Answer: C No doubt - DynamoDB
Comment 1230865 by tgv
- Upvotes: 2
Selected Answer: C Amazon DynamoDB is the most appropriate choice given the company’s requirements for high performance, global availability, and minimal operational overhead.
Comment 1230628 by JUNGAWS
- Upvotes: 3
C provide single-digit millisecond performance ⇒ DynamoDB
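The "high availability around the world" requirement maps to DynamoDB global tables. A hedged CLI sketch (table name, key schema, and Regions are placeholders, not from the question):

```shell
# Hypothetical: create an on-demand table, then add a replica Region
# via global tables (version 2019.11.21) for worldwide availability.
aws dynamodb create-table \
    --table-name Customers \
    --attribute-definitions AttributeName=CustomerId,AttributeType=S \
    --key-schema AttributeName=CustomerId,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST \
    --region us-east-1

# Replicate the table to a second Region.
aws dynamodb update-table \
    --table-name Customers \
    --replica-updates '{"Create": {"RegionName": "eu-west-1"}}' \
    --region us-east-1
```

With on-demand capacity and managed multi-Region replication, there is no server, cluster, or replication process to operate, which is why DynamoDB has the least operational overhead among the options.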
Question MyrnyO7hkUzuUHG3P3gh
Question
A data engineer creates an AWS Lambda function that an Amazon EventBridge event will invoke. When the data engineer tries to invoke the Lambda function by using an EventBridge event, an AccessDeniedException message appears.
How should the data engineer resolve the exception?
Choices
- A: Ensure that the trust policy of the Lambda function execution role allows EventBridge to assume the execution role.
- B: Ensure that both the IAM role that EventBridge uses and the Lambda function’s resource-based policy have the necessary permissions.
- C: Ensure that the subnet where the Lambda function is deployed is configured to be a private subnet.
- D: Ensure that EventBridge schemas are valid and that the event mapping configuration is correct.
Answer: B | Answer_ET: B | Community answer: B (89%), other: 11%
Discussion
Comment 1232058 by artworkad
- Upvotes: 5
Selected Answer: B The Lambda resource-based policy must allow the events.amazonaws.com service principal to invoke the Lambda function. https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-run-lambda-schedule.html#eb-schedule-create-rule Amazon SQS, Amazon SNS, Lambda, CloudWatch Logs, and EventBridge bus targets do not use roles; permissions for EventBridge must be granted via a resource-based policy.
Comment 1266132 by Shanmahi
- Upvotes: 2
Selected Answer: B Option B
Comment 1236236 by HunkyBunky
- Upvotes: 3
Selected Answer: B Only B - makes sense
Comment 1235560 by rpwags
- Upvotes: 3
Selected Answer: B “B” is correct because the only way to resolve the AccessDeniedException is to make sure both the IAM role that EventBridge uses and the Lambda function’s resource-based policy have the necessary permissions.
Comment 1231208 by GHill1982
- Upvotes: 2
Selected Answer: A The trust policy is what grants an AWS service permission to use the role on behalf of the user. Without this trust relationship, EventBridge won’t have the necessary permissions to invoke the Lambda function.
Comment 1230866 by tgv
- Upvotes: 3
Selected Answer: B IAM Role for EventBridge: EventBridge needs permission to invoke the Lambda function. Lambda Resource-Based Policy: The Lambda function must have a resource-based policy that allows EventBridge to invoke it.
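The resource-based-policy side of the fix described above can be applied with `aws lambda add-permission`. A hedged sketch; the function name, statement ID, account ID, and rule ARN are placeholders:

```shell
# Hypothetical: grant the EventBridge service principal permission to
# invoke the function, scoped to one specific rule via --source-arn.
aws lambda add-permission \
    --function-name my-function \
    --statement-id eventbridge-invoke \
    --action lambda:InvokeFunction \
    --principal events.amazonaws.com \
    --source-arn arn:aws:events:us-east-1:123456789012:rule/my-rule
```

Scoping the statement to the rule's ARN follows least privilege: only that rule, not every EventBridge rule in the account, can invoke the function.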
Question DoIVQnATYreevwwlCl2h
Question
A company uses a data lake that is based on an Amazon S3 bucket. To comply with regulations, the company must apply two layers of server-side encryption to files that are uploaded to the S3 bucket. The company wants to use an AWS Lambda function to apply the necessary encryption.
Which solution will meet these requirements?
Choices
- A: Use both server-side encryption with AWS KMS keys (SSE-KMS) and the Amazon S3 Encryption Client.
- B: Use dual-layer server-side encryption with AWS KMS keys (DSSE-KMS).
- C: Use server-side encryption with customer-provided keys (SSE-C) before files are uploaded.
- D: Use server-side encryption with AWS KMS keys (SSE-KMS).
Answer: B | Answer_ET: B | Community answer: B (89%), other: 11%
Discussion
Comment 1235879 by sdas1
- Upvotes: 5
Answer is B
Comment 1269897 by samadal
- Upvotes: 3
Selected Answer: B The most crucial objective in the problem is “Two layers of server-side encryption must be applied.”
- A: SSE-KMS is single-layer server-side encryption that uses AWS KMS keys to encrypt data. The Amazon S3 Encryption Client performs client-side encryption, not server-side encryption.
- C: SSE-C is server-side encryption that uses customer-provided encryption keys. It does not provide two layers of encryption.
- D: SSE-KMS is single-layer server-side encryption and does not meet the two-layer requirement.
- B: DSSE-KMS (dual-layer server-side encryption) applies two independent layers of encryption to object data, using keys managed by AWS KMS. This provides the two layers of server-side encryption required to meet the compliance requirements.
Comment 1241495 by Ja13
- Upvotes: 2
Selected Answer: B B. Use dual-layer server-side encryption with AWS KMS keys (DSSE-KMS).
Dual-layer server-side encryption with AWS KMS keys (DSSE-KMS) is specifically designed to apply two layers of encryption to meet regulatory compliance requirements. This ensures that each object stored in Amazon S3 is encrypted twice, providing the additional security layer that the company needs.
Comment 1241078 by bakarys
- Upvotes: 1
Selected Answer: A The solution that will meet these requirements is Option A: Use both server-side encryption with AWS KMS keys (SSE-KMS) and the Amazon S3 Encryption Client.
This approach provides two layers of encryption. The first layer is the server-side encryption with AWS KMS keys (SSE-KMS), which encrypts the data at rest. The second layer is the client-side encryption using the Amazon S3 Encryption Client before the data is uploaded to S3. This way, the data is already encrypted when it arrives at S3 and then it gets encrypted again by S3, thus providing two layers of encryption.
The other options are not as suitable:
Option B: There’s no such thing as dual-layer server-side encryption with AWS KMS keys (DSSE-KMS).
Option C: Server-side encryption with customer-provided keys (SSE-C) only provides one layer of encryption.
Option D: Server-side encryption with AWS KMS keys (SSE-KMS) also only provides one layer of encryption.
Comment 1236483 by sdas1
- Upvotes: 1
Using dual-layer server-side encryption with AWS Key Management Service (AWS KMS) keys (DSSE-KMS) applies two layers of encryption to objects when they are uploaded to Amazon S3. DSSE-KMS helps you more easily fulfill compliance standards that require you to apply multilayer encryption to your data and have full control of your encryption keys.
Comment 1236255 by HunkyBunky
- Upvotes: 2
Selected Answer: B I guess the right answer is B. https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingDSSEncryption.html
Comment 1235878 by sdas1
- Upvotes: 1
Answer is D
Comment 1230867 by tgv
- Upvotes: 1
Selected Answer: B https://docs.aws.amazon.com/AmazonS3/latest/userguide/specifying-dsse-encryption.html
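In a Lambda function, requesting DSSE-KMS is a matter of setting the server-side-encryption value on the upload. An equivalent CLI sketch is shown below; the bucket, object key, and KMS key ARN are placeholders:

```shell
# Hypothetical upload with dual-layer server-side encryption (DSSE-KMS).
# aws:kms:dsse is the DSSE-KMS value of the x-amz-server-side-encryption header.
aws s3api put-object \
    --bucket my-data-lake \
    --key raw/file.csv \
    --body file.csv \
    --server-side-encryption aws:kms:dsse \
    --ssekms-key-id arn:aws:kms:us-east-1:123456789012:key/EXAMPLE-KEY-ID
```

The same `ServerSideEncryption="aws:kms:dsse"` parameter can be passed to an SDK `PutObject` call from the Lambda function, so both encryption layers are applied server-side in a single request.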
Question bD1kMAbR96kexYepCQoq
Question
A data engineer notices that Amazon Athena queries are held in a queue before the queries run.
How can the data engineer prevent the queries from queueing?
Choices
- A: Increase the query result limit.
- B: Configure provisioned capacity for an existing workgroup.
- C: Use federated queries.
- D: Add users who run the Athena queries to an existing workgroup.
Answer: B | Answer_ET: B | Community answer: B (100%)
Discussion
Comment 1241496 by Ja13
- Upvotes: 2
Selected Answer: B B. Configure provisioned capacity for an existing workgroup.
Provisioned capacity in Amazon Athena allows you to allocate dedicated query processing capacity to a workgroup. This helps ensure that your queries are run without being held in a queue, providing more consistent and predictable performance.
Comment 1240832 by HunkyBunky
- Upvotes: 1
Selected Answer: B In my opinion - only B - makes sense
Comment 1239059 by Bmaster
- Upvotes: 3
B is good. https://aws.amazon.com/blogs/aws/introducing-athena-provisioned-capacity/
Comment 1230868 by tgv
- Upvotes: 2
Selected Answer: B Provisioning capacity ensures that there are sufficient dedicated resources available to handle the query load, thereby preventing queries from being held in a queue. This approach directly addresses the issue by increasing the available processing capacity for your queries.
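Provisioned capacity is configured through Athena capacity reservations, which are then assigned to workgroups. A hedged CLI sketch; the reservation name, DPU count, and workgroup name are placeholders:

```shell
# Hypothetical: reserve dedicated query-processing capacity (in DPUs)
# and assign it to a workgroup so that workgroup's queries stop queueing.
aws athena create-capacity-reservation \
    --name etl-capacity \
    --target-dpus 24

aws athena put-capacity-assignment-configuration \
    --capacity-reservation-name etl-capacity \
    --capacity-assignments WorkGroupNames=primary
```

Once the assignment is in place, queries from that workgroup run on the reserved DPUs instead of the shared pool, which is what removes the queueing the question describes.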