AWS DVA-C02 Free Practice Questions — Page 2

Developer Associate • 5 questions • Answers & explanations included

Question 6

A developer is creating an AWS CloudFormation template to deploy Amazon EC2 instances across multiple AWS accounts. The developer must choose the EC2 instances from a list of approved instance types. How can the developer incorporate the list of approved instance types in the CloudFormation template?

A. Create a separate CloudFormation template for each EC2 instance type in the list.
B. In the Resources section of the CloudFormation template, create resources for each EC2 instance type in the list.
C. In the CloudFormation template, create a separate parameter for each EC2 instance type in the list.
D. In the CloudFormation template, create a parameter with the list of EC2 instance types as AllowedValues.

Correct Answer: D. In the CloudFormation template, create a parameter with the list of EC2 instance types as AllowedValues.

CloudFormation parameters support the AllowedValues property, which constrains the parameter values to a predefined list. By creating a parameter for the instance type and specifying AllowedValues with the list of approved instance types, users can only select from approved options. This is the standard and most efficient approach. Creating separate templates or resources for each type (Options A and B) is highly inefficient and hard to maintain. Creating separate parameters for each type (Option C) doesn't enforce selection from a predefined list.
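As a sketch, such a parameter might look like the following; the instance types, AMI ID, and logical resource names are illustrative, not from the question:

```yaml
Parameters:
  InstanceType:
    Type: String
    Default: t3.micro
    Description: EC2 instance type for the application
    AllowedValues:        # only these approved values are accepted
      - t3.micro
      - t3.small
      - m5.large
    ConstraintDescription: Must be one of the approved instance types.

Resources:
  AppInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: !Ref InstanceType
      ImageId: ami-0123456789abcdef0   # placeholder AMI ID
```

If a user supplies a value outside AllowedValues, CloudFormation rejects the stack operation before any resources are created.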

Question 7

A developer has an application that makes batch requests directly to Amazon DynamoDB by using the BatchGetItem low-level API operation. The responses frequently return values in the UnprocessedKeys element. Which actions should the developer take to increase the resiliency of the application when the batch response includes values in UnprocessedKeys? (Choose two.)

A. Retry the batch operation immediately.
B. Retry the batch operation with exponential backoff and randomized delay.
C. Update the application to use an AWS software development kit (AWS SDK) to make the requests.
D. Increase the provisioned read capacity of the DynamoDB tables that the operation accesses.
E. Increase the provisioned write capacity of the DynamoDB tables that the operation accesses.

Correct Answers: B. Retry the batch operation with exponential backoff and randomized delay.; D. Increase the provisioned read capacity of the DynamoDB tables that the operation accesses.

The UnprocessedKeys element in a BatchGetItem response indicates that some items could not be retrieved, typically because provisioned throughput was exceeded. Two effective remedies are: (1) retry with exponential backoff and randomized delay (jitter), which avoids overwhelming the table and allows time for capacity to become available, and (2) increase the provisioned read capacity to handle the request load. Retrying immediately (Option A) can worsen throttling. Using an AWS SDK (Option C) does not by itself solve the underlying capacity issue. Increasing write capacity (Option E) is irrelevant to BatchGetItem, which is a read operation.
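The retry loop can be sketched as follows; the function name and the duck-typed `client` are illustrative (in practice it would be a boto3 DynamoDB client exposing `batch_get_item`):

```python
import random
import time

def batch_get_with_retries(client, request_items, max_retries=5, base_delay=0.1):
    """Call BatchGetItem, retrying UnprocessedKeys with exponential backoff + jitter.

    `client` is assumed to expose a boto3-style batch_get_item(RequestItems=...)
    method returning Responses/UnprocessedKeys; names here are illustrative.
    """
    results = {}
    pending = request_items
    for attempt in range(max_retries + 1):
        response = client.batch_get_item(RequestItems=pending)
        # Accumulate whatever items were successfully read this round.
        for table, items in response.get("Responses", {}).items():
            results.setdefault(table, []).extend(items)
        pending = response.get("UnprocessedKeys", {})
        if not pending:
            return results
        # Full jitter: sleep a random amount up to the exponential cap.
        time.sleep(random.uniform(0, base_delay * (2 ** attempt)))
    raise RuntimeError("UnprocessedKeys remained after all retries")
```

The AWS SDKs implement a similar backoff strategy internally, which is part of why Option C is often recommended in practice even though it does not add capacity.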

Question 8

A company is running a custom application on a set of on-premises Linux servers that are accessed using Amazon API Gateway. AWS X-Ray tracing has been enabled on the API test stage. How can a developer enable X-Ray tracing on the on-premises servers with the LEAST amount of configuration?

A. Install and run the X-Ray SDK on the on-premises servers to capture and relay the data to the X-Ray service.
B. Install and run the X-Ray daemon on the on-premises servers to capture and relay the data to the X-Ray service.
C. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTraceSegments API call.
D. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTelemetryRecords API call.

Correct Answer: B. Install and run the X-Ray daemon on the on-premises servers to capture and relay the data to the X-Ray service.

The X-Ray daemon is a lightweight application that listens for traffic on UDP port 2000, buffers raw segment data, and relays it to the AWS X-Ray API. Installing the daemon on the on-premises servers is the simplest way to enable X-Ray tracing with minimal configuration, because the daemon handles communication with the X-Ray service automatically. The X-Ray SDK (Option A) instruments application code but still relies on the daemon to send data. Options C and D require building custom Lambda functions, which adds significant complexity; PutTelemetryRecords (Option D) in particular is the call the daemon uses to report usage telemetry, not to upload trace segments.
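To illustrate what the daemon listens for: it accepts UDP datagrams consisting of a short JSON header line followed by a segment document. A minimal sketch, with illustrative IDs and timings:

```python
import json
import socket
import time

def send_segment_to_daemon(name, trace_id, segment_id,
                           host="127.0.0.1", port=2000):
    """Emit one raw segment to a local X-Ray daemon over UDP.

    The daemon expects a '{"format": "json", "version": 1}' header line
    followed by the segment JSON. Field values here are illustrative.
    """
    now = time.time()
    segment = {
        "name": name,
        "id": segment_id,
        "trace_id": trace_id,
        "start_time": now - 0.05,
        "end_time": now,
    }
    header = json.dumps({"format": "json", "version": 1})
    datagram = (header + "\n" + json.dumps(segment)).encode("utf-8")
    # UDP is fire-and-forget: the send succeeds even if no daemon is listening.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(datagram, (host, port))
    return datagram
```

The X-Ray SDKs produce datagrams of exactly this shape, which is why they require a daemon (or equivalent collector) to be running alongside the application.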

Question 9

A company wants to share information with a third party. The third party has an HTTP API endpoint that the company can use to share the information. The company has the required API key to access the HTTP API. The company needs a way to manage the API key by using code. The integration of the API key with the application code cannot affect application performance. Which solution will meet these requirements MOST securely?

A. Store the API credentials in AWS Secrets Manager. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
B. Store the API credentials in a local code variable. Push the code to a secure Git repository. Use the local code variable at runtime to make the API call.
C. Store the API credentials as an object in a private Amazon S3 bucket. Restrict access to the S3 object by using IAM policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
D. Store the API credentials in an Amazon DynamoDB table. Restrict access to the table by using resource-based policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.

Correct Answer: A. Store the API credentials in AWS Secrets Manager. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.

AWS Secrets Manager is the most secure solution for storing and managing API credentials. It provides encryption at rest and in transit, fine-grained access control through IAM, automatic rotation capabilities, and audit logging through CloudTrail. Storing credentials in code (Option B) is a security anti-pattern: credentials can be exposed through version control. S3 (Option C) and DynamoDB (Option D) are not designed for secrets management and lack features such as automatic rotation. Retrieving from Secrets Manager adds minimal latency, and the secret can be cached client-side so the lookup does not run on every request, satisfying the performance requirement.
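A minimal sketch of runtime retrieval with in-memory caching; the duck-typed `secrets_client` stands in for a boto3 Secrets Manager client, and the secret name and JSON shape are assumptions for illustration:

```python
import json

_cache = {}  # simple process-level cache so the secret is fetched once

def get_api_key(secrets_client, secret_id):
    """Fetch an API key from Secrets Manager, caching it in memory.

    `secrets_client` is assumed to behave like boto3's Secrets Manager
    client: get_secret_value(SecretId=...) returns a SecretString.
    The {"api_key": ...} layout inside the secret is an assumption.
    """
    if secret_id not in _cache:
        response = secrets_client.get_secret_value(SecretId=secret_id)
        _cache[secret_id] = json.loads(response["SecretString"])["api_key"]
    return _cache[secret_id]
```

Caching keeps the hot path free of network calls; a TTL (or AWS's secrets caching client libraries) would let rotated secrets propagate without restarting the application.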

Question 10

A developer is deploying a new application to Amazon Elastic Container Service (Amazon ECS). The developer needs to securely store and retrieve different types of variables. These variables include authentication information for a remote API, the URL for the API, and credentials. The authentication information and API URL must be available to all current and future deployed versions of the application across development, testing, and production environments. How should the developer retrieve the variables with the FEWEST application changes?

A. Update the application to retrieve the variables from AWS Systems Manager Parameter Store. Use unique paths in Parameter Store for each variable in each environment. Store the credentials in AWS Secrets Manager in each environment.
B. Update the application to retrieve the variables from AWS Key Management Service (AWS KMS). Store the API URL and credentials as unique keys for each environment.
C. Update the application to retrieve the variables from an encrypted file that is stored with the application. Store the API URL and credentials in unique files for each environment.
D. Update the application to retrieve the variables from each of the deployed environments. Define the authentication information and API URL in the ECS task definition as unique names during the deployment process.

Correct Answer: A. Update the application to retrieve the variables from AWS Systems Manager Parameter Store. Use unique paths in Parameter Store for each variable in each environment. Store the credentials in AWS Secrets Manager in each environment.

Using AWS Systems Manager Parameter Store for configuration data (API URL, authentication information) combined with AWS Secrets Manager for sensitive credentials provides a secure, manageable solution. Parameter Store organizes variables with hierarchical paths (e.g., /dev/api/url, /prod/api/url), making them easy to manage across environments, while Secrets Manager adds features such as automatic rotation for the credentials. This approach requires minimal application changes and follows the AWS recommended pattern. KMS (Option B) manages encryption keys; it is not a store for configuration values. Encrypted files shipped with the application (Option C) and task-definition variables (Option D) do not provide centralized, secure storage that future deployments can share.
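The hierarchical-path retrieval can be sketched as follows; the /<env>/api/... path layout and the duck-typed `ssm_client` (standing in for a boto3 SSM client) are illustrative:

```python
def load_config(ssm_client, environment):
    """Load one environment's app config from Parameter Store by path.

    `ssm_client` is assumed to behave like boto3's SSM client, whose
    get_parameters_by_path supports Recursive and WithDecryption flags.
    """
    path = f"/{environment}/api/"
    response = ssm_client.get_parameters_by_path(
        Path=path,
        Recursive=True,        # include nested parameters under the path
        WithDecryption=True,   # decrypt SecureString parameters transparently
    )
    # Strip the environment prefix so the app sees stable key names.
    return {p["Name"].removeprefix(path): p["Value"]
            for p in response["Parameters"]}
```

Because only the path prefix changes between environments, the same application code runs unmodified in development, testing, and production (a real client would also page through `NextToken` for large parameter sets).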

Ready for the Full DVA-C02 Experience?

Access all 112 pages of practice questions, track your progress, and simulate the real exam with timed mode.
