How Many Questions Are in the SAA-C03 Dumps?
100% correct SAA-C03 real exam materials and simulations for the Amazon Web Services certification, with real success guaranteed through updated SAA-C03 PDF and VCE dump materials. Pass the AWS Certified Solutions Architect - Associate (SAA-C03) exam today!
Free SAA-C03 Demo Online for the Amazon Web Services Certification:
NEW QUESTION 1
A company runs a photo processing application that needs to frequently upload and download pictures from Amazon S3 buckets that are located in the same AWS Region. A solutions architect has noticed an increase in data transfer fees and needs to implement a solution to reduce these costs.
How can the solutions architect meet this requirement?
- A. Deploy Amazon API Gateway into a public subnet and adjust the route table to route S3 calls through it
- B. Deploy a NAT gateway into a public subnet and attach an endpoint policy that allows access to the S3 buckets
- C. Deploy the application into a public subnet and allow it to route through an internet gateway to access the S3 buckets
- D. Deploy an S3 VPC gateway endpoint into the VPC and attach an endpoint policy that allows access to the S3 buckets
Answer: D
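To make answer D concrete, the sketch below shows the general shape of a policy that could be attached to an S3 gateway endpoint. The bucket name is a placeholder, not from the question; a real policy would reference the company's actual bucket ARNs and likely tighter actions.

```python
import json

# Illustrative gateway-endpoint policy (answer D). "example-photo-bucket"
# is a hypothetical name standing in for the company's buckets.
endpoint_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowS3AccessThroughEndpoint",
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": [
                "arn:aws:s3:::example-photo-bucket",
                "arn:aws:s3:::example-photo-bucket/*",
            ],
        }
    ],
}

print(json.dumps(endpoint_policy, indent=2))
```

Because the gateway endpoint keeps same-Region S3 traffic on the AWS network, the NAT gateway data-processing charges driving the increased cost disappear.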
NEW QUESTION 2
A solutions architect is creating a new Amazon CloudFront distribution for an application. Some of the information submitted by users is sensitive. The application uses HTTPS but needs another layer of security. The sensitive information should be protected throughout the entire application stack, and access to the information should be restricted to certain applications.
Which action should the solutions architect take?
- A. Configure a CloudFront signed URL.
- B. Configure a CloudFront signed cookie.
- C. Configure a CloudFront field-level encryption profile.
- D. Configure CloudFront and set the Origin Protocol Policy setting to HTTPS Only for the Viewer Protocol Policy.
Answer: C
Explanation:
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/field-level-encryption.html
"With Amazon CloudFront, you can enforce secure end-to-end connections to origin servers by using HTTPS. Field-level encryption adds an additional layer of security that lets you protect specific data throughout system processing so that only certain applications can see it."
NEW QUESTION 3
A company needs to review its AWS Cloud deployment to ensure that its Amazon S3 buckets do not have unauthorized configuration changes.
What should a solutions architect do to accomplish this goal?
- A. Turn on AWS Config with the appropriate rules.
- B. Turn on AWS Trusted Advisor with the appropriate checks.
- C. Turn on Amazon Inspector with the appropriate assessment template.
- D. Turn on Amazon S3 server access logging. Configure Amazon EventBridge (Amazon CloudWatch Events).
Answer: A
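As a sketch of answer A, the dict below has the shape that the AWS Config PutConfigRule API expects for an AWS managed rule that watches S3 bucket configuration. The specific rule chosen here is one plausible example, not the only rule the company would enable; deploying it would use boto3's `config.put_config_rule(**rule)`.

```python
# Illustrative AWS Config managed-rule definition (answer A). Any
# unauthorized change that makes a bucket publicly readable would be
# flagged as NON_COMPLIANT by this rule.
rule = {
    "ConfigRule": {
        "ConfigRuleName": "s3-bucket-public-read-prohibited",
        "Scope": {"ComplianceResourceTypes": ["AWS::S3::Bucket"]},
        "Source": {
            "Owner": "AWS",  # AWS managed rule, evaluated by AWS Config itself
            "SourceIdentifier": "S3_BUCKET_PUBLIC_READ_PROHIBITED",
        },
    }
}

print(rule["ConfigRule"]["ConfigRuleName"])
```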
NEW QUESTION 4
A company maintains a searchable repository of items on its website. The data is stored in an Amazon RDS for MySQL database table that contains more than 10 million rows. The database has 2 TB of General Purpose SSD storage. There are millions of updates against this data every day through the company's website.
The company has noticed that some insert operations are taking 10 seconds or longer. The company has determined that the database storage performance is the problem.
Which solution addresses this performance issue?
- A. Change the storage type to Provisioned IOPS SSD
- B. Change the DB instance to a memory optimized instance class
- C. Change the DB instance to a burstable performance instance class
- D. Enable Multi-AZ RDS read replicas with MySQL native asynchronous replication.
Answer: A
Explanation:
https://aws.amazon.com/ebs/features/
"Provisioned IOPS volumes are backed by solid-state drives (SSDs) and are the highest performance EBS volumes designed for your critical, I/O intensive database applications. These volumes are ideal for both IOPS-intensive and throughput-intensive workloads that require extremely low latency."
NEW QUESTION 5
A company wants to create a mobile app that allows users to stream slow-motion video clips on their mobile devices. Currently, the app captures video clips and uploads the video clips in raw format into an Amazon S3 bucket. The app retrieves these video clips directly from the S3 bucket. However, the videos are large in their raw format.
Users are experiencing issues with buffering and playback on mobile devices. The company wants to implement solutions to maximize the performance and scalability of the app while minimizing operational overhead.
Which combination of solutions will meet these requirements? (Select TWO.)
- A. Deploy Amazon CloudFront for content delivery and caching
- B. Use AWS DataSync to replicate the video files across AWS Regions in other S3 buckets
- C. Use Amazon Elastic Transcoder to convert the video files to more appropriate formats
- D. Deploy an Auto Scaling group of Amazon EC2 instances in Local Zones for content delivery and caching
- E. Deploy an Auto Scaling group of Amazon EC2 instances to convert the video files to more appropriate formats
Answer: AC
NEW QUESTION 6
A company stores call transcript files on a monthly basis. Users access the files randomly within 1 year of the call, but users access the files infrequently after 1 year. The company wants to optimize its solution by giving users the ability to query and retrieve files that are less than 1-year-old as quickly as possible. A delay in retrieving older files is acceptable.
Which solution will meet these requirements MOST cost-effectively?
- A. Store individual files with tags in Amazon S3 Glacier Instant Retrieval. Query the tags to retrieve the files from S3 Glacier Instant Retrieval.
- B. Store individual files in Amazon S3 Intelligent-Tiering. Use S3 Lifecycle policies to move the files to S3 Glacier Flexible Retrieval after 1 year. Query and retrieve the files that are in Amazon S3 by using Amazon Athena. Query and retrieve the files that are in S3 Glacier by using S3 Glacier Select.
- C. Store individual files with tags in Amazon S3 Standard storage. Store search metadata for each archive in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Instant Retrieval after 1 year. Query and retrieve the files by searching for metadata from Amazon S3.
- D. Store individual files in Amazon S3 Standard storage. Use S3 Lifecycle policies to move the files to S3 Glacier Deep Archive after 1 year. Store search metadata in Amazon RDS. Query the files from Amazon RDS. Retrieve the files from S3 Glacier Deep Archive.
Answer: B
NEW QUESTION 7
A company is launching a new application and will display application metrics on an Amazon CloudWatch dashboard. The company’s product manager needs to access this dashboard periodically. The product manager does not have an AWS account. A solution architect must provide access to the product manager by following the principle of least privilege.
Which solution will meet these requirements?
- A. Share the dashboard from the CloudWatch console. Enter the product manager's email address, and complete the sharing steps. Provide a shareable link for the dashboard to the product manager.
- B. Create an IAM user specifically for the product manager. Attach the CloudWatch Read Only Access managed policy to the user. Share the new login credentials with the product manager. Share the browser URL of the correct dashboard with the product manager.
- C. Create an IAM user for the company's employees. Attach the View Only Access AWS managed policy to the IAM user. Share the new login credentials with the product manager. Ask the product manager to navigate to the CloudWatch console and locate the dashboard by name in the Dashboards section.
- D. Deploy a bastion server in a public subnet. When the product manager requires access to the dashboard, start the server and share the RDP credentials. On the bastion server, ensure that the browser is configured to open the dashboard URL with cached AWS credentials that have appropriate permissions to view the dashboard.
Answer: A
NEW QUESTION 8
A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of terabytes. The application data must be stored in a standard file system structure. The company wants a solution that scales automatically, is highly available, and requires minimum operational overhead.
Which solution will meet these requirements?
- A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.
- B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.
- C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.
- D. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.
Answer: C
NEW QUESTION 9
An online photo application lets users upload photos and perform image editing operations. The application offers two classes of service: free and paid. Photos submitted by paid users are processed before those submitted by free users. Photos are uploaded to Amazon S3, and the job information is sent to Amazon SQS.
Which configuration should a solutions architect recommend?
- A. Use one SQS FIFO queue. Assign a higher priority to the paid photos so they are processed first.
- B. Use two SQS FIFO queues: one for paid and one for free. Set the free queue to use short polling and the paid queue to use long polling.
- C. Use two SQS standard queues: one for paid and one for free. Configure Amazon EC2 instances to prioritize polling for the paid queue over the free queue.
- D. Use one SQS standard queue. Set the visibility timeout of the paid photos to zero. Configure Amazon EC2 instances to prioritize visibility settings so paid photos are processed first.
Answer: C
Explanation:
https://aws.amazon.com/sqs/features/
"Priority: Use separate queues to provide prioritization of work."
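A toy model of answer C: workers always drain the paid queue before taking free work. Real consumers would long-poll two separate SQS queue URLs with boto3; the in-memory deques below are stand-ins so the prioritization logic is visible.

```python
from collections import deque

# Stand-ins for the two SQS standard queues in answer C.
paid_queue = deque(["paid-1", "paid-2"])
free_queue = deque(["free-1", "free-2", "free-3"])

def next_job():
    """Return the next job to process, always preferring the paid queue."""
    if paid_queue:
        return paid_queue.popleft()
    if free_queue:
        return free_queue.popleft()
    return None

order = [next_job() for _ in range(5)]
print(order)  # ['paid-1', 'paid-2', 'free-1', 'free-2', 'free-3']
```

FIFO queues (options A and B) order messages within a queue but offer no cross-message priority, which is why two standard queues with biased polling is the recommended pattern.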
NEW QUESTION 10
A company is running an ASP.NET MVC application on a single Amazon EC2 instance. A recent increase in application traffic is causing slow response times for users during lunch hours. The company needs to resolve this concern with the least amount of configuration.
What should a solutions architect recommend to meet these requirements?
- A. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling and time-based scaling to handle scaling during lunch hours.
- B. Move the application to Amazon Elastic Container Service (Amazon ECS). Create an AWS Lambda function to handle scaling during lunch hours.
- C. Move the application to Amazon Elastic Container Service (Amazon ECS). Configure scheduled scaling for AWS Application Auto Scaling during lunch hours.
- D. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling, and create an AWS Lambda function to handle scaling during lunch hours.
Answer: A
Explanation:
Scheduled scaling is the solution here while "using the least amount of configuration": moving to ECS requires more configuration than Elastic Beanstalk (task and service definitions, configuring the ECS container agent), whereas Elastic Beanstalk only needs the application code to be uploaded.
https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/environments-cfg-autoscaling-scheduledactions.html Elastic Beanstalk supports time-based scaling, which fits because the application slows down during lunch hours.
https://aws.amazon.com/about-aws/whats-new/2015/05/aws-elastic-beanstalk-supports-time-based-scaling/
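One way the scheduled action in answer A might look, expressed as Elastic Beanstalk option settings in the `aws:autoscaling:scheduledaction` namespace described in the docs above. The action name, sizes, and cron time are illustrative assumptions, not values from the question.

```python
# Illustrative Elastic Beanstalk option settings: scale out shortly before
# the lunch rush. A matching "scale in" action would restore the old sizes.
lunch_scale_out = [
    {
        "Namespace": "aws:autoscaling:scheduledaction",
        "ResourceName": "LunchRushScaleOut",   # hypothetical action name
        "OptionName": "MinSize",
        "Value": "4",
    },
    {
        "Namespace": "aws:autoscaling:scheduledaction",
        "ResourceName": "LunchRushScaleOut",
        "OptionName": "Recurrence",
        "Value": "0 11 * * *",  # cron expression (UTC): daily at 11:00
    },
]

print(len(lunch_scale_out))
```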
NEW QUESTION 11
A company is implementing a new business application. The application runs on two Amazon EC2 instances and uses an Amazon S3 bucket for document storage. A solutions architect needs to ensure that the EC2 instances can access the S3 bucket.
What should the solutions architect do to meet this requirement?
- A. Create an IAM role that grants access to the S3 bucket. Attach the role to the EC2 instances.
- B. Create an IAM policy that grants access to the S3 bucket. Attach the policy to the EC2 instances.
- C. Create an IAM group that grants access to the S3 bucket. Attach the group to the EC2 instances.
- D. Create an IAM user that grants access to the S3 bucket. Attach the user account to the EC2 instances.
Answer: A
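The IAM role approach needs two documents: a trust policy that lets the EC2 service assume the role, and a permissions policy for the bucket. A minimal sketch, with the bucket name as a placeholder:

```python
import json

# Trust policy: who may assume the role (the EC2 service).
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

# Permissions policy: what the role may do. "example-document-bucket"
# is a hypothetical bucket name.
s3_access_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-document-bucket",
                "arn:aws:s3:::example-document-bucket/*",
            ],
        }
    ],
}

print(json.dumps(trust_policy))
```

The role is then attached to the instances through an instance profile, so the application picks up temporary credentials automatically with no keys stored on disk.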
NEW QUESTION 12
A company wants to manage Amazon Machine Images (AMIs). The company currently copies AMIs to the same AWS Region where the AMIs were created. The company needs to design an application that captures AWS API calls and sends alerts whenever the Amazon EC2 CreateImage API operation is called within the company's account.
Which solution will meet these requirements with the LEAST operational overhead?
- A. Create an AWS Lambda function to query AWS CloudTrail logs and to send an alert when a CreateImage API call is detected.
- B. Configure AWS CloudTrail with an Amazon Simple Notification Service (Amazon SNS) notification that occurs when updated logs are sent to Amazon S3. Use Amazon Athena to create a new table and to query on CreateImage when an API call is detected.
- C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for the CreateImage API call. Configure the target as an Amazon Simple Notification Service (Amazon SNS) topic to send an alert when a CreateImage API call is detected.
- D. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue as a target for AWS CloudTrail logs. Create an AWS Lambda function to send an alert to an Amazon Simple Notification Service (Amazon SNS) topic when a CreateImage API call is detected.
Answer: C
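The EventBridge rule in option C hinges on an event pattern that matches CloudTrail's record of the EC2 CreateImage call. A sketch of that pattern (the rule itself and its SNS target would be created separately):

```python
import json

# Event pattern matching the CloudTrail record of an EC2 CreateImage call.
# Attached to an EventBridge rule with an SNS topic as the target, every
# matching call produces an alert with no servers or polling to manage.
event_pattern = {
    "source": ["aws.ec2"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["ec2.amazonaws.com"],
        "eventName": ["CreateImage"],
    },
}

print(json.dumps(event_pattern))
```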
NEW QUESTION 13
A business's backup data totals 700 terabytes (TB) and is kept in network attached storage (NAS) at its data center. This backup data must be available in the event of occasional regulatory inquiries and preserved for a period of seven years. The organization has chosen to relocate its backup data from its on-premises data center to Amazon Web Services (AWS). Within one month, the migration must be completed. The company's public internet connection provides 500 Mbps of dedicated capacity for data transport.
What should a solutions architect do to ensure that data is migrated and stored at the LOWEST possible cost?
- A. Order AWS Snowball devices to transfer the data. Use a lifecycle policy to transition the files to Amazon S3 Glacier Deep Archive.
- B. Deploy a VPN connection between the data center and Amazon VPC. Use the AWS CLI to copy the data from on premises to Amazon S3 Glacier.
- C. Provision a 500 Mbps AWS Direct Connect connection and transfer the data to Amazon S3. Use a lifecycle policy to transition the files to Amazon S3 Glacier Deep Archive.
- D. Use AWS DataSync to transfer the data and deploy a DataSync agent on premises. Use the DataSync task to copy files from the on-premises NAS storage to Amazon S3 Glacier.
Answer: A
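The reason every online option fails the one-month deadline is simple arithmetic, assuming the 500 Mbps link could be fully utilized (real throughput would be lower):

```python
# Time to push 700 TB over a 500 Mbps link at full utilization.
data_bits = 700 * 10**12 * 8   # 700 TB (decimal) in bits
link_bps = 500 * 10**6         # 500 Mbps
days = data_bits / link_bps / 86400
print(round(days, 1))  # ~129.6 days, over four months
```

Since even a perfect transfer takes roughly four months against a one-month window, Snowball devices plus a lifecycle transition to S3 Glacier Deep Archive (the cheapest class for 7-year retention with rare access) is the lowest-cost workable option.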
NEW QUESTION 14
A company is migrating its on-premises PostgreSQL database to Amazon Aurora PostgreSQL. The
on-premises database must remain online and accessible during the migration. The Aurora database must remain synchronized with the on-premises database.
Which combination of actions must a solutions architect take to meet these requirements? (Select TWO.)
- A. Create an ongoing replication task.
- B. Create a database backup of the on-premises database
- C. Create an AWS Database Migration Service (AWS DMS) replication server
- D. Convert the database schema by using the AWS Schema Conversion Tool (AWS SCT).
- E. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to monitor the database synchronization
Answer: AC
NEW QUESTION 15
A company needs to ingest and handle large amounts of streaming data that its application generates. The application runs on Amazon EC2 instances and sends data to Amazon Kinesis Data Streams, which is configured with default settings. Every other day, the application consumes the data and writes the data to an Amazon S3 bucket for business intelligence (BI) processing. The company observes that Amazon S3 is not receiving all the data that the application sends to Kinesis Data Streams.
What should a solutions architect do to resolve this issue?
- A. Update the Kinesis Data Streams default settings by modifying the data retention period.
- B. Update the application to use the Kinesis Producer Library (KPL) to send the data to Kinesis Data Streams.
- C. Update the number of Kinesis shards to handle the throughput of the data that is sent to Kinesis Data Streams.
- D. Turn on S3 Versioning within the S3 bucket to preserve every version of every object that is ingested in the S3 bucket.
Answer: A
NEW QUESTION 17
A company hosts an application on AWS. The application uses AWS Lambda functions and stores data in Amazon DynamoDB tables. The Lambda functions are connected to a VPC that does not have internet access.
The traffic to access DynamoDB must not travel across the internet. The application must have write access to only specific DynamoDB tables.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)
- A. Attach a VPC endpoint policy for DynamoDB to allow write access to only the specific DynamoDB tables.
- B. Attach a security group to the interface VPC endpoint to allow write access to only the specific DynamoDB tables.
- C. Create a resource-based IAM policy to grant write access to only the specific DynamoDB tables. Attach the policy to the DynamoDB tables.
- D. Create a gateway VPC endpoint for DynamoDB that is associated with the Lambda VPC. Ensure that the Lambda execution role can access the gateway VPC endpoint.
- E. Create an interface VPC endpoint for DynamoDB that is associated with the Lambda VPC. Ensure that the Lambda execution role can access the interface VPC endpoint.
Answer: AD
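The endpoint policy from option A might look like the sketch below: write actions only, scoped to a specific table. The Region, account ID, and table name are placeholders, not values from the question.

```python
import json

# Illustrative policy for the DynamoDB gateway endpoint: permits writes
# to one named table and nothing else through the endpoint.
dynamodb_endpoint_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "dynamodb:PutItem",
                "dynamodb:UpdateItem",
                "dynamodb:BatchWriteItem",
            ],
            "Resource": [
                "arn:aws:dynamodb:us-east-1:111122223333:table/ExampleTable",
            ],
        }
    ],
}

print(json.dumps(dynamodb_endpoint_policy, indent=2))
```

Combined with the gateway endpoint itself, DynamoDB traffic stays on the AWS network and the Lambda functions can write only to the allowed tables.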
NEW QUESTION 18
......
Recommend!! Get the Full SAA-C03 dumps in VCE and PDF From Thedumpscentre.com, Welcome to Download: https://www.thedumpscentre.com/SAA-C03-dumps/ (New 0 Q&As Version)