Edge Locations

Edge locations are globally distributed points of presence that provide an additional layer of latency reduction when delivering applications from AWS. Most of the capacity in an edge location is dedicated to CloudFront, a content delivery network that can deliver static content with fast response times, usually in the double-digit millisecond range. You can also terminate connections and even return dynamic responses to users by running Lambda@Edge functions at edge locations. These functions allow you to implement authentication, verification, and other features such as detection of user agents and browser types.
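For example, the following is a minimal sketch of a Lambda@Edge viewer-request function written in Python; the handler name follows the standard Lambda convention, and the blocked user-agent string is purely an illustrative assumption.

def lambda_handler(event, context):
    # CloudFront passes the incoming request in the event record.
    request = event["Records"][0]["cf"]["request"]
    headers = request["headers"]

    # CloudFront lowercases header names; each value is a list of key/value pairs.
    user_agent = headers.get("user-agent", [{"value": ""}])[0]["value"]

    # Illustrative rule: turn away a legacy browser before the request
    # ever reaches the origin.
    if "MSIE 6.0" in user_agent:
        return {
            "status": "403",
            "statusDescription": "Forbidden",
            "body": "This browser is not supported.",
        }

    # Returning the request object lets CloudFront continue processing as usual.
    return request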

To provide the lowest possible latency for the DNS service, Route 53 is served from edge locations across the globe. This vast distribution makes Route 53 highly resilient and allows AWS to offer a 100 percent availability SLA for the service. API Gateway can also be integrated with edge locations through edge-optimized endpoints, providing lower latency for API calls, which can have a tremendous impact on the performance of a global application. Security services such as AWS Shield and AWS WAF are likewise deployed at edge locations and can greatly increase the resilience of applications.
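As an illustration, the following boto3 sketch creates an edge-optimized API Gateway REST API; it assumes credentials and a default region are already configured, and the API name is a placeholder.

import boto3

apigateway = boto3.client("apigateway")

# An edge-optimized endpoint fronts the API with CloudFront's edge locations.
response = apigateway.create_rest_api(
    name="example-edge-api",                    # placeholder name
    endpointConfiguration={"types": ["EDGE"]},  # edge-optimized endpoint type
)
print(response["id"])  # identifier of the newly created API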

Accessing AWS

All management operations in AWS are API calls, which allows both humans and machines to access AWS services seamlessly. AWS also provides access tools that simplify how you interact with the environment. The simplest way to access AWS is via the AWS Management Console, and much of the exam focuses on practical examples in the console. However, if you want to perform custom calls or automate your interaction with AWS, you can use the AWS Command Line Interface (CLI). The CLI enables you to run command-line calls and scripts (including Bash, PowerShell, and batch) from Windows, Linux, and macOS. The CLI is built on the AWS SDK for Python (boto3); AWS software development kits (SDKs) are also available for many other programming languages. The SDKs enable developers to interact with AWS services directly from source code.
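To illustrate the relationship between the CLI and the SDKs, the following boto3 sketch lists your S3 buckets; it assumes credentials have already been configured (for example with aws configure), and the same call can be made from the CLI with aws s3api list-buckets.

import boto3

# Create a client for the S3 service; boto3 picks up credentials and region
# from the standard configuration files or environment variables.
s3 = boto3.client("s3")

# The list_buckets call is the same management API the CLI invokes.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])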

Cram Quiz

Answer these questions. The answers follow the last question. If you cannot answer these questions correctly, consider reading this section again until you can.

1. A Linux administrator well versed in Bash scripting asks you to help select the right tool for automating AWS deployments. Which of the following tools would you recommend to automate an AWS infrastructure deployment from within a Linux operating system? Select the simplest solution.

A. AWS CLI

B. AWS SDK

C. CloudFormation

D. API calls to the management console

2. Which steps would you need to take to make an application deployed on EC2 highly available within a region? (Choose all that apply.)

A. Deploy two instances in one availability zone.

B. Deploy two instances in two different availability zones.

C. Ensure the application data and state are synchronized between the instances.

D. None of these answers are correct. EC2 is inherently highly available in a region.

Cram Quiz Answers

1. Answer: A. The simplest solution for automating deployments from an existing Linux environment is the AWS CLI. Anyone familiar with Bash scripting would be quick to pick up the syntax and would be able to automate an AWS deployment with ease.

2. Answer: B and C. To run a highly available application on a service like EC2, you need to deploy two instances in two different availability zones. Additionally, you need to set up the application layer so that the application data and state are synchronized between the two instances. Deploying two instances in a single availability zone provides high availability at the instance level but cannot guarantee resilience if that availability zone fails.