Week 1: Introduction to Amazon Web Services

Some notes on AWS. It's labeled week 1, but really it's just day 1.

First of all, this is the AWS Guide 😁

AWS Cloud Practitioner Essentials (AWS Training and Certification)

Testing the way to insert images in Markdown!


Amazon Elastic Compute Cloud

Amazon EC2 (the "2" is there because the letter C appears twice)

Cloud computing is the on-demand delivery of IT resources over the internet with pay-as-you-go pricing.

The three cloud computing deployment models are cloud-based, on-premises (also known as private cloud), and hybrid.

A private cloud deployment aims to provide flexibility and resource sharing similar to a public cloud, but on private infrastructure: the cloud computing environment is created and managed in the organization's own data center or on its own infrastructure.

  1. Cloud-based: For example, a company might create an application consisting of virtual servers, databases, and networking components that are fully based in the cloud.
  2. On-premises: For example, you might have applications that run on technology that is fully kept in your on-premises data center. Though this model is much like legacy IT infrastructure, its incorporation of application management and virtualization technologies helps to increase resource utilization.
  3. Hybrid: For example, suppose that a company wants to use cloud services that can automate batch data processing and analytics. However, the company has several legacy applications that are more suitable on premises and will not be migrated to the cloud.

Benefits of Cloud Computing:

Trade upfront expense for variable expense
Stop spending money to run and maintain data centers
Stop guessing capacity
Benefit from massive economies of scale
Increase speed and agility
Go global in minutes

"The aggregated cloud usage from a large number of customers results in lower pay-as-you-go prices." This answer describes how customers benefit from massive economies of scale in cloud computing. The other response options are incorrect because:
  • Not having to invest in technology resources before using them relates to "Trade upfront expense for variable expense."
  • Accessing services on demand to prevent excess or limited capacity relates to "Stop guessing capacity."
  • Quickly deploying applications to customers and providing them with low latency relates to "Go global in minutes."

You can give that instance more memory and more CPU, which is what we call vertically scaling an instance.
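As a rough sketch (not from the course), this is what vertical scaling could look like with boto3: stop the instance, change its instance type, and start it again. The instance ID, region, and target type below are placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

instance_id = "i-0123456789abcdef0"  # placeholder instance ID

# An instance must be stopped before its type can be changed.
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# Vertical scaling: move to a larger instance type (more vCPU and memory).
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "m5.xlarge"},  # placeholder target size
)

ec2.start_instances(InstanceIds=[instance_id])
```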

Scalability:

If you wanted the scaling process to happen automatically, which AWS service would you use? The AWS service that provides this functionality for Amazon EC2 instances is Amazon EC2 Auto Scaling.
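For illustration only, here is a minimal boto3 sketch of creating an Auto Scaling group from an existing launch template and attaching a target tracking policy so scaling happens automatically; the group name, template name, subnet IDs, sizes, and CPU target are all assumptions.

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")  # assumed region

# Keep between 2 and 6 instances running, launched from an existing
# launch template (placeholder name).
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",  # hypothetical group name
    LaunchTemplate={
        "LaunchTemplateName": "web-template",  # assumed to already exist
        "Version": "$Latest",
    },
    MinSize=2,
    MaxSize=6,
    DesiredCapacity=2,
    VPCZoneIdentifier="subnet-0123abcd,subnet-4567efgh",  # placeholder subnets
)

# Target tracking policy: add or remove instances automatically to keep
# average CPU utilization around 50%.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="cpu-target-50",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)
```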

Directing Traffic with Elastic Load Balancing:

Elastic Load Balancing is a regional construct
Now, if the back end scales, once the new instance is ready, it just tells the ELB that it can take traffic, and it gets to work. The front end doesn’t know and doesn’t care how many back end instances are running. This is true decoupled architecture.
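To make the decoupling idea concrete, here is a hedged boto3 sketch that wires an Application Load Balancer to a target group; back-end instances register with the target group, so the front end only ever talks to the load balancer. The names, subnet IDs, VPC ID, and instance ID are placeholders.

```python
import boto3

elbv2 = boto3.client("elbv2", region_name="us-east-1")  # assumed region

# An Application Load Balancer spanning two subnets (placeholder IDs).
lb = elbv2.create_load_balancer(
    Name="web-alb",
    Subnets=["subnet-0123abcd", "subnet-4567efgh"],
    Type="application",
    Scheme="internet-facing",
)
lb_arn = lb["LoadBalancers"][0]["LoadBalancerArn"]

# Target group that back-end instances join and leave as they scale.
tg = elbv2.create_target_group(
    Name="web-targets",
    Protocol="HTTP",
    Port=80,
    VpcId="vpc-0123456789abcdef0",  # placeholder VPC
    TargetType="instance",
)
tg_arn = tg["TargetGroups"][0]["TargetGroupArn"]

# Register one back-end instance (placeholder ID).
elbv2.register_targets(TargetGroupArn=tg_arn, Targets=[{"Id": "i-0123456789abcdef0"}])

# Listener: the single endpoint the front end sees, regardless of how many
# back-end instances are currently serving traffic.
elbv2.create_listener(
    LoadBalancerArn=lb_arn,
    Protocol="HTTP",
    Port=80,
    DefaultActions=[{"Type": "forward", "TargetGroupArn": tg_arn}],
)
```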

Messaging and Queueing:

Amazon Simple Queue Service (Amazon SQS) and Amazon Simple Notification Service (Amazon SNS) are the two messaging services covered here.
The data contained within a message is called a payload, and it is protected until delivery. SQS queues are where messages are placed until they are processed.
With Amazon SNS, you publish updates from a topic to its subscribers; you can also publish updates from multiple topics so that different groups of subscribers receive only the kinds of messages they care about.
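A minimal boto3 sketch of both services, assuming default queue and topic settings; the queue name, topic name, email address, and region are placeholders.

```python
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")  # assumed region
sns = boto3.client("sns", region_name="us-east-1")

# SQS: place a message (payload) on a queue until a consumer processes it.
queue_url = sqs.create_queue(QueueName="orders-queue")["QueueUrl"]  # placeholder name
sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps({"order_id": 42}))

response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
for msg in response.get("Messages", []):
    print("payload:", msg["Body"])
    # Delete the message once it has been processed.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])

# SNS: publish an update from a topic to all of its subscribers.
topic_arn = sns.create_topic(Name="order-updates")["TopicArn"]  # placeholder name
sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="owner@example.com")
sns.publish(TopicArn=topic_arn, Message="Your order is ready.")
```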

Monolithic Applications and Microservices:

  • Monolithic Applications:
    Applications are made of multiple components. The components communicate with each other to transmit data, fulfill requests, and keep the application running.
    Suppose that you have an application with tightly coupled components. These components might include databases, servers, the user interface, business logic, and so on. This type of architecture can be considered a monolithic application.
    In this approach to application architecture, if a single component fails, other components fail, and possibly the entire application fails.
  • Microservices:
    To help maintain application availability when a single component fails, you can take a microservices approach, with services and components that fulfill different functions. Two AWS services facilitate application integration: Amazon SNS and Amazon SQS.

"AWS Lambda" and "Docker"

These are two different technologies and concepts in cloud computing and application deployment. Let's look at what each one is and how they differ:

AWS Lambda:
AWS Lambda is a serverless compute service provided by Amazon Web Services (AWS). It lets you run code in the cloud without managing the underlying servers or infrastructure. You upload your function code to Lambda and define when the function should be triggered (for example, by an API call or an event). The compute resources behind a Lambda function scale automatically with the actual requests, and you pay only for the time your code actually runs; there is no fixed instance or virtual machine to think about. Lambda integrates with other AWS services, so you can build responsive, highly scalable applications.

Docker:
Docker is a containerization technology for packaging, distributing, and running applications together with all of their dependencies across different environments. With Docker, you create lightweight, portable containers that bundle the application, its runtime, libraries, and other system tools so they can run on any system that supports Docker. Docker containers isolate an application's dependencies from other containers and from the host system, so the application behaves consistently across environments.

Differences:

  • Compute model: AWS Lambda is a serverless compute model that runs event-triggered functions. Docker is a containerization technology that lets you package an entire application and its dependencies into a container.

  • Deployment and management: With AWS Lambda, you only upload function code and configure triggers; you do not manage the underlying servers. With Docker, you are responsible for building, deploying, and running the containers.

  • Resource management: AWS Lambda scales compute resources automatically based on actual load. With Docker, you manage resource allocation and scheduling for your containers yourself.

  • Elasticity and cost: AWS Lambda charges for the actual execution time of your functions and suits short-lived, lightweight tasks. With Docker, you pay for the underlying virtual machines or hosts, which suits long-running or heavier applications.

  • Flexibility: Docker offers more flexibility, because you build, configure, and manage the entire container environment. AWS Lambda is aimed at specific kinds of event-triggered functions.

In short, AWS Lambda and Docker each have their own strengths and use cases. Which one to choose depends on your application's requirements, your team's skills, your budget, and so on. In some cases you can even combine the two, for example by packaging a Lambda function as a Docker container image.
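To make the Lambda side concrete, here is a minimal Python handler sketch; the event shape (a small JSON payload with a "name" field) is an assumption for illustration, and the trigger is whatever you configure for the function.

```python
import json


def lambda_handler(event, context):
    """Minimal Lambda handler: runs only when the configured trigger fires.

    `event` carries the trigger's payload (its shape depends on the trigger);
    `context` exposes runtime metadata such as the remaining execution time.
    """
    name = event.get("name", "world")  # assumed event field, for illustration
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```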

When you use Docker containers on AWS, you need processes to start, stop, restart, and monitor containers running across not just one EC2 instance, but a group of them together, which is called a cluster. The process of doing these tasks is called container orchestration, and it turns out to be really hard to do on your own. Orchestration tools were created to help you manage your containers.


The video provides an overview of different compute options available in Amazon Web Services (AWS) and their respective use cases and management approaches. Here’s a breakdown of the key points:

  • EC2 Instances: EC2 instances are virtual machines that can be quickly provisioned and managed on AWS. They are suitable for various use cases, from basic web servers to high-performance computing clusters. While EC2 is flexible, reliable, and scalable, depending on your needs, you might consider alternatives for your compute capacity. Using EC2 requires setting up and managing a group of instances, and responsibilities include patching, scaling, and ensuring high availability.

  • Serverless Computing: AWS offers several serverless compute options. “Serverless” means you don’t directly access the underlying infrastructure or instances; AWS handles provisioning, scaling, high availability, and maintenance. You can focus solely on your application.

  • AWS Lambda: AWS Lambda is one such serverless compute option. You upload your code to a Lambda function, configure a trigger, and when the trigger activates, the code runs automatically in a managed environment. Lambda environments scale automatically and are highly available, which suits short-running tasks such as web backends handling requests, where each invocation completes in under 15 minutes.

  • Containerized Services: For more control while maintaining efficiency and portability, consider containerized services. Amazon ECS and Amazon EKS are container orchestration tools, managing Docker containers running on EC2 instances. AWS Fargate is a serverless compute platform for containerized applications, eliminating the need to manage EC2 instances.

In conclusion, within AWS, you can choose between traditional EC2 instances, serverless computing (like AWS Lambda), or containerized services (such as Amazon ECS, Amazon EKS, and AWS Fargate) based on your requirements. Each option offers varying levels of management and abstraction to cater to different application and workload needs.
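As a sketch of the "serverless containers" option, here is what running a single task on AWS Fargate through Amazon ECS could look like with boto3; the cluster, task definition, subnet, and region are placeholders assumed to already exist.

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")  # assumed region

# Run one containerized task on Fargate: no EC2 instances to manage,
# ECS only needs a task definition and networking details.
ecs.run_task(
    cluster="demo-cluster",       # placeholder, assumed to exist
    launchType="FARGATE",
    taskDefinition="web-app:1",   # placeholder task definition revision
    count=1,
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123abcd"],  # placeholder subnet
            "assignPublicIp": "ENABLED",
        }
    },
)
```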

Serverless Computing:

The term “serverless” means that your code runs on servers, but you do not need to provision or manage these servers. With serverless computing, you can focus more on innovating new products and features instead of maintaining servers.

Summary:

In this check-in, the course reviews key concepts learned so far. Cloud computing involves on-demand IT resource delivery over the internet with pay-as-you-go pricing. AWS provides various categories of services that work together to build solutions.
Amazon EC2 was discussed, enabling dynamic provisioning and termination of virtual servers called instances. Instance families determine the underlying hardware, with options for general purpose, compute optimized, memory optimized, accelerated computing, and storage optimized. Scaling can be vertical or horizontal, using Amazon EC2 Auto Scaling for automated horizontal scaling. Elastic Load Balancing distributes traffic across instances.
EC2 instance pricing includes On-Demand, Spot Instances for discounted unused capacity, and Reserved Instances/Savings Plans for commitment-based discounts. Messaging services were covered, including Amazon SQS for decoupling components and Amazon SNS for sending messages to subscribers.
Beyond virtual servers, AWS offers diverse compute services. Container orchestration tools include Amazon ECS and Amazon EKS, which can be used with EC2 instances or AWS Fargate, a serverless compute platform. AWS Lambda allows uploading code and triggering its execution, charging only for runtime.