
Comparative Analysis and Best Practices of AWS and GCP

Hi All,
As you all know, comparing capital expenses (CapEx) to operational expenses (OpEx) shows that the cloud is a great way to switch IT spending to a pay-as-you-go model and avoid huge upfront CapEx costs. Still, it is always advisable to have a good understanding of the pricing mechanisms that cloud service providers follow. So in this article, I would like to share not only the pricing models of AWS and GCP but also their free tier services and limits in a tabular format.

I tried my best to keep the points simple, understandable, and in bullet-point form wherever possible instead of longggg… ;-) paragraphs.

Cloud service providers like AWS, GCP, and Azure help you move faster, reduce IT costs, and attain global scale through a broad set of global compute, storage, database, analytics, application, and deployment services. They offer on-demand, pay-as-you-go, and reservation-based payment models, enabling you to obtain the best return on your investment for each specific use case.

Let's proceed with Amazon Web Services first. This section covers its pricing models and free tier limits.

Understanding the fundamentals of pricing
The fundamental drivers of cost are compute, storage, and data transfer.
1. For data transfer, in most cases there is no charge for inbound data transfer or for data transfer between AWS services within the same region.
2. For storage, pricing is typically per GB.
3. For compute resources, pricing is typically per unit of duration, such as hours or seconds.
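
To make these three drivers concrete, here is a minimal back-of-the-envelope sketch in Python. All rates are illustrative placeholders, not real AWS prices; always check the official pricing pages for current figures.

```python
# Back-of-the-envelope monthly estimate from the three cost drivers.
# All rates below are illustrative placeholders, NOT real AWS prices.

COMPUTE_RATE_PER_HOUR = 0.10   # $/instance-hour (placeholder)
STORAGE_RATE_PER_GB = 0.023    # $/GB-month (placeholder)
EGRESS_RATE_PER_GB = 0.09      # $/GB transferred out (placeholder)

def monthly_estimate(instance_hours, storage_gb, egress_gb):
    compute = instance_hours * COMPUTE_RATE_PER_HOUR
    storage = storage_gb * STORAGE_RATE_PER_GB
    transfer = egress_gb * EGRESS_RATE_PER_GB  # inbound transfer is free
    return compute + storage + transfer

# Example: one instance running all month, 500 GB stored, 100 GB served out.
print(f"${monthly_estimate(730, 500, 100):.2f}")
```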

Recommendations to start early with cost optimization
1. Right-size your services.
2. Save when you reserve.
3. Use the Spot market.
4. Monitor, track, and analyze your service usage.
5. Analyze your costs and usage with Cost Explorer (see the sketch after this list).
6. Maximize the power of flexibility.
7. Select the right pricing model for the job: On-Demand, Dedicated, Spot, or Reserved Instances.
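
For item 5, cost and usage data can also be pulled programmatically through the Cost Explorer API. A minimal sketch using boto3, assuming credentials are already configured and Cost Explorer is enabled on the account:

```python
import boto3

# Cost Explorer is a global API served from us-east-1.
ce = boto3.client("ce", region_name="us-east-1")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-01-01", "End": "2023-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Print the cost per service for the period.
for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = group["Metrics"]["UnblendedCost"]["Amount"]
    print(f"{service}: ${float(amount):.2f}")
```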

Free Tier Limitations

Pricing Model of Amazon EC2 Instances
1. On-Demand
Pay per hour or per second, depending on the instance.
No upfront commitment.
Best for short-term, spiky, and unpredictable workloads.
2. Spot
Provides steep discounts compared to On-Demand.
Best for applications with flexible start and end times.
Best for users with urgent computing needs for large amounts of additional capacity.
3. Reserved
Provides a significant discount compared to On-Demand instances.
Best for applications with steady-state usage.
Best for applications that require reserved capacity.
Best for customers who can commit to using EC2 for a specific period of time.
4. Dedicated
Helps you use existing server-bound software licenses.
Used to meet compliance requirements.
The following factors need to be considered before estimating costs:
Instance type
Number of instances
Load balancing
Clock hours of server time
Detailed monitoring
Auto Scaling
Elastic IP addresses
Operating system and software packages
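
Once these factors are fixed, on-demand rates can be looked up programmatically through the AWS Price List API. A hedged sketch with boto3; the filter values shown (instance type, location, and so on) are example choices, and the response is a list of JSON strings that must be parsed:

```python
import json
import boto3

# The Price List API is only served from a few regions, e.g. us-east-1.
pricing = boto3.client("pricing", region_name="us-east-1")

response = pricing.get_products(
    ServiceCode="AmazonEC2",
    Filters=[
        {"Type": "TERM_MATCH", "Field": "instanceType", "Value": "t3.micro"},
        {"Type": "TERM_MATCH", "Field": "location", "Value": "US East (N. Virginia)"},
        {"Type": "TERM_MATCH", "Field": "operatingSystem", "Value": "Linux"},
        {"Type": "TERM_MATCH", "Field": "tenancy", "Value": "Shared"},
        {"Type": "TERM_MATCH", "Field": "preInstalledSw", "Value": "NA"},
        {"Type": "TERM_MATCH", "Field": "capacitystatus", "Value": "Used"},
    ],
    MaxResults=1,
)

# Each PriceList entry is a JSON document describing the product and its terms.
product = json.loads(response["PriceList"][0])
print(product["product"]["attributes"]["instanceType"])
```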

Pricing Model of Lambda
1. Pay only for the compute time you consume; there is no charge when your code is not running.
2. You are charged based on the number of requests for your functions and the duration they consume.
Request pricing
Free tier: 1 million requests per month.
$0.20 per 1 million requests thereafter, or $0.0000002 per request.
Duration pricing
Free tier: 400,000 GB-seconds of compute time per month, up to 3.2 million seconds.
$0.00001667 for every GB-second used thereafter.
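
A small worked example of how the request and duration dimensions combine; the traffic and function-size numbers are made up for illustration:

```python
# Worked Lambda cost example (illustrative traffic numbers).
requests = 5_000_000          # invocations per month
duration_s = 0.2              # average duration per invocation, seconds
memory_gb = 0.5               # 512 MB of configured memory

# Request charge: the first 1M requests/month are free.
billable_requests = max(0, requests - 1_000_000)
request_cost = billable_requests * 0.0000002

# Duration charge: the first 400,000 GB-seconds/month are free.
gb_seconds = requests * duration_s * memory_gb
billable_gb_seconds = max(0, gb_seconds - 400_000)
duration_cost = billable_gb_seconds * 0.00001667

print(f"Requests: ${request_cost:.2f}, Duration: ${duration_cost:.2f}")
# 5M requests -> 4M billable -> $0.80
# 5M * 0.2 s * 0.5 GB = 500,000 GB-s -> 100,000 billable -> ~$1.67
```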

Pricing Model of Elastic Block Store
Volumes
The amount of GB you provision per month, until you release the storage.
Snapshots
Based on the amount of space your data consumes in Amazon S3.
Data transfer
The amount of data transferred out of your application. Inbound data transfer is free, and outbound data transfer charges are tiered.

Pricing Model of Simple Storage Service
Storage class
Depending on the availability, redundancy, and accessibility you need, the storage class selection and pricing may vary (a storage-class example follows this section).
Storage
Costs vary with the number and size of objects stored in your Amazon S3 buckets.
Requests
GET requests incur charges at different rates than other requests, such as PUT and COPY requests.
Data transfer
The amount of data transferred out of the Amazon S3 region.
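
The storage class is chosen per object at upload time, which is where class-based pricing takes effect. A minimal boto3 sketch; the bucket name and key are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Upload an object into a cheaper, infrequent-access storage class.
# "my-example-bucket" is a placeholder; use your own bucket name.
s3.put_object(
    Bucket="my-example-bucket",
    Key="reports/archive-2023.csv",
    Body=b"col1,col2\n1,2\n",
    StorageClass="STANDARD_IA",  # infrequent access: lower storage cost,
                                 # but retrieval and minimum-duration charges apply
)
```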

Pricing Model of Glacier
1. It serves long-term data archiving and backup requirements.
2. It provides query-in-place functionality, allowing you to run analytics directly on your archived data at rest.
Data retrieval tiers
Expedited: 1–5 minutes, $0.03 per GB
Standard: 3–5 hours, $0.01 per GB
Bulk: 5–12 hours, $0.0025 per GB
Analytics on Glacier data is priced on the total amount of data scanned, data returned, and the number of requests.
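
For objects archived to Glacier through S3, the retrieval tier is selected per restore request. A minimal boto3 sketch; the bucket name and key are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Restore a Glacier-archived object for 2 days using the Standard tier
# (3-5 hours). The bucket and key below are placeholders.
s3.restore_object(
    Bucket="my-example-bucket",
    Key="archives/backup-2020.tar",
    RestoreRequest={
        "Days": 2,
        "GlacierJobParameters": {"Tier": "Standard"},  # or "Expedited" / "Bulk"
    },
)
```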

Pricing Model of Snowball
You pay a service fee per data transfer job, plus the cost of shipping the appliance.
Service fee per job
Snowball 50 TB: $200
Snowball 80 TB: $250
Extra-day charge
The first 10 days of onsite usage are free. Each extra onsite day is $15.
Data transfer
Data transfer into Amazon S3 is free. Data transfer out of Amazon S3 is priced by region.

Pricing Model of Relational Database Service
Pricing factors:
1. Database characteristics, such as physical capacity.
2. Database purchase type, such as on-demand or reserved.
3. Number of database instances.
4. Provisioned storage (there is no charge for backup storage until the DB instance is terminated).
5. Number of input and output requests.
6. Deployment type, such as standalone or multi-Availability Zone.
7. Inbound data transfer is free, and outbound data transfer costs are tiered.

Pricing Model of DynamoDB
Pricing for on-demand capacity mode
DynamoDB charges you for the data reads and writes your application performs on your tables.
On-demand capacity mode might be best if you:
Create new tables with unknown workloads.
Have unpredictable application traffic.
Prefer the ease of paying for only what you use.
Pricing for provisioned capacity mode
You specify the number of reads and writes per second that you expect your application to require. You can use auto scaling to automatically adjust your table's capacity based on a specified utilization rate, maintaining application performance while reducing costs.
Provisioned capacity mode might be best if you:
Have predictable application traffic.
Run applications whose traffic is consistent or ramps gradually.
Can forecast capacity requirements to control costs.
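
The capacity mode is selected when a table is created (and can be switched later). A minimal boto3 sketch of both options; table and attribute names are placeholders:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# On-demand capacity mode: pay per request, no capacity planning.
dynamodb.create_table(
    TableName="events-on-demand",  # placeholder name
    AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)

# Provisioned capacity mode: specify the expected reads/writes per second.
dynamodb.create_table(
    TableName="events-provisioned",  # placeholder name
    AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
    BillingMode="PROVISIONED",
    ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 5},
)
```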

Pricing Model of CloudFront
Amazon CloudFront charges are based on the data transfers and requests used to deliver content to your customers.
There are no upfront payments or fixed platform fees, no long-term commitments, no premiums for dynamic content, and no requirements for professional services to get started.
There is no charge for data transferred from AWS services such as Amazon S3 or Elastic Load Balancing.
Factors that need to be considered:
Traffic distribution: pricing depends on the edge location from which the request is served.
Requests: the number and type of requests.
Data transfer out: the amount of data transferred.

Optimizing costs with reservations
Organizations can achieve significant cost savings by using Reserved Instances (RIs) and other reservation models for compute and data services.
Payment terms:
No Upfront
Partial Upfront
All Upfront
1. The larger the upfront payment, the greater the discount (see the sketch after this list).
2. The Reserved Instance Marketplace allows other AWS customers to list their Reserved Instances for sale.
3. If you can predict your need for Amazon DynamoDB read and write throughput, Reserved Capacity offers significant savings over the normal price of DynamoDB provisioned throughput capacity.
4. Amazon ElastiCache Reserved Nodes give you the option to make a low, one-time payment for each cache node you want to reserve and, in turn, receive a significant discount on the hourly charge for that node.
5. All Reserved Instance types are available for the Aurora, MySQL, MariaDB, PostgreSQL, Oracle, and SQL Server database engines.
6. If you intend to keep an Amazon Redshift cluster running continuously for a prolonged period, consider purchasing reserved-node offerings.
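
To see how the payment terms change the effective rate, here is a small illustrative comparison. All dollar figures are placeholders, not real AWS prices:

```python
# Illustrative comparison of payment terms over a 1-year term.
# All dollar figures are placeholders, NOT real AWS prices.
HOURS_PER_YEAR = 8760

options = {
    "On-Demand":          {"upfront": 0,   "hourly": 0.100},
    "No Upfront RI":      {"upfront": 0,   "hourly": 0.065},
    "Partial Upfront RI": {"upfront": 280, "hourly": 0.030},
    "All Upfront RI":     {"upfront": 530, "hourly": 0.000},
}

for name, o in options.items():
    total = o["upfront"] + o["hourly"] * HOURS_PER_YEAR
    print(f"{name}: ${total:.0f}/year")
# The larger the upfront payment, the lower the effective yearly total.
```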

Resources
AWS Simple Monthly Calculator
The AWS Simple Monthly Calculator helps you estimate your monthly bill.
AWS Architecture Center
The AWS Architecture Center provides the guidance and best practices you need to build highly scalable and reliable applications in the AWS Cloud.
Cloud Economics Center
The Cloud Economics Center provides access to information, tools, and resources to compare the costs of Amazon Web Services with IT infrastructure alternatives.
Account
View your current charges and account activity, itemized by service and by usage type. Previous months' billing statements are also available.
AWS Cost and Usage Reports
Reports are available to download for each service. Reports can be customized by usage type, timeframe, service operation, and more.

Let's proceed with Google Cloud Platform now.

Google Cloud Pricing Features
1. Sustained use discounts: up to 30% discount for workloads running for most of the billing month.
2. Committed use discounts: up to 57% discount for committing to use a service for a certain period of time.
3. Preemptible VMs: up to 78% discount; similar to AWS Spot Instances.
4. Per-second billing: services are billed per second of usage.
5. Coldline storage: very cheap storage for archival data.
6. Custom machine types: users can customize CPU/RAM to match application requirements (see the sketch after this list).
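
As a rough illustration of items 3 and 6, here is a hedged sketch using the google-cloud-compute Python client to create a preemptible VM with a custom machine type. The project, zone, machine sizing, and image are placeholder choices, and the request is trimmed to the fields relevant here:

```python
from google.cloud import compute_v1

# Placeholders: substitute your own project and zone.
PROJECT = "my-project-id"
ZONE = "us-central1-a"

instance = compute_v1.Instance(
    name="batch-worker-1",
    # Custom machine type: 4 vCPUs and 5 GB (5120 MB) of RAM.
    machine_type=f"zones/{ZONE}/machineTypes/custom-4-5120",
    # Preemptible scheduling: deep discount, but the VM can be reclaimed.
    scheduling=compute_v1.Scheduling(preemptible=True),
    disks=[
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/debian-cloud/global/images/family/debian-12",
            ),
        )
    ],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)

client = compute_v1.InstancesClient()
# insert() returns a long-running operation that can be waited on.
operation = client.insert(project=PROJECT, zone=ZONE, instance_resource=instance)
```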

Free-Tier Limitations

GCP Best Practices
1. Ensure visibility using a single-pane-of-glass tool like Stackdriver Monitoring across your projects.
2. Use the resource hierarchy feature of your GCP account to make access more granular.
3. Optimize access management using IAM groups.
4. Manage firewalls to prevent security issues.
5. Optimize image lifecycle rules to secure the compute environment.
6. Use Organizations to unlock a breadth of additional security features.
7. Implement column-level encryption for databases.
8. Use OAuth 2.0 for integrated authorization.
9. Use whitelisting with Cloud IAP for access requests to VMs.
10. Implement Binary Authorization when using Kubernetes.
11. Use Cloud Web Security Scanner before deploying applications.
12. Enable VPC Flow Logs.
13. Segregate resources by project.
14. Limit the use of Cloud IAM primitive roles.
15. Rotate Cloud IAM service account access keys periodically.
16. Ensure firewall rules are not overly permissive.
17. Ensure Cloud Storage buckets enforce appropriate access controls.
18. Ensure Cloud Storage buckets have logging and versioning enabled (see the sketch after this list).
19. Create periodic snapshots of Compute Engine instances.
20. Create periodic backups of Cloud SQL instances.
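
As one concrete step for item 18, bucket versioning can be enabled with the google-cloud-storage Python client. A minimal sketch; the bucket name is a placeholder:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-example-bucket")  # placeholder name

# Enable object versioning so overwritten or deleted objects are kept as
# noncurrent versions and can be recovered.
bucket.versioning_enabled = True
bucket.patch()

print(f"Versioning enabled: {bucket.versioning_enabled}")
```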

AWS Best Practices

1. Account security features: secure access using MFA, strong passwords, password rotation policies, access keys, and the Security Token Service; authenticate compute instances using EC2 key pairs and X.509 certificates for enhanced security.
2. Ensure HTTPS access points.
3. Use security logs for debugging and real-time alerting.
4. Use Trusted Advisor for security checks and other features to optimize infrastructure utilization.
5. Use AWS Config for service configuration security.
6. Ensure that no S3 buckets are publicly readable/writable unless required by the business (see the sketch after this list).
7. Turn on Redshift audit logging to support auditing and post-incident forensic investigations for a given database.
8. Encrypt data stored in EBS as an added layer of security.
9. Encrypt Amazon RDS as an added layer of security.
10. Enable the require_ssl parameter in all Redshift clusters to minimize the risk of man-in-the-middle attacks.
11. Restrict access to RDS instances to decrease the risk of malicious activities such as brute-force attacks, SQL injection, or DoS attacks.
12. Inventory and categorize all existing custom applications deployed in AWS.
13. Involve IT security teams throughout the application development lifecycle.
14. Grant the fewest privileges possible to application users.
15. Enforce a single set of data loss prevention policies.
16. Encrypt highly sensitive data, such as protected health information (PHI) or personally identifiable information (PII), using customer-controlled keys.
17. Periodically move logs from the source to a log-processing system.
18. Run a configuration audit.
19. Don't commit your access keys or credentials.
20. Leverage detective controls to identify potential security incidents.
21. Don't forget to include your mobile apps in an audit.
22. Never use root access keys to request access through APIs or other common methods.
23. It's your responsibility to apply the latest security patches to EC2 instances.
24. Scan your Git repositories and history for AWS keys.
25. Use private subnets with appropriate ACLs for anything that doesn't need to be public.
26. Eliminate blind spots.
27. Use granular permissions and versioning to protect data in S3 buckets.
28. Ensure that changes are properly verified and tracked.
29. Bundle native and third-party tools to create a secure AWS environment.
30. Identify, define, and categorize information assets.
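
As a concrete control for item 6, S3's public access block settings can be enforced per bucket with boto3. A minimal sketch; the bucket name is a placeholder:

```python
import boto3

s3 = boto3.client("s3")

# Block all forms of public access on the bucket (placeholder name).
s3.put_public_access_block(
    Bucket="my-example-bucket",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```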

Conclusion
The best way to estimate costs is to examine the fundamental characteristics of each AWS/GCP product, estimate your usage for each characteristic, and then map that usage to the prices posted on the provider's website.

I hope this article helps you in your Cloud Architect journey. The AWS and GCP logos used in this article are taken from the official websites to make the content more attractive and easier to follow.