AWS Cloud Bytes
United States
Joined Feb 10, 2021
Welcome to AWS Cloud Bytes. The channel brings a weekly summary of new developments, features, and events in the AWS Cloud to keep everyone in sync with upcoming changes, and aims to help everyone build a strong foundation in the AWS cloud platform. It will also feature sessions on understanding and solving AWS certification questions.
Please like, subscribe, and press the bell icon to get notifications for future videos.
AWS VPC Endpoints for S3
This episode is about AWS VPC endpoints and how we can use a Gateway endpoint to connect privately to S3.
LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/
Personal Website: - www.bhaweshkumar.com/
Certifications: - www.youracclaim.com/users/bhawesh-kumar
Udemy profile: www.udemy.com/user/bhawesh-kumar
Instagram: - awscloudbytes
Views: 59
Videos
AWS Elastic Kubernetes Service Deployment using Terraform
Views: 62 • 2 months ago
This episode is about AWS EKS deployment using Terraform LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - awscloudbytes
IAM - Identity & Access Management Policy
Views: 103 • 6 months ago
This episode is about Amazon IAM Policy Types. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - awscloudbytes
Amazon S3 Cloud Storage: Scalable, Secure Data Storage
Views: 114 • 7 months ago
This episode is about Amazon S3 Cloud Storage Service. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - awscloudbytes
All about AWS CloudTrail
Views: 95 • 8 months ago
This episode is about AWS CloudTrail. We discuss some details around CloudTrail followed by a quick demo of how it can be setup. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - awscloudbytes
Load data in a Redshift Cluster
Views: 78 • 10 months ago
This episode discusses importing data from an S3 bucket into a Redshift cluster table. We also do the same with a DynamoDB table as the source; overall, the lab covers two sources, an S3 bucket CSV file and a DynamoDB table, for inserting data into a Redshift table. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracc...
Serverless in the Cloud - Unleashing the Power of AWS
Views: 121 • 11 months ago
This episode discusses AWS serverless services. A correction about Fargate: ECS has Fargate and EC2 launch types, and not the other way around; I misspoke while talking about it. AWS has the ECS and EKS services, and ECS offers Fargate (serverless) and EC2 implementations. Apologies for the lower-resolution video; my regular Mac had issues, so I had to use a new setup and it messed up the reco...
AWS S3 Set up Cross-Region Bucket Replication
Views: 382 • a year ago
This lab will focus on setup of S3 cross-region replication. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - awscloudbytes
How to build and test a basic module using Terraform
Views: 158 • a year ago
This lab will focus on building AWS EC2 instance using Terraform script with a basic VPC module. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - awscloudbytes
Create DynamoDB table with CloudFormation
Views: 626 • a year ago
This lab focuses on using a CloudFormation template to create a DynamoDB table: it explains the template, creates a stack from it, and verifies that all resources are created. The CloudFormation template is in this repo: bitbucket.org/bhawesh-sample-projects/aws-cloudformation-demo/src/master/dynamodb-stack/dynamodb-template.yaml LinkedIn Profile: - www.linkedin.com/in/bh...
CloudFormation 101: Simplifying Infrastructure Deployment
Views: 149 • a year ago
This lab focuses on CloudFormation basics: it creates a stack, updates it, and verifies that all resources are created. Once done, we delete the CloudFormation stack. The CloudFormation template is in this repo: bitbucket.org/bhawesh-sample-projects/misc-scripts/src/master/cloudformation-stack.yml LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.c...
How to create and Assume Roles in AWS
Views: 699 • a year ago
This lab focuses on creating a role and a couple of policies. Once created, we assign them to a user and experiment with the policies; the user will be able to assume the role. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - insta...
How to Create a Web Server with EC2
Views: 9K • a year ago
This lab focuses on creating an EC2 instance and running an Apache (httpd) web server, which we then test over HTTP. We will have SSH access to the EC2 instance for installing the web server. Please enjoy this lab and suggest ideas for future labs. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawe...
How to create and subscribe to an AWS SNS Topic
Views: 7K • a year ago
This lab focuses on creating an SNS topic, subscribing to it via email, creating a CloudWatch alarm, and testing how the alarm publishes notifications to SNS. LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkumar.com/ Certifications: - www.youracclaim.com/users/bhawesh-kumar Udemy profile: www.udemy.com/user/bhawesh-kumar Instagram: - awscloudbytes
AWS Chalice for building serverless apps in Python
Views: 2.2K • a year ago
This lab covers what AWS Chalice is, how to create a Chalice project, some of its key features, how to deploy it on AWS, listening to a DynamoDB stream using Chalice's built-in trigger, and details on its documentation. Sample code repository - bitbucket.org/bhawesh-sample-projects/aws-chalice/src/main/ LinkedIn Profile: - www.linkedin.com/in/bhaweshkumar/ Personal Website: - www.bhaweshkuma...
How to create and configure an AWS Web Application Firewall (WAF)
Views: 6K • a year ago
How to create a VPC Endpoint and S3 Bucket in AWS
Views: 8K • a year ago
AWS Lab - How to Create and Configure a Network Load Balancer in AWS
Views: 2.3K • a year ago
AWS Lab - How to Build Serverless Application With Step Functions, API Gateway, Lambda and S3
Views: 457 • a year ago
AWS Lab - How to Automatically Process Data in S3 using AWS Lambda
Views: 625 • a year ago
AWS Lab - How to Trigger AWS Lambda from Amazon SQS (Publisher/Subscriber)
Views: 473 • a year ago
AWS Lab - How to build AWS VPC Flow Logs for Network Monitoring
Views: 2.3K • a year ago
AWS Lab - How to create a static website using Amazon S3
Views: 440 • a year ago
AWS News on File Cache, CloudWatch Alarms, PHP X-Ray Tracing, Cloud Control API, Amazon Workspaces
Views: 45 • a year ago
AWS News on Elemental MediaLive, Fargate Compute, Panorama PrivateLink, SageMaker model training
Views: 33 • a year ago
AWS News - CloudWatch Metrics Throughput, Lambda pricing, Glue Flex execution, DocumentDB Decimal128
Views: 70 • 2 years ago
AWS News - Lookout for metrics, FedRAMP, RAM, AWS Backup, Auto tuning, Centralized attendee control
Views: 65 • 2 years ago
AWS News - Rekognition, Lex, ElastiCache, SAM CLI, Fargate, Trusted advisor
Views: 26 • 2 years ago
AWS News on Amazon Chime SDK, Glue DataBrew, Lambda PrincipalOrgID
Views: 65 • 2 years ago
Very informative, clear, and concise in the context of DR.
Glad it was helpful! thank you for your comment.
Hi sir, if you share references like the lab files through git, we can practice from our side.
Thanks for your comment. I would have loved to share it, but it was a lab from A Cloud Guru and I don't have access to the repo. I just used the files available in the lab and didn't copy anything locally.
Awesome, Very informative lab
Thank you Santosh
Wow awesome video, great explanation, thank you sir!
Glad it was helpful! thank you for your comment.
Sir how to convert pdf
I didn't get your question, Venky. Can you please elaborate? I will try to answer it. Thank you!
@@AWSCloudBytes sir after completed web server application how to convert pdf file
Does the PDF have to be served by the web server? If so, you will either need a library to generate the PDF and render it in the browser, or serve it from a content delivery location if it is a static resource. Please let me know if this gives some insight; otherwise, please describe in detail what exactly you are trying to build.
@@AWSCloudBytes kk sir
Hi Sir, good morning. Can we enable VPC endpoints for S3 across accounts? Say we have one internal account and multiple external accounts, and the external accounts send data to the internal account. Can we follow the VPC endpoint steps, or is it only possible over the internet? Please help.
This post will help you setup cross account access without going through public internet. repost.aws/questions/QUWEuKonUtSye3lnbO9cFuZw/cross-account-s3-access-without-going-over-internet
YouTube is not showing my reply, so reposting it here. Look at the post in the link below: repost.aws/questions/QUWEuKonUtSye3lnbO9cFuZw/cross-account-s3-access-without-going-over-internet
@@AWSCloudBytes Thank you so much, this is a great help to me. My case is that one internal account and multiple external accounts will transfer data to this single internal account. So for each external account I should go with a separate VPC endpoint, correct?
Apologies for the late response, Balaji. YouTube hides replied comments, so a new response on an existing comment is not visible. To answer your question: yes, that is a good approach; it gives you fine-grained access control, isolation, security, and room for performance optimization. The only caveat is cost and maintenance, so I suggest checking the VPC endpoint charges.
Everything is good, but unnecessary background music
Thank you for your feedback. We have removed background music from all recent videos.
Summary: Disaster recovery involves preparing for and recovering from events to ensure workload functionality. Resiliency, the ability to recover from disruptions, is crucial, with availability being another key component. Disaster recovery addresses events such as natural disasters, technical failures, and human errors. Objectives include assessing business impact, identifying customer impact, and defining a Recovery Point Objective (RPO) and a Recovery Time Objective (RTO): RPO determines data-loss tolerance, while RTO sets acceptable downtime. Key considerations include multi-AZ versus multi-region strategies. AWS offers four disaster recovery strategies:
- Backup and Restore: data backups stored in S3, with potential data loss and the longest recovery times.
- Pilot Light: maintains a minimal setup in the DR region for faster recovery, at a higher cost.
- Warm Standby: keeps some compute instances running in the DR region for quicker recovery, at a higher cost than Pilot Light.
- Multi-Site (Active/Active): both the production and DR regions stay active, with real-time data replication and near-zero data loss, but at the highest cost.
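As a small illustration (my own sketch, not from the video), the RPO/RTO definitions above can be expressed as a simple check; all the durations below are hypothetical:

```python
from datetime import timedelta

def meets_objectives(data_loss, downtime, rpo, rto):
    """True when measured data loss fits within RPO and downtime within RTO."""
    return data_loss <= rpo and downtime <= rto

# Backup-and-restore style example: hourly backups imply an RPO of 1 hour.
ok = meets_objectives(
    data_loss=timedelta(minutes=40),  # data written since the last backup
    downtime=timedelta(hours=2),      # time taken to restore in the DR region
    rpo=timedelta(hours=1),
    rto=timedelta(hours=4),
)
print(ok)  # prints True: the incident stays within both objectives
```

With more aggressive targets (say an RPO of 15 minutes), the same incident would fail the check, which is the kind of gap analysis that pushes you from Backup and Restore toward Pilot Light or Warm Standby.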
Excellent content and explanation!
Thanks Rahul
Web page is not opening for me even after adding the security group and HTTP
I am sorry to hear that. I suggest checking the following:
- Make sure the EC2 instance is running and accessible. You can check the instance status in the AWS Management Console or with AWS CLI commands.
- Ensure that the security group associated with your EC2 instance allows inbound traffic on port 80 (HTTP) from the necessary IP addresses or ranges, and double-check that it is actually applied to the instance.
- Double-check that the network ACL for the subnet also allows inbound traffic on port 80.
- Check the web server status with "sudo systemctl status httpd"; it should show active (running). If it's not running, start it with "sudo systemctl start httpd".
- Log in to the EC2 instance over SSH and test locally with "curl localhost".
- Review the httpd configuration. The main configuration file is usually /etc/httpd/conf/httpd.conf, with additional configuration in files under /etc/httpd/conf.d/. Check these files for errors or misconfigurations.
- Ensure that the firewall (iptables) on your EC2 instance allows inbound traffic on port 80. You can check and modify rules with the iptables command or a firewall management tool.
- Check Apache's access log (/var/log/httpd/access_log) for requests reaching your server. If requests are arriving, the issue is likely Apache's configuration or the website files.
- Inspect Apache's error log (/var/log/httpd/error_log) for error messages; these can provide valuable insight into the cause.
Hopefully one of these helps you resolve the issue.
Also check that you are accessing the public IP over http and not https, as we haven't set up HTTPS in this example.
How can I add/remove IP addresses on an ongoing basis? Meaning I have a live database that determines whether an IP address should be added or removed, without doing it manually?
I don't have your exact scenario, but here is a general approach to dynamically add or remove IP addresses in AWS WAF (Web Application Firewall) based on conditions determined by a live database, using the AWS WAF API and Lambda functions:
1. Lambda function setup: create Lambda functions for adding and removing IP addresses (one function can handle both), with code that interacts with the AWS WAF API.
2. WAF rules and IP sets: set up the WAF rules and an IP set that define the allowed or blocked IP addresses.
3. Triggering: invoke the Lambda functions based on events from your live database, for example via CloudWatch Events rules when specific database events occur.
4. WAF API calls: in the Lambda code, use the AWS SDK or AWS CLI to update the IP set - insert an address to add it, delete an address to remove it.
Here's a simplified example using Python and the AWS SDK (Boto3) in a Lambda function:

    import boto3

    def add_ip_to_waf(ip_address):
        waf_client = boto3.client('waf-regional')  # use 'wafv2' for WAFv2
        ip_set_id = 'your_ip_set_id'
        change_token = waf_client.get_change_token()['ChangeToken']
        waf_client.update_ip_set(
            IPSetId=ip_set_id,
            ChangeToken=change_token,
            Updates=[{'Action': 'INSERT',
                      'IPSetDescriptor': {'Type': 'IPV4', 'Value': ip_address}}])

    def remove_ip_from_waf(ip_address):
        waf_client = boto3.client('waf-regional')  # use 'wafv2' for WAFv2
        ip_set_id = 'your_ip_set_id'
        change_token = waf_client.get_change_token()['ChangeToken']
        waf_client.update_ip_set(
            IPSetId=ip_set_id,
            ChangeToken=change_token,
            Updates=[{'Action': 'DELETE',
                      'IPSetDescriptor': {'Type': 'IPV4', 'Value': ip_address}}])

Note that in classic WAF the change token comes from get_change_token(), not from get_ip_set(), and the Value must be in CIDR notation (e.g. '192.0.2.44/32'). Make sure to replace 'your_ip_set_id' with the actual ID of your IP set. This is a basic example; adapt it to your use case and the version of AWS WAF you are using (classic WAF or WAFv2). Additionally, ensure that your Lambda functions have the necessary IAM permissions to interact with AWS WAF.
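A minimal, hedged sketch of the decision logic inside such a Lambda: the event shape ("action", "ip") is an assumption for illustration, not a real AWS event format, and the returned dict matches the Updates entry shape used by classic WAF's update_ip_set:

```python
def plan_waf_update(event):
    """Map a hypothetical database-change event to a classic-WAF IP set update."""
    action = event.get("action")
    if action not in ("add", "remove"):
        raise ValueError(f"unsupported action: {action!r}")
    return {
        "Action": "INSERT" if action == "add" else "DELETE",
        # classic WAF expects CIDR notation, hence /32 for a single host
        "IPSetDescriptor": {"Type": "IPV4", "Value": event["ip"] + "/32"},
    }

print(plan_waf_update({"action": "add", "ip": "192.0.2.44"}))
```

Keeping this mapping as a pure function makes it easy to unit-test without AWS credentials; the Lambda handler then just passes the result into update_ip_set.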
Can I know the pricing of creating an SNS topic and a subscription to that topic where the protocol is email-JSON, to receive bounce notifications?
Creating an SNS topic is typically free; charges are based on factors such as the number of messages published and delivered to subscribed endpoints. Subscribing an endpoint (e.g., email) and handling bounce notifications may incur charges. The AWS page below provides all the pricing info, and you can also use calculator.aws/ to add the service and its features for an estimate. aws.amazon.com/sns/pricing/
Hello, I decided to implement disaster recovery in an AWS cloud project for my cyber security post-graduation at CDAC. I need help implementing the disaster recovery architecture diagram from the start. Please provide the steps/links to implement it properly (EC2, VPC, load balancer creation, and so on) and to see the final implementation as a diagram.
Hi Sayali, I didn't understand your requirement. Do you need an architectural diagram to depict DR in your organization? I can certainly help you build it. I have used tools such as Gliffy (paid), Draw.io (free), and Visio (paid). I suggest using Draw.io to depict your AWS regions, Route 53, resources in both regions under VPC public and private subnets, and EC2 instances in public or private subnets in each region as per your architecture.
how can i apply this firewall to my own website on localhost?
If you want to simulate the use of AWS Web Application Firewall (WAF) for your website on localhost (your local development environment), you won't be able to use AWS WAF directly, since it's a cloud-based service. However, you can emulate certain aspects of web application security using local tools. A simplified approach:
- Use a local firewall tool: on Windows, the built-in Windows Defender Firewall or third-party tools like ZoneAlarm; on macOS, the built-in PF firewall or a third-party tool like Little Snitch; on Linux, iptables or a higher-level tool like UFW.
- Define firewall rules: allow and deny traffic based on specific criteria. For example, block certain IP addresses, limit the rate of requests, or filter out requests with specific patterns.
- Follow web application security best practices: embrace secure coding to prevent common vulnerabilities (SQL injection, XSS, etc.) and consider security headers like Content Security Policy (CSP) to mitigate cross-site scripting attacks.
- Test: regularly scan your application for security vulnerabilities with tools like OWASP ZAP or Burp Suite, and simulate different types of attacks to ensure your measures are effective.
- Analyze logs: implement logging in your application and review the logs regularly, looking for unusual patterns or potential security incidents.
Remember, while this local setup can help you test some security measures, it's not a substitute for a comprehensive cloud-based WAF like AWS WAF once your application is deployed in production. When deploying to AWS, you can configure AWS WAF to provide protection at the application layer.
If your web application is running on an EC2 instance and you want to implement a firewall or security measures similar to AWS WAF, you can combine instance-level firewall settings, security groups, and third-party security solutions. Here's a general guide:
- Security groups: leverage AWS security groups, which act as a virtual firewall for your EC2 instances. Configure inbound and outbound rules based on your application's needs - for example, restrict inbound traffic to HTTP (port 80) or HTTPS (port 443) only.
- Network ACLs: additionally, consider Network Access Control Lists (NACLs) at the subnet level for another layer of control over inbound and outbound traffic.
- Web server configuration: secure your web server configuration, disable unnecessary services and features, and ensure your server handles common security headers appropriately.
- Web application firewall: consider a host-based WAF on the instance for protection against common web application attacks, such as ModSecurity integrated with Apache or Nginx.
- IDS/IPS: implement an intrusion detection/prevention system to monitor and block potentially malicious activity; tools like Snort can analyze network traffic and detect suspicious patterns.
- Monitoring and logging: use CloudWatch Logs to collect logs from the instance and monitor them for signs of security incidents; regularly audit your server and application for vulnerabilities - Amazon Inspector can assist with automated security assessments.
@@AWSCloudBytes Thank you so much
How did S3 show up in the public instance? While creating the S3 bucket, we blocked all access, right?
To access S3 from a public EC2 instance, the following is required. You are right that the S3 bucket created in the lab blocked all public access, but not access from a trusted entity. Accessing Amazon S3 from a public jump box in AWS involves a few steps:
1. Jump box / public EC2 setup: launch an EC2 instance in a public subnet to serve as your jump box, and ensure its security group allows inbound SSH traffic (port 22) from your IP address or a specific range.
2. IAM role setup: create an IAM role with the necessary permissions to access the S3 bucket. Attach a policy like AmazonS3ReadOnlyAccess if you only need read access, and customize the policies based on your requirements. While creating the role, specify the EC2 service as the trusted entity that can assume it.
3. Attach the IAM role to the jump box, either during instance launch or by modifying the instance details afterwards.
4. SSH into the jump box using the private key of the key pair chosen when launching the instance.
5. Configure the AWS CLI: install it if it's not already present, and run `aws configure` to set the default region and output format. No access keys are needed here, since credentials come from the attached instance role.
6. Access S3 from the jump box with AWS CLI commands, for example: aws s3 ls s3://your-bucket-name
It's good practice to use IAM roles with the principle of least privilege, granting only the permissions necessary for the task. If you need write access or more specific permissions, adjust the IAM policies accordingly.
Also, consider setting up AWS Systems Manager Session Manager as an alternative to direct SSH access, as it provides a secure and auditable way to access your EC2 instances without exposing SSH ports to the internet.
@@AWSCloudBytes thanks, you could have simply said the policy is attached to the public ec2 instance.
Good one, but your tone is like a student who has been asked by a teacher to recite a poem and is doing it with little interest. Try to bring some energy and enthusiasm while talking, as your content is good.
Thank you Gaurav for your input, I will keep that in mind for future videos.
Brother, if you're going to teach, then teach properly... just speaking with an accent doesn't do it.
Thanks for your comment, Managalam. Please let me know what was missed or what you wanted covered.
Awesome... thanks for putting effort
Thank you Binod, I am happy that it is helpful.
helped tysm
You are welcome, thank you.
In which software do we need to create this firewall?
If I understand your question correctly, you are asking which services can use WAF. AWS WAF can be deployed on Amazon CloudFront, the Application Load Balancer (ALB), Amazon API Gateway, and AWS AppSync. WAF itself is an AWS Cloud service.
This is an awesome 😎💯 right pathway channel You're doing great job of sharing back to the community 👍
Thank you for your appreciation. I am happy that it is useful
Thanks sir, it really helped. Please make more such videos on AWS services!
Thank you for your appreciation. I will post more soon.
Reduce the volume of the background music
Thank you for your feedback, I will do that for the future videos.
Wonderful.! thank you so much.!
Glad you liked it!
Hi, how do I create a private EC2 instance without a public IP?
To create a private EC2 instance without a public IP in AWS, follow these steps:
1. Launch a VPC (Virtual Private Cloud): if you don't have one already, create it, and ensure it has private subnets (subnets without a route to an internet gateway) where you want to deploy the instance.
2. Create a security group that allows the inbound and outbound traffic your instance needs; it will be associated with the private instance.
3. Launch the EC2 instance, selecting the appropriate VPC and subnet during the launch process.
4. Choose an Amazon Machine Image (AMI) with the desired operating system and software configuration.
5. Configure instance details: choose the private subnet from the "Network" dropdown, make sure auto-assign public IP is disabled, and optionally specify a private IP address or let AWS assign one from the subnet's range.
6. Add storage as per your requirements, and optionally add tags for better management and organization.
7. Select the security group you created earlier.
8. Review your settings and click "Launch".
The instance will have no public IP address and will be accessible only from within the private subnet, or from other resources within the VPC with appropriate permissions. If you need to access it, use a bastion host or a VPN solution to connect securely into the private subnet.
Wow, the video did help me👏🏿
Thank you for your input, glad it was helpful.
Multi-site active/active: are both servers read-write?
It depends on your setup; there are usually three patterns:
1. Read locally, write to one region: in a two-or-more-region setup, you read from your nearest region but all writes go to a single designated region, which then replicates to the other regions for reads. For example, a user in India reads from the Asia Pacific region while all writes go to one primary region (perhaps Europe). This is how a typical multi-region Aurora (RDS) setup works.
2. Multi-writer: every region accepts writes and the regions sync with each other. This is how DynamoDB global tables work.
3. Read local, write home: a user is profiled in one region, and even when they travel to another region their writes still go to their initial (home) region.
Hope this helps.
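A toy sketch of those three routing patterns, purely illustrative; the region names, pattern labels, and function are my own and not AWS API values:

```python
def route(op, user_region, home_region, pattern, writer_region="eu-west-1"):
    """Return the region an operation should target under each pattern."""
    if pattern == "single-writer":   # read locally, write to one region (Aurora-style)
        return user_region if op == "read" else writer_region
    if pattern == "multi-writer":    # every region accepts writes (DynamoDB global tables)
        return user_region
    if pattern == "write-home":      # writes follow the user's home region
        return user_region if op == "read" else home_region
    raise ValueError(f"unknown pattern: {pattern!r}")

# A user currently in us-east-1 whose profile lives in ap-south-1:
print(route("write", "us-east-1", "ap-south-1", "single-writer"))  # eu-west-1
print(route("write", "us-east-1", "ap-south-1", "multi-writer"))   # us-east-1
print(route("write", "us-east-1", "ap-south-1", "write-home"))     # ap-south-1
```

Reads are served locally in all three patterns; the patterns differ only in where writes land, which drives the replication and conflict-resolution trade-offs.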
Thanks for the video, what AMI did you select for your test web servers?
I used the Amazon Linux 2 AMI for the web servers.
Awesome
Thank you Biswajit
This is so smooth. Thank you so much for sharing your knowledge. You IT content creators on YouTube really are modern heroes.
Thank you Zigs for kind words.
Nice demonstration, but it would be much easier to understand if you had shown a simple architecture diagram at the beginning of the demo configuration.
Thanks Chandrashekar, I will try to keep this in mind for the future videos.
How do you switch to the new user from the root user? It would help if that was explained clearly 🎉🎉🎉❤❤
If I understood correctly, you want to switch from the root user to a new user. To create a new user account from the root user in Linux, follow these steps:
1. Open a terminal or log in to the Linux system as the root user.
2. Create the account: adduser newuser (or useradd newuser).
3. Set a password for the new user: passwd newuser (you will be prompted to enter and confirm it).
4. Set the full name if required: usermod -c "Full Name" newuser
5. Add the user to a group if required: usermod -aG groupname newuser (replace "groupname" with the actual group name).
6. Switch from root to the new user: su - newuser
7. Verify with the whoami command.
I hope this helps; otherwise ping me with additional details.
Thank you bro, you saved my time and energy❤️❤️❤️
Good to hear that it was helpful, Thank you Shaik
It would be helpful if you shared the Athena query.
Hey Priya, I have updated the description to include the CloudWatch filter pattern, the Athena table, and the Athena query:

    SELECT day_of_week(from_iso8601_timestamp(dt)) AS day,
           dt, interfaceid, sourceaddress, destinationport, action, protocol
    FROM vpc_flow_logs
    WHERE action = 'REJECT' AND protocol = 6
    ORDER BY sourceaddress
    LIMIT 100;
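For a local sanity check, here is a hedged Python equivalent of what that query selects (REJECT actions on protocol 6, i.e. TCP, ordered by source address); the sample flow-log records below are made up:

```python
# Hypothetical flow-log records mirroring the vpc_flow_logs columns.
records = [
    {"dt": "2023-05-01T10:00:00Z", "sourceaddress": "203.0.113.9",
     "destinationport": 22, "action": "REJECT", "protocol": 6},
    {"dt": "2023-05-01T10:00:01Z", "sourceaddress": "198.51.100.7",
     "destinationport": 443, "action": "ACCEPT", "protocol": 6},
    {"dt": "2023-05-01T10:00:02Z", "sourceaddress": "192.0.2.5",
     "destinationport": 80, "action": "REJECT", "protocol": 17},
]

# WHERE action = 'REJECT' AND protocol = 6 ORDER BY sourceaddress
rejected_tcp = sorted(
    (r for r in records if r["action"] == "REJECT" and r["protocol"] == 6),
    key=lambda r: r["sourceaddress"],
)
print([r["sourceaddress"] for r in rejected_tcp])  # ['203.0.113.9']
```

Only the rejected TCP flow survives the filter; the UDP reject (protocol 17) and the accepted TCP flow are dropped, matching what the Athena query returns.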
Which password does it ask for?
To SSH into an EC2 instance without a password, you can use SSH key-based authentication. Here's how to set it up:
1. Create a key pair on your local machine: ssh-keygen -t rsa — this creates a private key at ~/.ssh/id_rsa and a public key at ~/.ssh/id_rsa.pub.
2. Install the public key on the instance by adding it to ~/.ssh/authorized_keys there. Using the key pair you received when launching the instance, you can run: ssh-copy-id -i ~/.ssh/id_rsa.pub ec2-user@<your-instance-public-ip> — or append the contents of id_rsa.pub to ~/.ssh/authorized_keys on the instance manually.
3. On the instance, make sure the permissions are correct: chmod 700 ~/.ssh and chmod 600 ~/.ssh/authorized_keys
4. Log out, then log back in using your private key (note: the private key, not the .pub file): ssh -i ~/.ssh/id_rsa ec2-user@<your-instance-public-ip>
You should now be able to SSH into your EC2 instance without a password using the private key.
======== To use a password ========
By default, EC2 instances on Amazon Web Services (AWS) are not set up with a password for logging in as the root user via SSH; instead, you would typically use SSH key-based authentication. However, if you need to set a password for the root user, follow these steps:
1. Log in to your EC2 instance using SSH key-based authentication.
2. Run sudo passwd root, enter a new password when prompted, then confirm it.
3. Password logins over SSH are disabled by default on Amazon Linux, so you would also need to enable PasswordAuthentication in /etc/ssh/sshd_config and restart sshd.
4. Log out of the instance, then log back in as root with the new password.
Note that setting a password for the root user can be a security risk, as it opens up the possibility of brute-force attacks and other issues. It is generally recommended to use SSH key-based authentication instead.
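Putting the key-based flow together, here is a minimal local sketch. The instance address 203.0.113.10 and the ec2-user account are placeholders, so the actual ssh-copy-id and ssh calls are shown as comments (they need a live instance); the manual authorized_keys steps are simulated in a temp directory to show what ssh-copy-id does:

```shell
set -e
demo=$(mktemp -d)

# 1. Generate a key pair: private key "ec2_demo" plus public key "ec2_demo.pub"
#    (no passphrase here, for brevity).
ssh-keygen -t rsa -b 2048 -f "$demo/ec2_demo" -N "" -q

# 2. On the instance, the PUBLIC key must land in ~/.ssh/authorized_keys
#    with tight permissions. ssh-copy-id automates this:
#      ssh-copy-id -i "$demo/ec2_demo.pub" ec2-user@203.0.113.10
#    Done by hand, it is equivalent to (simulated locally here):
mkdir -p "$demo/fake_remote/.ssh"
cat "$demo/ec2_demo.pub" >> "$demo/fake_remote/.ssh/authorized_keys"
chmod 700 "$demo/fake_remote/.ssh"
chmod 600 "$demo/fake_remote/.ssh/authorized_keys"

# 3. Log in with the PRIVATE key (never pass the .pub file to -i):
#      ssh -i "$demo/ec2_demo" ec2-user@203.0.113.10
ssh-keygen -lf "$demo/ec2_demo.pub"   # prints the key fingerprint
```

The common mistake in step 3 is pointing -i at the .pub file; -i always takes the private key.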
Thank you very much sir, it was very informative
Glad it helped
Even after updating the IP set, the page with the WAF test is not appearing. Please help.
Thanks for writing, Shashi. Let me check on it; I think you may have missed a step.
Hey Shashi, are you able to get your IP correctly? Once you have it in the set, are you selecting Allow for the rule action? I may be able to help if you can give me some details of the steps you are following.
@AWS Cloud Bytes I have re-entered the correct IP and added the 2 rules you used. I wonder whether the region affects how it works.
Now I am getting a 403 Forbidden error
You have made an excellent description and it was quite helpful, even though I already know AWS services. I will subscribe to gain more knowledge from your videos.
Glad it was helpful! Thank you Vimal
Can you tell us how you accessed the private instance through the terminal? I can't do that. Also, where did that password come from and how did you create it? When I tried the same command as you, ssh cloud_user@<public-ip>, it shows permission denied.
Apologies for the late response. To reach the private instance, you first have to SSH to the public-facing instance, which is also called a jump box or bastion host. Once you SSH to this public EC2 instance, you can then use the private IP of the EC2 instance in the private subnet to SSH to it. Because the SSH session on the public instance puts you inside the network, access to the private instance is possible.
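For reference, this two-hop setup can be sketched with OpenSSH's ProxyJump option. The host names, IPs, user, and key path below are placeholders, not values from the video:

```
# ~/.ssh/config — jump through the bastion to the private instance
Host bastion
    HostName 203.0.113.10           # public IP of the jump box (placeholder)
    User ec2-user
    IdentityFile ~/.ssh/demo_key    # hypothetical key path

Host private-ec2
    HostName 10.0.2.25              # private IP inside the VPC (placeholder)
    User ec2-user
    ProxyJump bastion
    IdentityFile ~/.ssh/demo_key
```

With this in place, a single `ssh private-ec2` tunnels through the bastion automatically instead of requiring two manual SSH hops.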
Thanks for your videos, they're great! I have a question. Why use a NAT gateway when you already have a public route table for the public subnet?
The NAT Gateway is used by private instances to communicate with the internet; you may be running software on the private-subnet EC2 instance that needs updates. The safest way to get them is through the NAT gateway, which allows traffic to flow from your VPC to the internet and receive responses, but does not allow traffic to be initiated from the internet to these machines. The public-subnet EC2 instance is just an example of a jump box, which doesn't need a NAT Gateway for public internet access because its route table already has a route to the internet gateway. Anyone with SSH credentials can SSH to an EC2 instance, provided SSH is enabled on it and the routes are open for their incoming IP address; generally you lock down which IPs can make SSH requests to your public machines. I hope this helps.
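To illustrate the difference, here is a sketch of the two route tables. The CIDR block and the gateway IDs are placeholders, not values from the video:

```
# Public subnet route table
10.0.0.0/16   -> local          # intra-VPC traffic
0.0.0.0/0     -> igw-...        # internet gateway: two-way internet access

# Private subnet route table
10.0.0.0/16   -> local          # intra-VPC traffic
0.0.0.0/0     -> nat-...        # NAT gateway: outbound-only internet access
```

Note that the NAT gateway itself lives in the public subnet, so its own traffic egresses through the internet gateway; only the private subnet's default route points at it.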
@@AWSCloudBytes Thanks for the response. I appreciate it.
I think most people are looking for short, useful videos about AWS labs similar to this one. Well done. I'm waiting for the next video 😅
Thank you Alaa, I will get the next one out soon.
What software are you using to create the network diagram? Thanks for a great video!
It's the open-source Draw.io; you can download the desktop version or use it from the website itself.
thanks for the information
Always welcome, Thanks Simo
Thanks a lot
Thanks Satish
Great video! Do you have the code for this posted somewhere? Github etc.
Thanks for your comment. I have uploaded the code to a Git repo and added it to the video description. Please find the link below for your reference: github.com/bhaweshkumar/demo-code/tree/main/serverless-app
LIKE 👍👍👍👍👍👍👍 💕 💕 💕💕 🤩🤩🤩🤩
Thank you
What about on-premises to cloud disaster recovery?
It is a good question, Suhail. We can sync to the cloud using CloudEndure Disaster Recovery; continuous replication or snapshots can be set up according to the RTO and RPO needs of the organization. This URL from AWS provides a good amount of detail about on-prem DR: docs.aws.amazon.com/prescriptive-guidance/latest/backup-recovery/on-prem-dr-to-aws.html
@@AWSCloudBytes thanks for the reply
Hi AWS Cloud Bytes, I have one query: does AWS recovery provide versioning of data files? The reason I ask: let's assume ransomware attacks the production site as well as the DR site; how could I recover from that situation?
aws.amazon.com/cloudendure-disaster-recovery/ransomware_recovery/
Hope this helps with your query. No industry is immune to ransomware attacks. While there are different forms of ransomware, the most common one involves locking or encrypting a person's or company's data and then demanding a ransom to restore access. AWS offers CloudEndure Disaster Recovery, which can be used for ransomware recovery. CloudEndure Disaster Recovery can launch unlocked and unencrypted versions of your servers from before the ransomware attack into your preferred AWS Region. This point-in-time recovery capability protects your data and enables you to be back up and running in minutes after a ransomware attack, without having to pay the ransom.
Nice one .. expecting more from you ..
Thank you, Mukesh. I was on vacation; you can expect more in the coming week or so.
Hi
Hi Sukhanth