sumit kumar
India
Joined Oct 15, 2016
Hi Everyone,
You will find many AWS real-time scenario use cases on my channel.
I will keep creating videos on AWS services, Python, and other technical topics that will surely help you.
Let me know in the comment section if you want me to create a video on AWS or Python.
Connect with me on LinkedIn:
www.linkedin.com/in/sumit-kumar-877777ba/
Mobile number: +918147085086
How to Set Up Databricks Repo for Git Integration | Step-by-Step Guide
Looking to streamline your development workflow in Databricks? In this tutorial, I’ll show you how to set up Databricks Repo for Git Integration step by step! We’ll cover the complete process from configuring the Git provider to managing your code directly from Databricks. Whether you’re using GitHub, GitLab, or Azure DevOps, this guide will walk you through the essential setup so you can start collaborating with your team seamlessly.
🔹 What you'll learn:
How to connect your Git repository to Databricks
Version control with Databricks Repo
Committing, pushing, and pulling code in Databricks
Tips to manage Databricks notebooks with Git
Views: 131
Videos
How to Create a FREE Databricks Account on Azure Cloud | Step-by-Step Guide
111 views · 1 month ago
Want to start exploring Databricks on Azure for free? 🚀 In this video, I’ll show you how to create a Databricks account on Azure Cloud with a free trial, step by step! Perfect for beginners who want to get started with data engineering, machine learning, or big data analytics using Databricks. Make sure to follow along and subscribe for more cloud tutorials! 🌩️ #Databricks #AzureCloud #FreeTria...
How to Enable & Browse DBFS in Azure Databricks
223 views · 1 month ago
Learn how to quickly enable and browse DBFS (Databricks File System) in Azure Databricks! 🌐 Whether you're managing data or just getting started with Databricks, this quick guide will show you how to navigate your storage with ease. 🚀 Perfect for cloud engineers and data enthusiasts looking to simplify their data workflows! #AzureDatabricks #DBFS #TechTutorial
How to Change AWS Lambda Runtime in 30 Seconds! 🚀 #AWS #lambda #shorts
191 views · 1 month ago
Need to update your AWS Lambda function runtime? 🖥️ In this quick tutorial, I’ll show you how to change the runtime of your AWS Lambda function in less than 60 seconds! Perfect for beginners or anyone needing a fast refresh! 🚀 Watch, learn, and boost your AWS skills! #AWS #Lambda #CloudComputing
AWS OpenSearch Quick Start Guide from Scratch | Free Tier Tutorial
180 views · 3 months ago
Welcome to our AWS OpenSearch Quick Start Guide! In this video, we'll walk you through setting up and using AWS OpenSearch from scratch, all on the Free Tier. Whether you're a beginner or looking to expand your cloud computing skills, this tutorial covers everything you need to get started with AWS OpenSearch. We'll cover: What is AWS OpenSearch? Setting up an OpenSearch domain Configuring acce...
Understanding the Terraform Lookup Function: A Complete Guide
61 views · 3 months ago
Welcome to our in-depth guide on the Terraform lookup function! In this video, we'll explore one of Terraform's most powerful functions and show you how to use it effectively in your infrastructure as code projects. What You'll Learn: Introduction to Terraform and its core concepts Detailed explanation of the lookup function and its syntax Practical examples of using the lookup function in rea...
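As a rough illustration of the syntax this video covers (variable names and values below are made up): lookup(map, key, default) returns the value for the key, or the default when the key is missing.

```hcl
# Hypothetical map of per-environment instance types.
variable "instance_types" {
  type = map(string)
  default = {
    dev  = "t2.micro"
    prod = "t3.large"
  }
}

output "selected_type" {
  # "staging" is not a key in the map, so the default "t2.small" is returned.
  value = lookup(var.instance_types, "staging", "t2.small")
}
```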
Mastering Terraform Functions: A Complete Guide for Beginners
51 views · 3 months ago
Welcome to our comprehensive guide on Terraform functions! In this video, we'll take you through everything you need to know to get started with Terraform functions, from basic concepts to advanced usage. What You'll Learn: Introduction to Terraform and its core concepts Detailed explanation of Terraform functions and their importance How to use built-in functions for string manipulation, numer...
Master Terraform Map Variables: Key to Efficient Infrastructure Coding!
49 views · 3 months ago
Dive deep into the world of Terraform map variables in this detailed tutorial! Map variables are powerful tools in Terraform that allow you to manage related values with ease. This video is designed for both beginners and experienced users who want to enhance their understanding and utilization of map variables in Terraform. 🔸 What You’ll Learn: Fundamentals of map variables in Terraform How t...
Mastering Terraform Lists: Efficiently Manage Multiple Values!
37 views · 3 months ago
🚀 Welcome to our deep dive into Terraform Lists! In this tutorial, we'll explore how to define, manipulate, and utilize lists in Terraform to streamline your infrastructure configurations. Whether you're a beginner or looking to enhance your skills, this video is your ultimate guide to mastering list variables in Terraform. 🔹 What You'll Learn: How to define list variables in Terraform Practic...
Terraform Variable Validation Tutorial: Ensuring Correct Values in Your Configuration
31 views · 4 months ago
In this video, we will delve into Terraform variable validation, a crucial feature to ensure your infrastructure configuration values are correct. We'll cover: How to define variables with validation in Terraform Adding custom validation conditions to variables Practical examples demonstrating variable validation Tips for writing effective validation conditions Whether you're new to Terraform o...
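A minimal sketch of what such a validation block can look like (the variable name and allowed values here are made up for illustration):

```hcl
variable "environment" {
  type        = string
  description = "Deployment environment"

  validation {
    # contains() checks membership; the error_message is shown on failure.
    condition     = contains(["dev", "stage", "prod"], var.environment)
    error_message = "environment must be one of: dev, stage, prod."
  }
}
```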
Terraform Variables Tutorial: Defining Types and Default Values
21 views · 4 months ago
Terraform Variables Tutorial: Mastering Input and Output Variables
36 views · 4 months ago
Welcome to our Terraform tutorial series! In this video, we will dive deep into Terraform variables, focusing on both input and output variables. Understanding how to use variables in Terraform is crucial for writing flexible and reusable infrastructure as code. In this comprehensive guide, you will learn: - What Terraform variables are and why they are important - How to declare and use input ...
Terraform Tutorial: Using Output Variables in a Hello World Example
35 views · 4 months ago
Create an AWS S3 Bucket with Terraform: Step-by-Step Tutorial
269 views · 4 months ago
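A minimal Terraform sketch for the bucket resource this tutorial covers (the region and bucket name are placeholder assumptions; S3 bucket names must be globally unique):

```hcl
terraform {
  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
  }
}

provider "aws" {
  region = "us-east-1" # assumed region
}

resource "aws_s3_bucket" "demo" {
  bucket = "my-unique-demo-bucket-name" # hypothetical name
}
```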
Mastering Databricks: How to Pass Parameters in Workflows|Pyspark|Databricks|workflow|python
981 views · 5 months ago
Create Workflow for Notebooks in Databricks: Step-by-Step Guide |Databricks|pyspark|python|notebook
161 views · 5 months ago
Master Databricks Widgets: Simplifying Your Workflow with dbutils.widgets
149 views · 5 months ago
How to Schedule AWS Lambda Functions Using EventBridge | Step-by-Step Guide #aws #lambda #python
193 views · 5 months ago
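A rough boto3 sketch of the scheduling flow this video covers (the rule name, interval, and function ARN are placeholders, and the lambda add-permission step that lets EventBridge invoke the function is omitted for brevity):

```python
def rate_expression(minutes: int) -> str:
    """Build an EventBridge rate() schedule expression string."""
    unit = "minute" if minutes == 1 else "minutes"
    return f"rate({minutes} {unit})"


def schedule_lambda(function_arn: str, rule_name: str, minutes: int) -> None:
    """Create an EventBridge rule and attach the Lambda function as its target."""
    import boto3  # imported lazily so rate_expression() works without AWS deps

    events = boto3.client("events")
    events.put_rule(Name=rule_name, ScheduleExpression=rate_expression(minutes))
    events.put_targets(
        Rule=rule_name,
        Targets=[{"Id": "lambda-target", "Arn": function_arn}],
    )
```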
How to Trigger a SageMaker Jupyter Notebook File from AWS Lambda using WebSockets
582 views · 5 months ago
New PySpark Feature: Run SQL Queries Directly on DataFrames Without using Temp Views|Pyspark|sql
118 views · 5 months ago
How to Call One Databricks Notebook from Another: Step-by-Step Guide
137 views · 5 months ago
Effortlessly Create Your First Compute Cluster & Databricks Notebook!
197 views · 8 months ago
Getting Started with Databricks Community Edition: A Step-by-Step Guide
262 views · 8 months ago
Mastering Python: =, ==, vs is and Mutable vs Immutable | Understanding Variables in Python
80 views · 8 months ago
Mastering Airflow: Effortlessly Connect to PostgreSQL | Airflow Connection to Postgres #airflow
1.7K views · 9 months ago
Python Interview Question | LeetCode Python Tutorial : Fibonacci Number Problem
72 views · 9 months ago
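One common answer to this interview question is the iterative solution, sketched below (assuming the usual convention fib(0)=0, fib(1)=1; the video's own approach may differ):

```python
def fib(n: int) -> int:
    """Iterative Fibonacci: O(n) time, O(1) space."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a


print([fib(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```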
Python Interview Question: Find the Maximum Time from a Four-Digit Number|python |coding interview
118 views · 10 months ago
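One way to solve this problem (a sketch; the video's exact code may differ) is to try every permutation of the four digits and keep the latest valid 24-hour time:

```python
from itertools import permutations


def largest_time(digits: list[int]) -> str:
    """Return the latest valid 24-hour time 'HH:MM', or '' if none exists."""
    best = -1
    for h1, h2, m1, m2 in permutations(digits):
        hours, minutes = h1 * 10 + h2, m1 * 10 + m2
        if hours < 24 and minutes < 60:
            best = max(best, hours * 60 + minutes)
    return "" if best < 0 else f"{best // 60:02d}:{best % 60:02d}"


print(largest_time([1, 2, 3, 4]))  # 23:41
print(largest_time([5, 5, 5, 5]))  # '' -- no valid time can be formed
```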
Python Interview Question: LeetCode Python Tutorial: Two Sum Problem|python|leetcode
64 views · 10 months ago
Python Interview Question:Find the pair with given number in a list|Two sum problem|leetcode|python
116 views · 10 months ago
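The standard one-pass hash-map solution to Two Sum, as a sketch (the videos' exact code may differ):

```python
def two_sum(nums: list[int], target: int) -> list[int]:
    """Return indices of the two numbers adding up to target, in O(n)."""
    seen = {}  # value -> index of where it was seen
    for i, n in enumerate(nums):
        if target - n in seen:
            return [seen[target - n], i]
        seen[n] = i
    return []  # no pair found


print(two_sum([2, 7, 11, 15], 9))  # [0, 1]
```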
Python Interview Question: Reverse Words in a Sentence with Code Examples|String operation|regex
51 views · 10 months ago
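A short sketch of the regex-based approach to reversing word order (the video's exact code may differ):

```python
import re


def reverse_words(sentence: str) -> str:
    # \S+ matches runs of non-space characters, so repeated spaces between
    # words are tolerated; [::-1] reverses the word order.
    return " ".join(re.findall(r"\S+", sentence)[::-1])


print(reverse_words("the sky  is blue"))  # blue is sky the
```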
Thank you
Thanks please subscribe
Thanks, it's useful
@@sreenugangadevi9975 thanks. Please subscribe
wow thanks for this :)
@@chandankumarthakur08 thanks😄
@sumit kumar - I followed the same steps as you did, but I still got an error while connecting to SFTP using WinSCP. It says "Access Denied". What could be the reason? I created the bucket role, the keys are spelled correctly (case-sensitive), and I entered the correct password in WinSCP. Can you please help here?
@@syedsimra Maybe it's a role/policy issue.
@@sumitkumar2955 Actually, I fixed the issue. It looks like this video is outdated. The updated published documentation gave me the fix: in the old post the secret was in the format SFTP/username, but the new post gives the correct updated format, aws/transfer/server-id/username.
@@syedsimra yes, this is very old video. Maybe I have to create a new video. But thanks for the information. It will help others. 🙏👍
Has the DBFS browser been disabled forever? I can't access this option. 😢
@@fisicateca17 Maybe you don't have the admin role.
Brother, why is this DBFS setting not showing in my account?
Naveed Bhai, you may not have an admin account.
@@sumitkumar2955 What should I do?
What should I do? I'm using Community Edition @@sumitkumar2955
Hi @Sumit Kumar. This is working with notebook instances, but not with SageMaker Studio JupyterLab notebooks. Can you please help?
@@ranjeethrikkala6344 Sure, I will check and let you know.
@@sumitkumar2955 Hi Sumit. Have you found a solution for the SageMaker Studio notebook? Also, the above code does not work when the notebook instance is stopped, which requires us to start it manually; in that case the trigger-based automation is not achieved.
Hi, the command docker-compose up -d --no-deps --build airflow-webserver airflow-scheduler recreates the containers (you can see this from the container IDs). That means when I want to add a requirement, all my data in the containers will be lost! What is the solution?
I followed along with the video, but the DAGs are not visible in the UI. Could you help me out?
@@YashKumarJain-v5c Thanks for watching. Please check for a permission or role issue. If that doesn't work, ping me on WhatsApp: 8147085086
@@sumitkumar2955 Thank you for the reply. I solved the issue; I was using an already set-up VPC and NAT was not enabled.
Thanks for the video. I'm receiving an error while testing the Lambda function: "An error occurred (ValidationException) when calling the StartNotebookInstance operation: Status (InService) not in ([Stopped, Failed]). Unable to transition to (Pending) for Notebook Instance". Please help me.
You need to start the notebook instance. Maybe use another Lambda to start it first if you want it fully automated.
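To avoid the ValidationException above, a Lambda can check the instance status before calling StartNotebookInstance. A hedged boto3 sketch (the notebook name "my-notebook" is a placeholder):

```python
# Per the error message above, StartNotebookInstance only succeeds
# from these states.
STARTABLE = {"Stopped", "Failed"}


def can_start(status: str) -> bool:
    """True if the notebook instance is in a state that allows starting it."""
    return status in STARTABLE


def lambda_handler(event, context):
    """Start a SageMaker notebook instance only if it is startable."""
    import boto3  # lazy import so can_start() is usable without AWS deps

    sm = boto3.client("sagemaker")
    name = "my-notebook"  # placeholder instance name
    status = sm.describe_notebook_instance(
        NotebookInstanceName=name
    )["NotebookInstanceStatus"]
    if can_start(status):
        sm.start_notebook_instance(NotebookInstanceName=name)
        return {"started": True}
    return {"started": False, "status": status}
```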
Can we run the Lambda function without opening the terminal? I am only able to run it if I open the terminal; then the notebook run is successful. If I don't open the terminal, the Lambda function still succeeds, but the SageMaker notebook does not run.
"Can you run the Lambda function without opening the terminal" - what does that mean? Could you please tell me which terminal you are opening and where you are running the Lambda?
Hi, I found out the hard way that the terminal needs to be running as well for the code to work, which is pretty expensive. I moved to a lifecycle configuration using nohup and it's working. I use Lambda to start and stop the instance.
Thanks, Mate. It helped
@@rithishkonduri508 happy to know 😊
Thanks bro 🧡
@@shubhampoul1643 please subscribe and share with your friends. Thanks 😊👍
Please sir, add the code sheet in the description.
Sure, but have I done any coding in this video?
Does this start the notebook instance by itself, or do we have to run it while the instance is running?
Yes, we have to run this while the instance is running. Thanks.
Commendable video, thanks a lot. Would appreciate it if the missing steps could be documented in the doc as well.
Thanks 🙏 I believe I have updated all the steps, but I will check and update the blog below: deltafrog.net/trigger-sagemaker-jupyter-notebook-file-from-aws-lambda/
Thank you so much! Proven it works!
Happy that it helped you 😃
@SumitKumar Getting the below error when running df.show(): Py4JJavaError: An error occurred while calling o77.showString. : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1) (LAPTOP-6C4N7D8I executor driver): org.apache.spark.SparkException: Python worker failed to connect back. Caused by: java.net.SocketTimeoutException: Accept timed out
How do I resolve the externally-managed-environment error while installing Apache Airflow?
Which steps are you following? Please provide more information.
❤
Thanks, this is very helpful!
Thanks 😊 please subscribe for more such videos
Which API did you use? Which account is the free account?
Unfortunately, the API is not working for me as well.
Very informative
Amazing, thanks for your tutorials and help
Thanks for watching
Great tutorial
Thanks Bhai
Nice video sir
Thanks Bhai
Free at last 🤩. After several videos my df.show() always gave an error. Thanks a lot! This video is a lifesaver.
Hi Sumit bro, I have applied this approach to get the dbt packages installed, but I'm still getting a module-not-found error, although the installation went smoothly. Need help.
Ping me on WhatsApp 8147085086
Can you please tell me about the user amit? When did you create it?
Can you please share the video link for "Create SFTP server for S3 with username and password authentication without using a CloudFormation template"?
Can you please share the link: "Create SFTP server for S3 with username and password authentication without a CloudFormation template"?
Hope you got the link
That is very good feature, thanks for sharing Sir.
👍
But the screen is blurred, please check.
Sure... can you change the quality to 2K and check?
Superb👌
Great thanks a lot
Is there any way I can upload a .parquet file into the Postgres database, from my local machine to the Postgres container?
You can use Python code to read the parquet file and prepare a DataFrame, then you can write the DataFrame into the Postgres DB.
@@sumitkumar2955 But what if the parquet file is on my local computer? How can I make a DAG to upload the data to the PostgreSQL DB? By the way, thanks for your response.
@@notanspameratall7293 Where is your Airflow running? If it is running locally, you can use the same Python code to create the DAG.
@@sumitkumar2955 I'm currently running Airflow in Docker Compose, and I'm trying to insert datasets from my local machine into the PostgreSQL database.
@@notanspameratall7293 ua-cam.com/video/qM_jQ7XcJ88/v-deo.htmlsi=e9jOh0kxAQk-fL-r Could you please go through this video? It will be helpful. I am saving a file locally and reading it. So you have to move your file into the container first using a Dockerfile. Once you have the file in your Airflow container, you can easily read it. Please try, and let me know if you have any issue.
Solved my problem! I Appreciate it
Thanks please share and subscribe 🤠
Thank you! It is working. I really appreciate your video and support! 👍
Thanks 🙏
Can I retrieve data from Twitter with a basic developer account? I need data from 01/01/2024 until 31/01/2024. I would be grateful if someone could help me with this issue. I am very new to the Twitter developer platform and do not know how to scrape data from Twitter for 30 days in January. Could you please help me with this issue?
Unfortunately My API key not working
After so many videos this has worked for me
Happy to hear that it helped you. Please share and subscribe 🙏
Is it working in a container, or do I have to specify that?
This is AWS-managed Airflow; you don't have to specify anything.
Thanks for the tutorial. I tried to use your steps to send value_counts via SES, but it wasn't displaying well. The email was delivered, but the result is not well formatted. I was sending df[['col1','col2']].value_counts().to_frame(). Any hints?
I actually solved it. I transposed the result after converting it to a new DataFrame.
Thanks for watching. Happy that it helped you. Please share and subscribe. You can connect with me in case of any issue.
Is there any way to fetch tweets for free?
I am not sure. Some of my viewers have purchased access, and it's still not working for them.
Thank you very much for this!!!!!
Thanks, please subscribe and share
Nice! Can you please show me how to write the requirements.txt file? I need to install Talend and Snowflake.
Hi, I am trying to install the langchain Python package with the Dockerfile, but the import is unsuccessful due to the Python version. How do I change the Python version to 3.10 while keeping the same Airflow version?
You have to mention the Python version in your Dockerfile.
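One way to do this (a sketch; the exact tags are assumptions, so check which Airflow/Python tag combinations are actually published) is to base the image on a Python-version-specific Airflow tag:

```dockerfile
# Airflow publishes per-Python-version tags of the form
# apache/airflow:<airflow-version>-python<python-version>.
# The versions below are hypothetical examples.
FROM apache/airflow:2.8.1-python3.10

COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
```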
bro! nice video!
Thanks 👍
Thanks Bro, you are a saver.
Welcome 🙂 please share and subscribe
good tutorial bro
Thanks bro
Instead of PostgreSQL, can we use MongoDB with Airflow to store DAG-related information?
Postgres is an RDBMS for structured data (SQL), and MongoDB is NoSQL. Any specific reason you want to use NoSQL to store metadata information? I have not used MongoDB to store DAG-related information, so I am not sure.