sumit kumar
How to Set Up Databricks Repo for Git Integration | Step-by-Step Guide
Looking to streamline your development workflow in Databricks? In this tutorial, I’ll show you how to set up Databricks Repo for Git Integration step by step! We’ll cover the complete process from configuring the Git provider to managing your code directly from Databricks. Whether you’re using GitHub, GitLab, or Azure DevOps, this guide will walk you through the essential setup so you can start collaborating with your team seamlessly.
🔹 What you'll learn:
How to connect your Git repository to Databricks
Version control with Databricks Repo
Committing, pushing, and pulling code in Databricks
Tips to manage Databricks notebooks with Git
Views: 131
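
For readers who prefer an API over the UI, a minimal sketch of the same setup via the Databricks Repos REST API (the workspace URL, token, Git URL, and repo path below are placeholders):

    # Sketch: link a Git repository to the workspace through the Repos REST API.
    import requests

    DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
    TOKEN = "<personal-access-token>"                                 # placeholder

    resp = requests.post(
        f"{DATABRICKS_HOST}/api/2.0/repos",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "url": "https://github.com/<org>/<repo>.git",  # your Git repository
            "provider": "gitHub",                          # or gitLab / azureDevOpsServices
            "path": "/Repos/<user>/<repo>",                # where the repo appears in the workspace
        },
    )
    resp.raise_for_status()
    print(resp.json())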

Videos

How to Create a FREE Databricks Account on Azure Cloud | Step-by-Step Guide
Views: 111 · 1 month ago
Want to start exploring Databricks on Azure for free? 🚀 In this video, I’ll show you how to create a Databricks account on Azure Cloud with a free trial, step by step! Perfect for beginners who want to get started with data engineering, machine learning, or big data analytics using Databricks. Make sure to follow along and subscribe for more cloud tutorials! 🌩️ #Databricks #AzureCloud #FreeTria...
How to Enable & Browse DBFS in Azure Databricks
Views: 223 · 1 month ago
Learn how to quickly enable and browse DBFS (Databricks File System) in Azure Databricks! 🌐 Whether you're managing data or just getting started with Databricks, this quick guide will show you how to navigate your storage with ease. 🚀 Perfect for cloud engineers and data enthusiasts looking to simplify their data workflows! #AzureDatabricks #DBFS #TechTutorial
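
Once DBFS is enabled, you can also browse it from a notebook; a small sketch using dbutils (the paths are illustrative):

    # In a Databricks notebook, dbutils is available by default.
    for f in dbutils.fs.ls("dbfs:/"):          # list the DBFS root
        print(f.path, f.size)

    display(dbutils.fs.ls("dbfs:/FileStore"))  # FileStore holds files uploaded via the UI, if present
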
How to Change AWS Lambda Runtime in 30 Seconds! 🚀 #AWS #lambda #shorts
Views: 191 · 1 month ago
Need to update your AWS Lambda function runtime? 🖥️ In this quick tutorial, I’ll show you how to change the runtime of your AWS Lambda function in less than 60 seconds! Perfect for beginners or anyone needing a fast refresh! 🚀 Watch, learn, and boost your AWS skills! #AWS #Lambda #CloudComputing
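
The video uses the console; an equivalent hedged sketch with boto3, assuming a placeholder function name and target runtime:

    # Sketch: switch a Lambda function's runtime without the console.
    import boto3

    lambda_client = boto3.client("lambda")

    lambda_client.update_function_configuration(
        FunctionName="my-function",   # placeholder
        Runtime="python3.12",         # target runtime
    )

    # Confirm the change
    cfg = lambda_client.get_function_configuration(FunctionName="my-function")
    print(cfg["Runtime"])
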
AWS OpenSearch Quick Start Guide from Scratch | Free Tier Tutorial
Views: 180 · 3 months ago
Welcome to our AWS OpenSearch Quick Start Guide! In this video, we'll walk you through setting up and using AWS OpenSearch from scratch, all on the Free Tier. Whether you're a beginner or looking to expand your cloud computing skills, this tutorial covers everything you need to get started with AWS OpenSearch. We'll cover: What is AWS OpenSearch? Setting up an OpenSearch domain Configuring acce...
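
For reference, a hedged boto3 sketch of the domain setup described above (the domain name, engine version, and sizing are assumptions chosen to stay close to the Free Tier):

    # Sketch: create a small OpenSearch domain programmatically.
    import boto3

    client = boto3.client("opensearch")

    client.create_domain(
        DomainName="my-quickstart-domain",                                      # placeholder
        EngineVersion="OpenSearch_2.11",                                        # example version
        ClusterConfig={"InstanceType": "t3.small.search", "InstanceCount": 1},
        EBSOptions={"EBSEnabled": True, "VolumeType": "gp3", "VolumeSize": 10},
    )

    # Domain creation takes several minutes; poll its status.
    status = client.describe_domain(DomainName="my-quickstart-domain")
    print(status["DomainStatus"]["Processing"])
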
Understanding the Terraform Lookup Function: A Complete Guide
Views: 61 · 3 months ago
Welcome to our in-depth guide on the Terraform lookup function! In this video, we'll explore one of Terraform's most powerful functions and show you how to use it effectively in your infrastructure as code projects. What You'll Learn: Introduction to Terraform and its core concepts Detailed explanation of the lookup function and its syntax Practical examples of using the lookup function in rea...
Mastering Terraform Functions: A Complete Guide for Beginners
Views: 51 · 3 months ago
Welcome to our comprehensive guide on Terraform functions! In this video, we'll take you through everything you need to know to get started with Terraform functions, from basic concepts to advanced usage. What You'll Learn: Introduction to Terraform and its core concepts Detailed explanation of Terraform functions and their importance How to use built-in functions for string manipulation, numer...
Master Terraform Map Variables: Key to Efficient Infrastructure Coding!
Views: 49 · 3 months ago
Dive deep into the world of Terraform map variables in this detailed tutorial! Map variables are powerful tools in Terraform that allow you to manage related values with ease. This video is designed for both beginners and experienced users who want to enhance their understanding and utilization of map variables in Terraform. 🔸 What You’ll Learn: Fundamentals of map variables in Terraform How t...
Mastering Terraform Lists: Efficiently Manage Multiple Values!
Views: 37 · 3 months ago
🚀 Welcome to our deep dive into Terraform Lists! In this tutorial, we'll explore how to define, manipulate, and utilize lists in Terraform to streamline your infrastructure configurations. Whether you're a beginner or looking to enhance your skills, this video is your ultimate guide to mastering list variables in Terraform. 🔹 What You'll Learn: How to define list variables in Terraform Practic...
Terraform Variable Validation Tutorial: Ensuring Correct Values in Your Configuration
Views: 31 · 4 months ago
In this video, we will delve into Terraform variable validation, a crucial feature to ensure your infrastructure configuration values are correct. We'll cover: How to define variables with validation in Terraform Adding custom validation conditions to variables Practical examples demonstrating variable validation Tips for writing effective validation conditions Whether you're new to Terraform o...
Terraform Variables Tutorial: Defining Types and Default Values
Views: 21 · 4 months ago
Terraform Variables Tutorial: Defining Types and Default Values
Terraform Variables Tutorial: Mastering Input and Output Variables
Views: 36 · 4 months ago
Welcome to our Terraform tutorial series! In this video, we will dive deep into Terraform variables, focusing on both input and output variables. Understanding how to use variables in Terraform is crucial for writing flexible and reusable infrastructure as code. In this comprehensive guide, you will learn: - What Terraform variables are and why they are important - How to declare and use input ...
Terraform Tutorial: Using Output Variables in a Hello World Example
Views: 35 · 4 months ago
Terraform Tutorial: Using Output Variables in a Hello World Example
Create an AWS S3 Bucket with Terraform: Step-by-Step Tutorial
Views: 269 · 4 months ago
Create an AWS S3 Bucket with Terraform: Step-by-Step Tutorial
Mastering Databricks: How to Pass Parameters in Workflows|Pyspark|Databricks|workflow|python
Views: 981 · 5 months ago
Mastering Databricks: How to Pass Parameters in Workflows|Pyspark|Databricks|workflow|python
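
As a rough sketch of the idea: parameters passed to a notebook task in a Databricks workflow surface as widgets inside the notebook (the parameter names and table are placeholders):

    # Inside the notebook that runs as a workflow task:
    run_date = dbutils.widgets.get("run_date")   # set under the task's "Parameters"
    env = dbutils.widgets.get("env")

    print(f"Running for {run_date} in {env}")

    df = spark.read.table("my_catalog.my_schema.events")   # placeholder table
    display(df.filter(df.event_date == run_date))
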
Create Workflow for Notebooks in Databricks: Step-by-Step Guide |Databricks|pyspark|python|notebook
Views: 161 · 5 months ago
Create Workflow for Notebooks in Databricks: Step-by-Step Guide |Databricks|pyspark|python|notebook
Master Databricks Widgets: Simplifying Your Workflow with dbutils.widgets
Views: 149 · 5 months ago
Master Databricks Widgets: Simplifying Your Workflow with dbutils.widgets
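
A short sketch of the dbutils.widgets calls covered here (widget names, defaults, and labels are arbitrary):

    # Create widgets, read their values, and clean up.
    dbutils.widgets.text("input_path", "/mnt/raw", "Input path")
    dbutils.widgets.dropdown("env", "dev", ["dev", "qa", "prod"], "Environment")

    input_path = dbutils.widgets.get("input_path")
    env = dbutils.widgets.get("env")
    print(input_path, env)

    dbutils.widgets.removeAll()   # remove all widgets when finished
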
How to Schedule AWS Lambda Functions Using EventBridge | Step-by-Step Guide #aws #lambda #python
Views: 193 · 5 months ago
How to Schedule AWS Lambda Functions Using EventBridge | Step-by-Step Guide #aws #lambda #python
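
A hedged boto3 sketch of the same scheduling setup (the rule name, schedule expression, and Lambda ARN are placeholders):

    # Sketch: invoke a Lambda on a schedule via an EventBridge rule.
    import boto3

    events = boto3.client("events")
    lambda_client = boto3.client("lambda")

    rule = events.put_rule(
        Name="run-my-lambda-hourly",
        ScheduleExpression="rate(1 hour)",   # or a cron expression like cron(0 12 * * ? *)
        State="ENABLED",
    )

    events.put_targets(
        Rule="run-my-lambda-hourly",
        Targets=[{"Id": "my-lambda", "Arn": "arn:aws:lambda:us-east-1:123456789012:function:my-function"}],
    )

    # Allow EventBridge to invoke the function.
    lambda_client.add_permission(
        FunctionName="my-function",
        StatementId="allow-eventbridge",
        Action="lambda:InvokeFunction",
        Principal="events.amazonaws.com",
        SourceArn=rule["RuleArn"],
    )
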
How to Trigger a SageMaker Jupyter Notebook File from AWS Lambda using WebSockets
Views: 582 · 5 months ago
How to Trigger a SageMaker Jupyter Notebook File from AWS Lambda using WebSockets
New PySpark Feature: Run SQL Queries Directly on DataFrames Without using Temp Views|Pyspark|sql
Views: 118 · 5 months ago
New PySpark Feature: Run SQL Queries Directly on DataFrames Without using Temp Views|Pyspark|sql
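
A minimal sketch of the feature in question, available in PySpark 3.4+ (the column names and data are illustrative):

    # spark.sql accepts a DataFrame as a named argument - no temp view needed.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    df = spark.createDataFrame([("alice", 34), ("bob", 29)], ["name", "age"])

    # {df} in the query is bound to the DataFrame passed as a keyword argument.
    result = spark.sql("SELECT name, age FROM {df} WHERE age > 30", df=df)
    result.show()
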
How to Call One Databricks Notebook from Another: Step-by-Step Guide
Views: 137 · 5 months ago
How to Call One Databricks Notebook from Another: Step-by-Step Guide
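
A quick sketch of the two common approaches (the child notebook path and arguments are placeholders):

    # 1. %run inlines the other notebook's definitions into the current session:
    # %run ./helpers/common_functions

    # 2. dbutils.notebook.run starts a separate run and returns its exit value.
    result = dbutils.notebook.run(
        "/Workspace/Users/me@example.com/child_notebook",  # placeholder path
        600,                                               # timeout in seconds
        {"run_date": "2024-01-01"},                        # read via dbutils.widgets.get in the child
    )
    print(result)   # whatever the child passes to dbutils.notebook.exit(...)
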
Effortlessly Create Your First Compute Cluster & Databricks Notebook!
Views: 197 · 8 months ago
Effortlessly Create Your First Compute Cluster & Databricks Notebook!
Getting Started with Databricks Community Edition: A Step-by-Step Guide
Views: 262 · 8 months ago
Getting Started with Databricks Community Edition: A Step-by-Step Guide
Mastering Python: =, ==, vs is and Mutable vs Immutable |understanding variables in Python
Views: 80 · 8 months ago
Mastering Python: =, ==, vs is and Mutable vs Immutable | Understanding variables in Python
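
A small worked example of the distinctions discussed in the video:

    # = binds a name, == compares values, is compares object identity.
    a = [1, 2, 3]     # assignment: a refers to a list object
    b = a             # b refers to the SAME list (no copy)
    c = [1, 2, 3]     # a different list with equal contents

    print(a == c)     # True  -> equal values
    print(a is c)     # False -> different objects
    print(a is b)     # True  -> same object

    # Mutable vs immutable: mutating through one name is visible through the other.
    b.append(4)
    print(a)          # [1, 2, 3, 4] -> lists are mutable

    x = "hello"
    y = x
    y = y + "!"       # strings are immutable: this builds a new string
    print(x, y)       # hello hello!
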
Mastering Airflow:Effortlessly Connect to PostgreSQL| Airflow Connection connect to Postgres#airflow
Views: 1.7K · 9 months ago
Mastering Airflow:Effortlessly Connect to PostgreSQL| Airflow Connection connect to Postgres#airflow
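
A minimal DAG sketch that uses such a connection, assuming a connection with ID my_postgres has been created under Admin > Connections (the ID and query are placeholders):

    # Airflow 2.x sketch: query Postgres from a task via PostgresHook.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.postgres.hooks.postgres import PostgresHook

    def fetch_rows():
        hook = PostgresHook(postgres_conn_id="my_postgres")   # placeholder connection ID
        print(hook.get_records("SELECT version();"))

    with DAG(
        dag_id="postgres_connection_demo",
        start_date=datetime(2024, 1, 1),
        schedule=None,        # Airflow 2.4+; use schedule_interval on older versions
        catchup=False,
    ) as dag:
        PythonOperator(task_id="fetch_rows", python_callable=fetch_rows)
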
Python Interview Question | LeetCode Python Tutorial : Fibonacci Number Problem
Views: 72 · 9 months ago
Python Interview Question | LeetCode Python Tutorial : Fibonacci Number Problem
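
A typical iterative solution for reference:

    # Fibonacci (0, 1, 1, 2, 3, 5, ...) in O(n) time and O(1) space.
    def fib(n: int) -> int:
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    print([fib(i) for i in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
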
Python Interview Question: Find the Maximum Time from a Four-Digit Number|python |coding interview
Views: 118 · 10 months ago
Python Interview Question: Find the Maximum Time from a Four-Digit Number|python |coding interview
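
One common brute-force approach for reference (this assumes the usual "largest time from given digits" formulation):

    # Try every ordering of the four digits and keep the latest valid 24-hour time.
    from itertools import permutations

    def largest_time(digits):
        best = ""
        for h1, h2, m1, m2 in permutations(digits):
            hours, minutes = h1 * 10 + h2, m1 * 10 + m2
            if hours < 24 and minutes < 60:
                # zero-padded "HH:MM" strings compare correctly as strings
                best = max(best, f"{hours:02d}:{minutes:02d}")
        return best   # "" if no valid time exists

    print(largest_time([1, 2, 3, 4]))   # 23:41
    print(largest_time([5, 5, 5, 5]))   # (empty string)
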
Python Interview Question: LeetCode Python Tutorial: Two Sum Problem|python|leetcode
Views: 64 · 10 months ago
Python Interview Question: LeetCode Python Tutorial: Two Sum Problem|python|leetcode
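
The standard hash-map solution for reference:

    # Two Sum in one pass: remember each value's index and look up the complement.
    def two_sum(nums, target):
        seen = {}   # value -> index
        for i, num in enumerate(nums):
            complement = target - num
            if complement in seen:
                return [seen[complement], i]
            seen[num] = i
        return []

    print(two_sum([2, 7, 11, 15], 9))   # [0, 1]
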
Python Interview Question:Find the pair with given number in a list|Two sum problem|leetcode|python
Views: 116 · 10 months ago
Python Interview Question:Find the pair with given number in a list|Two sum problem|leetcode|python
Python Interview Question: Reverse Words in a Sentence with Code Examples|String operation|regex
Views: 51 · 10 months ago
Python Interview Question: Reverse Words in a Sentence with Code Examples|String operation|regex
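
A short regex-based sketch of the idea:

    # Split on runs of whitespace, then join the words back in reverse order.
    import re

    def reverse_words(sentence: str) -> str:
        words = re.split(r"\s+", sentence.strip())
        return " ".join(reversed(words))

    print(reverse_words("  the quick   brown fox  "))   # fox brown quick the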

COMMENTS

  • @dark-crawler · 3 days ago

    TnQ

  • @sreenugangadevi9975 · 3 days ago

    tq, it's useful

    • @sumitkumar2955 · 3 days ago

      @@sreenugangadevi9975 Thanks. Please subscribe

  • @chandankumarthakur08 · 6 days ago

    wow, thanks for this :)

  • @syedsimra · 22 days ago

    @sumit kumar - I followed the same steps as you did, but I still get an error while connecting to SFTP using WinSCP. It says "Access Denied". What could be the reason? I created the bucket role, the keys are spelled correctly (case-sensitive), and I entered the correct password in WinSCP. Can you please help here?

    • @sumitkumar2955 · 21 days ago

      @@syedsimra Maybe it's a role/policy issue.

    • @syedsimra · 15 days ago

      @@sumitkumar2955 Actually I fixed the issue. Looks like this video is outdated. The updated published documentation gave me the fix. In the old post the secret name was in the format SFTP/username, but the new post gives the correct updated format for the username: aws/transfer/server-id/username

    • @sumitkumar2955 · 15 days ago

      @@syedsimra Yes, this is a very old video. Maybe I have to create a new video. But thanks for the information. It will help others. 🙏👍
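
For reference, a hedged sketch of the secret naming convention mentioned above, using boto3 (the server ID, username, role ARN, and the JSON keys are assumptions that depend on the identity-provider template actually deployed):

    # Create the Secrets Manager secret under aws/transfer/<server-id>/<username>.
    import json
    import boto3

    secrets = boto3.client("secretsmanager")

    secrets.create_secret(
        Name="aws/transfer/s-1234567890abcdef0/myuser",   # hypothetical server ID and user
        SecretString=json.dumps({
            "Password": "REPLACE_ME",                                  # checked by the custom identity provider
            "Role": "arn:aws:iam::123456789012:role/sftp-s3-access",   # hypothetical role
            "HomeDirectory": "/my-bucket/myuser",
        }),
    )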

  • @fisicateca17 · 25 days ago

    Was the DBFS browser unavailable forever? I can't access this option. 😢

    • @sumitkumar2955 · 24 days ago

      @@fisicateca17 Maybe you don't have the admin role.

  • @NaveedKhan_777 · 26 days ago

    Brother, why is this DBFS setting not showing in my account?

    • @sumitkumar2955 · 25 days ago

      Naveed Bhai, you may not have an admin account.

    • @NaveedKhan_777 · 25 days ago

      @@sumitkumar2955 What should I do?

    • @NaveedKhan_777 · 25 days ago

      What should I do? I'm using Community Edition @@sumitkumar2955

  • @ranjeethrikkala6344 · 1 month ago

    Hi @Sumit Kumar. This is working with notebook instances, but not with SageMaker Studio JupyterLab notebooks. Can you please help?

    • @sumitkumar2955 · 1 month ago

      @@ranjeethrikkala6344 Sure, I will check and let you know.

    • @ranjeethrikkala6344 · 1 month ago

      @@sumitkumar2955 Hi Sumit. Have you found a solution for the SageMaker Studio notebook? Also, the above code is not working when the notebook instance is stopped, which requires us to start it manually; in that case the trigger-based automation isn't served.

  • @arafatabsi6546 · 1 month ago

    Hi, the command docker-compose up -d --no-deps --build airflow-webserver airflow-scheduler is recreating the containers (you can tell from the container IDs), which means that when I want to add a requirement, all my data in the containers will be lost! What is the solution?

  • @YashKumarJain-v5c · 2 months ago

    I followed as per the video, but the DAGs are not visible in the UI. Could you help me out?

    • @sumitkumar2955 · 2 months ago

      @@YashKumarJain-v5c Thanks for watching. Please check for permission/role issues. If it's not working, ping me on WhatsApp: 8147085086

    • @YashKumarJain-v5c · 1 month ago

      @@sumitkumar2955 Thank you for the reply. I solved the issue; I was using an already set up VPC and there was no NAT enabled.

  • @mayankv83 · 2 months ago

    Thanks for the video. I'm receiving an error while testing the Lambda function: "An error occurred (ValidationException) when calling the StartNotebookInstance operation: Status (InService) not in ([Stopped, Failed]). Unable to transition to (Pending) for Notebook Instance". Please help me.

    • @sidhukadi · 1 month ago

      You need to start the notebook instance. Maybe use another Lambda to start it first if you want it fully automated.
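
A hedged sketch of the "start it first" idea from this reply: only call StartNotebookInstance when the instance is actually Stopped or Failed, which also avoids the ValidationException above when it is already InService (the instance name is a placeholder):

    import boto3

    sagemaker = boto3.client("sagemaker")

    def lambda_handler(event, context):
        name = event.get("notebook_name", "my-notebook-instance")   # placeholder default
        status = sagemaker.describe_notebook_instance(
            NotebookInstanceName=name
        )["NotebookInstanceStatus"]

        if status in ("Stopped", "Failed"):
            sagemaker.start_notebook_instance(NotebookInstanceName=name)
            return {"action": "started", "previous_status": status}

        return {"action": "none", "previous_status": status}   # already InService/Pending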

  • @AbhishekRoy-w2f · 2 months ago

    Can we run the Lambda function without opening the terminal? I am only able to run it if I open the terminal; then the notebook run is successful. If I don't open the terminal, the Lambda function still succeeds, but the SageMaker notebook is not run.

    • @sumitkumar2955 · 2 months ago

      "Can you run the Lambda function without opening the terminal" - what does that mean? Could you please tell me which terminal you are opening and how you are running the Lambda?

    • @sidhukadi · 1 month ago

      Hi, I found out the hard way that the terminal needs to be running as well for the code to work, which is pretty expensive. I moved to a lifecycle configuration using nohup and it's working. I use Lambda to start and stop the instance.

  • @rithishkonduri508 · 3 months ago

    Thanks, mate. It helped.

    • @sumitkumar2955 · 3 months ago

      @@rithishkonduri508 Happy to know 😊

  • @shubhampoul1643 · 3 months ago

    Thanks bro 🧡

    • @sumitkumar2955 · 3 months ago

      @@shubhampoul1643 Please subscribe and share with your friends. Thanks 😊👍

  • @Safar-e4o · 3 months ago

    Please sir, add the code sheet in the description.

    • @sumitkumar2955 · 3 months ago

      Sure, but have I done any coding in this video?

  • @aashishpant · 3 months ago

    Does this start the notebook instance by itself, or do we have to run this while the instance is running?

    • @sumitkumar2955 · 3 months ago

      Yes, we have to run this while the instance is running. Thanks.

  • @vaibhavgupta7429 · 3 months ago

    Commendable video, thanks a lot. Would appreciate it if the missing steps could be documented in the doc as well.

    • @sumitkumar2955 · 3 months ago

      Thanks 🙏 I guess I have updated all the steps, but I will check and update the blog below: deltafrog.net/trigger-sagemaker-jupyter-notebook-file-from-aws-lambda/

  • @yejinzai · 4 months ago

    Thank you so much! Proven it works!

  • @iamrahul_29 · 4 months ago

    @SumitKumar Getting the below error when running df.show():
    Py4JJavaError: An error occurred while calling o77.showString. : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1) (LAPTOP-6C4N7D8I executor driver): org.apache.spark.SparkException: Python worker failed to connect back. Caused by: java.net.SocketTimeoutException: Accept timed out

  • @nabeelasyed1034 · 4 months ago

    How do I resolve the error "externally-managed-environment" while installing Apache Airflow?

    • @sumitkumar2955 · 4 months ago

      Which step are you following? Please provide more information.

  • @reneshmlal2809 · 4 months ago

  • @wintercherryblossom4020 · 5 months ago

    Thanks, this is very helpful!

    • @sumitkumar2955 · 5 months ago

      Thanks 😊 Please subscribe for more such videos.

  • @droptimistic7419 · 5 months ago

    Which API did you use? Which account is the free account?

    • @sumitkumar2955 · 5 months ago

      Unfortunately, the API is not working for me as well.

  • @abushama1638 · 5 months ago

    Very informative

  • @notanspameratall7293 · 5 months ago

    Amazing, thanks for your tutorials and help

  • @abushama1638 · 5 months ago

    Great tutorial

  • @TechnoSparkBigData · 5 months ago

    Nice video, sir

  • @egunjobitunde8369 · 5 months ago

    Free at last 🤩. After several videos, my df.show() always gave an error. Thanks a lot! This video is a lifesaver.

  • @AMITDAS-mr8xj · 5 months ago

    Hi Sumit bro, I have applied this approach to get the dbt packages installed, but I'm still getting a module-not-found error, although the installation went smoothly. Need help.

  • @ManishKumar-mj3ko · 5 months ago

    Can you please tell me about the user amit? When did you create that?

  • @ManishKumar-mj3ko · 5 months ago

    Can you please share the video link for "Create SFTP server for S3 with username and password authentication without using a CloudFormation template"?

  • @ManishKumar-mj3ko · 5 months ago

    Can you please share the link: "Create SFTP server for S3 with username and password authentication without a CloudFormation template"?

  • @TechnoSparkBigData · 5 months ago

    That is a very good feature, thanks for sharing, sir.

  • @TheSarfarazahmed · 5 months ago

    But the screen is blurred, please check.

    • @sumitkumar2955 · 5 months ago

      Sure... can you change the quality to 2K and check?

  • @TheSarfarazahmed · 5 months ago

    Superb 👌

  • @abushama1638 · 5 months ago

    Great, thanks a lot

  • @notanspameratall7293 · 5 months ago

    Is there any way I can upload a .parquet file into the Postgres database, from my local machine to the Postgres container?

    • @sumitkumar2955 · 5 months ago

      You can use Python code to read the parquet file and prepare a DataFrame, then you can write the DataFrame into the Postgres DB.

    • @notanspameratall7293 · 5 months ago

      @@sumitkumar2955 But what if the parquet file is on my local computer? How can I make a DAG to upload the data to the PostgreSQL DB? By the way, thanks for your response.

    • @sumitkumar2955 · 5 months ago

      @@notanspameratall7293 Where is your Airflow running? If it is running locally, you can use the same Python code in the DAG you create.

    • @notanspameratall7293 · 5 months ago

      @@sumitkumar2955 I'm currently using Airflow in Docker Compose; I'm trying to insert datasets from my local machine into the PostgreSQL database.

    • @sumitkumar2955 · 5 months ago

      @@notanspameratall7293 Could you please go through this video: ua-cam.com/video/qM_jQ7XcJ88/v-deo.htmlsi=e9jOh0kxAQk-fL-r It will be helpful. I am saving the file locally and reading it, so you have to move your file into the container first using a Dockerfile. Once you have the file in your Airflow container, you can easily read it. Please try, and let me know if you have any issues.
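
A hedged sketch of the approach described in this thread - read the parquet file into a DataFrame and write it to Postgres (the connection string, file path, and table name are placeholders; pandas also needs pyarrow and a Postgres driver such as psycopg2 installed):

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql+psycopg2://airflow:airflow@localhost:5432/mydb")  # placeholder DSN

    df = pd.read_parquet("/opt/airflow/data/my_file.parquet")   # path inside the Airflow container
    df.to_sql("my_table", engine, if_exists="append", index=False)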

  • @jeffersonsilvadiniz7575 · 5 months ago

    Solved my problem! I appreciate it.

    • @sumitkumar2955 · 5 months ago

      Thanks, please share and subscribe 🤠

  • @narcis.nedelut · 6 months ago

    Thank you! It is working. I really appreciate your video and support! 👍

  • @parisaemkani5730 · 6 months ago

    Can I retrieve data from Twitter with a basic developer account? I need data from 01/01/2024 until 31/01/2024. I am very new to the Twitter developer platform and do not know how to scrape data from Twitter for the 30 days of January. I would be grateful if someone could help me with this issue.

    • @sumitkumar2955 · 6 months ago

      Unfortunately, my API key is not working.

  • @bommanasravan8279 · 6 months ago

    After so many videos, this one has worked for me.

    • @sumitkumar2955 · 6 months ago

      Happy to hear that it helped you. Please share and subscribe 🙏

  • @SouhaCherif-vp2vw · 6 months ago

    Is it working in a container, or do I have to specify that?

    • @sumitkumar2955 · 6 months ago

      This is AWS-managed Airflow; you don't have to specify anything.

  • @joshisaiah2054 · 7 months ago

    Thanks for the tutorial. I tried to use your steps to send value_counts via SES, but it wasn't displaying well. The email was delivered but the result is not well formatted. I was sending df[['col1','col2']].value_counts().to_frame(). Any hints?

    • @joshisaiah2054 · 7 months ago

      I actually solved it. I transposed the result after converting it to a new DataFrame.

    • @sumitkumar2955 · 7 months ago

      Thanks for watching. Happy that it helped you. Please share and subscribe. You can connect with me in case of any issue.

  • @HIMANSHUMISHRA-yf5bm · 7 months ago

    Is there any way to fetch tweets for free?

    • @sumitkumar2955 · 7 months ago

      I am not sure. Some of my viewers have purchased it, and even then it's not working for them.

  • @ghassanebentahar5695 · 7 months ago

    Thank you very much for this!!!!!

    • @sumitkumar2955 · 7 months ago

      Thanks, please subscribe and share

  • @Shakeel6429 · 7 months ago

    Nice, can you please show me how to write the requirements.txt file? I need to install Talend and Snowflake.

  • @praveena1752 · 7 months ago

    Hi, I am trying to install the langchain Python package with the Dockerfile, but the import is unsuccessful due to the Python version. How do I change the Python version to 3.10 with the same Airflow version?

    • @sumitkumar2955 · 7 months ago

      You have to mention the Python version in your Dockerfile.

  • @ghazziwang3147 · 7 months ago

    Bro! Nice video!

  • @FaltuKaam-vq7ko · 7 months ago

    Thanks bro, you are a saver.

    • @sumitkumar2955 · 7 months ago

      Welcome 🙂 Please share and subscribe

  • @alongsandusit8303 · 7 months ago

    Good tutorial, bro

  • @gouseashwakpattekha4450 · 7 months ago

    Instead of PostgreSQL, can we use MongoDB with Airflow to store DAG-related information?

    • @sumitkumar2955 · 7 months ago

      Postgres is a relational (SQL) database and MongoDB is NoSQL. Any specific reason you want to use NoSQL to store metadata? I have not used MongoDB to store DAG-related information, so I am not sure.