Pytalista
Deploy Terraform with GitHub Actions [Terraform Cloud API]
In this video I go step by step through how to deploy a Databricks cluster with Terraform using GitHub Actions and the Terraform Cloud API.
Code: github.com/pedrojunqueira/terraform-cloud-databricks
Tutorial Inspiration: developer.hashicorp.com/terraform/tutorials/automation/github-actions
Databricks Terraform Provider: registry.terraform.io/providers/databricks/databricks/latest/docs/resources/cluster
Views: 29

Videos

Deploy Databricks Workspace with Terraform
172 views · 1 month ago
How to deploy an Azure Databricks Workspace with Terraform. GitHub code: github.com/pedrojunqueira/PytalistaYT/tree/master/Python/terraform_adb_ws
Deploy a Unity Catalog Cluster in Azure Databricks using Terraform
121 views · 1 month ago
Steps on how to deploy a cluster in Azure Databricks using Terraform. GitHub: github.com/pedrojunqueira/PytalistaYT/tree/master/Python/terraform-databricks-cluster Terraform Databricks Provider: registry.terraform.io/providers/databricks/databricks/latest/docs
Deploy Resources in Azure with Terraform
150 views · 1 month ago
In this video I go step by step through how to set up your environment and service principal, then deploy a resource group and a storage account to Azure using Terraform. Terraform documentation: registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/storage_account Install Terraform CLI on Linux: developer.hashicorp.com/terraform/tutorials/aws-get-started/install-cli Install Azure...
Deploy Databricks Asset Bundles using GitHub Actions [DevOps]
604 views · 1 month ago
In this video I go step by step through how to deploy Databricks Asset Bundles using GitHub Actions. You will need to set up a service principal and generate a Personal Access Token while authenticated machine-to-machine (M2M) in your Databricks CLI. GitHub repository with code and README: github.com/pedrojunqueira/dab-cicd Databricks CLI M2M authentication and service principal set up in ...
Build Python Packages in a Databricks Asset Bundle
459 views · 2 months ago
In this tutorial I describe, step by step, how to create a Python package from scratch, then configure, build, and deploy it in a Databricks Asset Bundle.
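
As a rough sketch of the kind of package definition such a tutorial builds on (the package name, version, and src/ layout here are illustrative assumptions, not the video's exact code):

    # setup.py at the package root
    import setuptools

    setuptools.setup(
        name="my_package",        # hypothetical package name
        version="0.0.1",
        packages=setuptools.find_packages(where="src"),
        package_dir={"": "src"},  # assumes the src/ layout
    )

Building it (e.g. with python -m build) produces the wheel that the bundle's artifacts section can then reference.
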
How to create and deploy Azure Function Using VS Code NO MUSIC
476 views · 2 months ago
This is the same video as How to create and deploy Azure Function Using VS Code [Python], but without music. I will teach you how to build an Azure Function step by step, from setting up the environment to testing and deploying. Enjoy. Links to resources: How to set up WSL: learn.microsoft.com/en-us/windows/wsl/install How to set up Oh My Posh on WSL: ua-cam.com/video/UiHxGjIrq1c/v-deo.html How to install pyenv i...
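
For orientation, a minimal function in the Azure Functions Python v2 programming model looks roughly like this (the route name is a placeholder, not necessarily the one in the video):

    import azure.functions as func

    app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

    @app.route(route="hello")  # served at http://localhost:7071/api/hello by func start
    def hello(req: func.HttpRequest) -> func.HttpResponse:
        name = req.params.get("name", "world")
        return func.HttpResponse(f"Hello, {name}!")
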
How to deploy Databricks Asset Bundle Projects
1.2K views · 2 months ago
In this video I go step by step through how to deploy a Databricks Asset Bundle from the Databricks CLI. Install Databricks CLI: docs.databricks.com/en/dev-tools/cli/install.html This video's code: github.com/pedrojunqueira/PytalistaYT/tree/master/Python/databricks-asset-bundle YAML: aws.amazon.com/compare/the-difference-between-yaml-and-json/#:~:text=Both represent data as key,language to support developer u...
Deploy Storage Account with Bicep and GitHub Actions CI/CD [VS code]
113 views · 2 months ago
Demo of how to define and deploy a Bicep file using parameters, variables, and a module, with GitHub Actions and VS Code on Linux; Azure deployment using the CLI. GitHub with code: github.com/pedrojunqueira/bicep-cicd-demo Demo steps: github.com/pedrojunqueira/bicep-cicd-demo/blob/master/STEPS.md Install CLI: learn.microsoft.com/en-us/cli/azure/install-azure-cli Azure Resource Template documentation: le...
How to build your ChatGPT clone with Django and HTMX
213 views · 3 months ago
This is a copy-and-paste quick tutorial on how to code a Django application that is a "clone" of ChatGPT. GitHub code: github.com/pedrojunqueira/PytalistaYT/blob/master/Python/django_chat_gpt/STEP_BY_STEP.md Azure OpenAI docs: learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart?tabs=command-line,python-new&pivots=programming-language-python
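
The core Azure OpenAI call behind such a clone looks roughly like this with the current openai package (endpoint, key, deployment name, and API version are placeholders):

    from openai import AzureOpenAI  # pip install openai

    client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
        api_key="<your-key>",                                        # placeholder
        api_version="2024-02-01",                                    # assumed version
    )
    response = client.chat.completions.create(
        model="<your-deployment-name>",  # the Azure deployment name, not the model family
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)
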
Demo how to CDC with Debezium into Kafka in the cloud
270 views · 3 months ago
In this video I do a step-by-step demo of how to do CDC with the Debezium connector into Confluent Kafka, then load it into a Spark Streaming DataFrame in Databricks. GitHub notebook: github.com/pedrojunqueira/PytalistaYT/blob/master/Python/postgresql_debezium_kafka_databricks/cdc_debezium-postgresql.ipynb Debezium documentation: debezium.io/documentation/reference/stable/connectors/postgresql.html#p...
How to read Kafka Topic with Databricks [Confluent]
434 views · 3 months ago
In this video I will show you how Databricks can connect to a Kafka cluster in Confluent and read a topic, streaming it in real time into a PySpark DataFrame (Structured Streaming). GitHub notebook: github.com/pedrojunqueira/PytalistaYT/blob/master/Python/confluent_kafka_databricks/stream-kafka_cluster.ipynb Article I copied the code from: www.confluent.io/blog/consume-avro-data-from-kafka-topics-and-secured...
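
The Confluent read boils down to something like this in a Databricks notebook (a sketch; broker, topic, and credentials are placeholders, and spark is the notebook's session):

    df = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "<broker>.confluent.cloud:9092")  # placeholder
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option("kafka.sasl.jaas.config",
                'org.apache.kafka.common.security.plain.PlainLoginModule required '
                'username="<api-key>" password="<api-secret>";')             # placeholders
        .option("subscribe", "my-topic")                                     # placeholder topic
        .option("startingOffsets", "earliest")
        .load()
    )
    # Kafka delivers bytes; cast before parsing
    display(df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)"))
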
How to run multiple notebooks in a thread in Databricks [Python]
346 views · 3 months ago
In this video I am going to demonstrate how to run multiple notebooks in Python threads by passing parameters to a notebook. GitHub code for the notebook: github.com/pedrojunqueira/PytalistaYT/blob/master/Python/databricks_notebook_thread/example.ipynb
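
The pattern is essentially this (a sketch; the notebook path and parameter name are placeholders, and dbutils is only available inside Databricks):

    from concurrent.futures import ThreadPoolExecutor

    def run_notebook(param):
        # dbutils.notebook.run(path, timeout_seconds, arguments)
        return dbutils.notebook.run("./example", 600, {"my_param": param})

    params = ["a", "b", "c"]  # one notebook run per parameter value
    with ThreadPoolExecutor(max_workers=3) as pool:
        results = list(pool.map(run_notebook, params))
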
How to send Python logs to Application Insights (Azure Monitor)
1.3K views · 3 months ago
In this video I am going to show how to set up Azure Monitor (Application Insights) to receive logs from your Python program/application. Azure documentation: learn.microsoft.com/en-us/previous-versions/azure/azure-monitor/app/opencensus-python
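
With the opencensus route from the linked docs, the setup is roughly this (the connection string is a placeholder; note Microsoft now steers new projects to the Azure Monitor OpenTelemetry distro instead):

    import logging
    from opencensus.ext.azure.log_exporter import AzureLogHandler  # pip install opencensus-ext-azure

    logger = logging.getLogger(__name__)
    logger.setLevel(logging.INFO)
    logger.addHandler(AzureLogHandler(
        connection_string="InstrumentationKey=<key>"))  # placeholder
    logger.info("Hello from Python")  # appears in Application Insights traces
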
Using Change Data Feed and Structured Streaming in Fabric [PySpark]
211 views · 3 months ago
In this video I demonstrate how to propagate changes from one table to another using Change Data Feed, Structured Streaming, and the foreachBatch method. GitHub notebook: github.com/pedrojunqueira/PytalistaYT/blob/master/Python/fabric_cdf_streaming/cdf_example_fabric.ipynb Delta table documentation: docs.delta.io/2.4.0/delta-change-data-feed.html#language-python Spark Structured Streaming document...
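
The core of such a notebook looks roughly like this (a sketch; table names, key column, and paths are placeholders, the source table needs delta.enableChangeDataFeed = true, and it assumes at most one change per key per batch):

    from delta.tables import DeltaTable

    def upsert_batch(batch_df, batch_id):
        # drop pre-images and CDF metadata so each key merges its post-change row
        changes = (batch_df
            .filter("_change_type != 'update_preimage'")
            .drop("_change_type", "_commit_version", "_commit_timestamp"))
        target = DeltaTable.forName(spark, "target_table")   # placeholder
        (target.alias("t")
            .merge(changes.alias("s"), "t.id = s.id")        # assumed key column
            .whenMatchedUpdateAll()
            .whenNotMatchedInsertAll()
            .execute())

    (spark.readStream.format("delta")
        .option("readChangeFeed", "true")
        .table("source_table")                               # placeholder
        .writeStream
        .option("checkpointLocation", "/tmp/cdf_ckpt")       # placeholder path
        .foreachBatch(upsert_batch)
        .start())
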
How to do Slow Changing Dimension in Delta Tables [Python]
173 views · 4 months ago
Handling secrets in Fabric Notebooks - Azure Key Vault
909 views · 4 months ago
How to run dbt in Microsoft Fabric
1.1K views · 5 months ago
Doing XMLA endpoint in Fabric with a F2 Capacity
232 views · 5 months ago
Work with Azure Data Lake as though it was your local File System [Python adlfs]
292 views · 5 months ago
How to upload fake data to azure storage
151 views · 6 months ago
Pyspark for SQL developers the most common code examples [Python]
164 views · 8 months ago
Databricks Autoloader and Change Data Feed Demo Pipeline [PySpark]
1.8K views · 8 months ago
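
For reference, the Autoloader side of such a pipeline is typically just this (a sketch; paths, format, and table name are placeholders):

    (spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")                 # placeholder source format
        .option("cloudFiles.schemaLocation", "/tmp/schema")  # placeholder path
        .load("/mnt/landing")                                # placeholder input path
        .writeStream
        .option("checkpointLocation", "/tmp/checkpoint")     # placeholder path
        .trigger(availableNow=True)                          # process what's there, then stop
        .toTable("bronze_table"))                            # placeholder table
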
How to do logging in Databricks [Python]
1.2K views · 9 months ago
How to use Azurite while developing your Azure Function locally [VS code]
4.5K views · 9 months ago
Running Spark from VS Code using Databricks Cluster [Python]
1.3K views · 9 months ago
How to add logging to your Python Package? [Python]
214 views · 10 months ago
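
The standard library-side idiom for this is small (a sketch of the usual convention, not necessarily the video's exact code):

    # my_package/__init__.py
    import logging
    logging.getLogger(__name__).addHandler(logging.NullHandler())  # stay silent by default

    # my_package/module.py
    import logging
    logger = logging.getLogger(__name__)  # child of the package logger

    def do_work():
        logger.debug("doing work")  # the consuming application decides where this goes
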
Consume Azure Queues with Azure Function Queue Trigger [Python] V2.
3.9K views · 10 months ago
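
A minimal v2 queue trigger of this kind looks like this (queue name and connection-setting name are placeholders):

    import logging
    import azure.functions as func

    app = func.FunctionApp()

    @app.queue_trigger(arg_name="msg",
                       queue_name="myqueue",                # placeholder queue
                       connection="QueueConnectionString")  # app-setting name holding the connection string
    def on_message(msg: func.QueueMessage) -> None:
        logging.info("Queue message: %s", msg.get_body().decode())
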
Azure Function Blob Trigger [Python] V2
11K views · 10 months ago
Testing GitHub Actions locally with Act [Linux WSL]
656 views · 10 months ago

COMMENTS

  • @teo11300
    @teo11300 6 days ago

    what does the name "this" mean?

    • @pytalista
      @pytalista 6 days ago

      It is just a label for the object. You can call it anything you want. Usually, if you only have one instance of the object in the module, you just call it this.

  • @nathanderhake839
    @nathanderhake839 6 days ago

    Thank you so much for uploading a version without music. This was very helpful for me.

    • @pytalista
      @pytalista 6 days ago

      Thanks 🙏🏻. Please subscribe and smash 💥 the like button 😃

  • @mateuszwojcik8512
    @mateuszwojcik8512 7 days ago

    It is a very great video. I was looking for a long time for a video like that. Thank you! My code works perfectly when I run it in Visual Studio, but it is not triggered on Azure. Is there anything special I need to do? The code is deployed on Azure, I can see the updated code there, and the connection strings in the environment variables are correct. When I run the code in Azure I get a 202 response, but I get no logs and there are no invocations on Azure.

    • @mateuszwojcik8512
      @mateuszwojcik8512 7 days ago

      There is no question :). I have figured it out. My connection was connection="BlobStorageConnectionString" and I did not add it to the environment variables. Stupid error.

    • @pytalista
      @pytalista 3 days ago

      Awesome. Great it worked.

    • @pytalista
      @pytalista 3 days ago

      Great to hear you sorted it out below. Cheers
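
For anyone hitting the same thing, a minimal sketch of the v2 blob trigger being debugged here: the connection argument must exactly match an app-setting/environment-variable name (the setting name below is the one from the comment above; the container path is a placeholder):

    import logging
    import azure.functions as func

    app = func.FunctionApp()

    @app.blob_trigger(arg_name="myblob",
                      path="mycontainer/{name}",                 # placeholder container
                      connection="BlobStorageConnectionString")  # must exist as an app setting
    def on_blob(myblob: func.InputStream) -> None:
        logging.info("Blob %s, %s bytes", myblob.name, myblob.length)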

  • @deepakpatil5059
    @deepakpatil5059 9 days ago

    The command "databricks bundle init" is failing with exit code 1. I have installed Databricks CLI version 0.223.0 and authentication has been done properly. Can you please suggest what the issue could be?

    • @pytalista
      @pytalista 9 days ago

      Have you run databricks auth profiles? Do you get a green tick ✅?

    • @deepakpatil5059
      @deepakpatil5059 8 days ago

      @@pytalista Yes. I used debug and found the issue: I ran databricks --debug bundle init, and it threw an error telling me to choose between the default profile and the newly created one by providing the --profile flag. Then I used databricks --debug bundle init --profile "new profile".

    • @pytalista
      @pytalista 8 days ago

      You mean it is working now?

    • @deepakpatil5059
      @deepakpatil5059 8 days ago

      @@pytalista Yes

    • @pytalista
      @pytalista 7 days ago

      🙌🏻👏🏻

  • @danyalhabib286
    @danyalhabib286 10 days ago

    Most helpful video

  • @HimenSuthar
    @HimenSuthar 14 days ago

    This is very helpful. Thank you very much. Have you extended this exercise to create partition-based Parquet files inside the table folder?

    • @pytalista
      @pytalista 14 days ago

      Hi, I do not intend to go into this topic. This is a way to bring data to the lake from transactional systems; from there I would ingest it into Delta tables.

  • @abhishekprakash4793
    @abhishekprakash4793 15 days ago

    Thanks for the video... this is an extremely useful video. Can you please include a video on deploying PySpark code on-prem or in the cloud?

    • @pytalista
      @pytalista 15 days ago

      Thanks 🙏🏻. I mostly use Spark in the cloud, where it is already implemented, so you can focus on running code.

  • @usmanrahat2913
    @usmanrahat2913 15 days ago

    Very useful video, particularly around the Databricks REST API. Thanks

  • @hesatrap7739
    @hesatrap7739 16 days ago

    Thanks for the video. Would Delta Live Tables, with a streaming table, be better for this?

    • @pytalista
      @pytalista 14 days ago

      Hi, good point. Delta Live Tables certainly simplify the code and the developer experience in a more declarative way. This is a decision engineers need to make: DLT cost is a bit higher, and some prefer to code more declaratively. I would say the simpler and more "cookie cutter" the transformation, the more I would go for DLT; otherwise it is better to use CDF.

  • @Traveling_with_Tyler
    @Traveling_with_Tyler 23 days ago

    Thank you for the content. However, databricks bundle init does not work. Error: No such command 'bundle'. Do you know why?

    • @pytalista
      @pytalista 22 days ago

      Thanks 🙏🏻. Make sure the Databricks CLI version is 0.218.0 or higher. docs.databricks.com/en/dev-tools/bundles/index.html

  • @zahraelhaddi6980
    @zahraelhaddi6980 24 days ago

    If the function doesn't work and keeps getting nothing, the solution might be to go to your Function App on Azure, open environment variables, and add the connection name as a key and the connection string of your storage account as a value. Then it will work.

  • @kevinhertwig6104
    @kevinhertwig6104 29 days ago

    Great video. I have a question though. You said once you start using DAB, everything should be done without the UI. What if I need to write a notebook? Where do I do this? When writing it in VS Code I think I always need to submit it as a job to Databricks for testing, and this takes so much time. In the UI it is very fast and easy to execute cell by cell.

    • @pytalista
      @pytalista 28 days ago

      Yes, in this case, for speed, you can still develop using the UI and, when done, export the notebook and deploy it in a DAB. You can also run the notebook in VS Code by attaching a cluster there. But the UI still offers a better experience while developing.

  • @beingalien6394
    @beingalien6394 1 month ago

    Oh, I literally check my Teams each time it beeps on yours 🤣

  • @yaminiyamini2228
    @yaminiyamini2228 1 month ago

    I have tried this approach, but I am facing an unauthorized error. I have all the required access/roles on Key Vault, but I am still unable to figure out the issue.

    • @pytalista
      @pytalista 1 month ago

      Has the same user in Fabric been granted the permission in Key Vault?

  • @SergioPolimante
    @SergioPolimante 1 month ago

    Very good, straight to the point. You could make a video on running pytest on Databricks. Do you think that is a valid approach, since many tests depend on reading data from the catalog?

    • @pytalista
      @pytalista 1 month ago

      Hi, I think it is a valid approach. Testing in data engineering is a bit different. I would separate the tests into 2 categories: unit tests, which test the functions used in your code (use pytest for that), and tests of the data, where you can use Delta Live Tables expectations or libraries like Great Expectations. This is because in data engineering the data is stateful.

    • @SergioPolimante
      @SergioPolimante 1 month ago

      @@pytalista Good point. So basically, you're suggesting that queries are tested by the data quality checks you apply to the result, and not by testing the query itself using mock data and expected DataFrames?

    • @pytalista
      @pytalista 1 month ago

      @SergioPolimante With data quality you can test things like not null, uniqueness, data ranges, valid keys, etc.; that is the part where your data is stateful. Testing the logic of your query with mock data is also a valid approach, but it may be a lower return on time investment than quality expectation checks. Then there are the unit tests of any functions you may have in your pipeline; for those you would do classic unit tests.
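
To make the unit-test category concrete, a minimal pytest sketch for a pipeline function (the function and file names are hypothetical, not from the video):

    # transform.py
    def add_greeting(rows):
        """Pure function extracted from the pipeline so it is testable."""
        return [{**r, "greeting": f"Hello {r['name']}"} for r in rows]

    # test_transform.py
    from transform import add_greeting

    def test_add_greeting():
        out = add_greeting([{"name": "Ada"}])
        assert out[0]["greeting"] == "Hello Ada"

Data-quality checks on live tables (nulls, uniqueness, ranges) would then live in DLT expectations or Great Expectations rather than in pytest.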

  • @GururajE-ge1sw
    @GururajE-ge1sw 1 month ago

    Thank you for the nice tutorial brother

  • @Omar-zq4wx
    @Omar-zq4wx 1 month ago

    Hi, in the palette I do not see the option for a durable function, just for a trigger function. Any advice?

    • @pytalista
      @pytalista 1 month ago

      Follow this and it will make sense: learn.microsoft.com/en-us/azure/azure-functions/durable/quickstart-python-vscode?tabs=linux%2Cazure-cli-set-indexing-flag&pivots=python-mode-decorators

    • @carlosr204
      @carlosr204 1 month ago

      I ran into the same issue; I just did it via the command line. Use 'func templates list' to get the list of templates you want to use.

    • @pytalista
      @pytalista 1 month ago

      @carlosr204 Good one, thanks 🙏🏻

  • @abhishekprakash4793
    @abhishekprakash4793 1 month ago

    Thanks. I hope you bring more such content involving PySpark and Databricks.

    • @pytalista
      @pytalista 1 month ago

      ua-cam.com/video/lYYIFRaY8Tk/v-deo.htmlsi=RiYQqRffGBTVkYZR

  • @diyakhajuria2478
    @diyakhajuria2478 1 month ago

    Is there any specific reason why the blob size comes out as None every time? I have a use case that extends the blob trigger to call a search service on newly added/modified files, and I'm getting a blob-not-found 404 error 😑 despite the blob being there in my container. Thank you for the video though; it confirmed I'm not doing something astronomically wrong 😂!

    • @pytalista
      @pytalista 1 month ago

      No worries, thanks. Hard to tell without more context. 404 is a not-found error. If you follow the tutorial it should work exactly the same.

    • @diyakhajuria2478
      @diyakhajuria2478 1 month ago

      @@pytalista Oh yes, I had to implement custom logic for my use case, so the code was quite different. The storage account I was trying to access did not allow anonymous access, and an incorrect request URL was being generated. All in all, the blob was not found. I was able to debug it. Your video set a great precedent; thank you for replying.

    • @pytalista
      @pytalista 1 month ago

      @diyakhajuria2478 Great work. Happy it helped.

  • @jasonluna5893
    @jasonluna5893 1 month ago

    Great video! I was stuck on how to get the path variable and you demonstrated it perfectly. The documentation still doesn't have it.

  • @ameliemedem1918
    @ameliemedem1918 1 month ago

    Awesome video! Great explanations! Thanks a lot

  • @rockypunk91
    @rockypunk91 1 month ago

    Can we still use it, even if Microsoft is retiring it on September 30, 2024?

    • @pytalista
      @pytalista 1 month ago

      You can still use it; only classic will be retired. Use a workspace-based one: learn.microsoft.com/en-gb/azure/azure-monitor/app/create-workspace-resource?tabs=bicep

  • @harshitgupta3706
    @harshitgupta3706 1 month ago

    Great demo, thank you. One question: can we use a service principal instead of a PAT token for Databricks authentication?

    • @pytalista
      @pytalista 1 month ago

      Generate a PAT for the service principal. You need to do it via the Databricks CLI; do an M2M authentication for that.

  • @susanODilla
    @susanODilla 1 month ago

    Nice tutorial! I am working on an Ubuntu machine and wanted to ask whether you installed Microsoft ODBC 18 for Ubuntu (as in the Microsoft guide)? Another question: I saw in a different video that they used data stored in a Lakehouse but were creating a Warehouse (cross-querying across Lakehouses/Warehouses from a single SQL endpoint). I thought that was not possible (dbt docs: the adapter currently only supports connecting to a warehouse and not a lakehouse endpoint).

    • @pytalista
      @pytalista 1 month ago

      dbt only supports connecting to a warehouse, but your warehouse can do cross-queries. This is possible. What you cannot do is query the lakehouse directly from dbt.

    • @susanODilla
      @susanODilla 1 month ago

      @@pytalista You mean it is possible to create a "warehouse" from your lakehouse and then target another real warehouse? Is this what cross-querying across Lakehouses/Warehouses from a single SQL endpoint means? Thanks for this feedback!

    • @pytalista
      @pytalista 1 month ago

      @@susanODilla No. I said you can query a lakehouse from a warehouse. This is what cross-query means.

  • @manavkumar1071
    @manavkumar1071 1 month ago

    When pressing F5 I get an error in launch.json with exit code 1: The terminal process "/bin/bash '-c', '. . venv/bin/activate && func host start'" terminated with exit code 1.

    • @pytalista
      @pytalista 1 month ago

      Make sure you have all the prerequisites covered: all the installations, and that your environment is set up correctly.

    • @manavkumar1071
      @manavkumar1071 1 month ago

      @@pytalista I am working in a Codespace. Is that making a difference?

    • @pytalista
      @pytalista 1 month ago

      @manavkumar1071 I don't know what Codespace is. It probably is. Try to configure it on your local computer.

  • @Ajmal_Yazdani
    @Ajmal_Yazdani 1 month ago

    Excellent info, thanks for sharing. One question: how do you run VS Code in WSL, and how did you make your bash prompt so pretty? :)

    • @pytalista
      @pytalista 1 month ago

      Hi Ajmal, just install WSL, then in the terminal type code . to open the VS Code that is already installed on Windows. To make your terminal pretty, watch this video: ua-cam.com/video/ehM3vgPcTc8/v-deo.html

  • @Omar-zq4wx
    @Omar-zq4wx 1 month ago

    But how should the connection string look if I want to run it locally? In settings I put QueueConnectionString="UseDevelopmentStorage=true". I run the function app locally and it works. Then, to send a message, I used your second script (but with my connection string), but it says ValueError: Connection string missing required connection details.

    • @pytalista
      @pytalista 1 month ago

      Hard to say without context, but I would watch the video and follow it exactly. It looks like you are not inserting your connection string in the code. Read this question; it may point you in the right direction: stackoverflow.com/questions/2338650/connection-string-to-an-azure-cloud-storage-account

    • @Omar-zq4wx
      @Omar-zq4wx 1 month ago

      @@pytalista I'm trying to test the queue trigger function app locally, so I do not want to use a queue storage in the Azure portal but one for development. For example, with the Event Grid function I have "AzureWebStorage"="UseDevelopmentStorage=true" in the settings. I'm asking if there is anything similar for the connection string.

    • @Omar-zq4wx
      @Omar-zq4wx 1 month ago

      @@pytalista OK, I solved it. If you want to use Azurite and run it locally, the connection string on the function app should be "UseDevelopmentStorage=true". Then, when you want to send a message, the connection string for the queue storage only is DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;
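
With that well-known Azurite development connection string, sending a test message from Python looks like this (a sketch; the queue name is a placeholder):

    from azure.storage.queue import QueueClient  # pip install azure-storage-queue

    conn_str = (
        "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;"
        "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
        "QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;"
    )  # the well-known Azurite development credentials
    queue = QueueClient.from_connection_string(conn_str, queue_name="myqueue")  # placeholder name
    queue.create_queue()  # raises ResourceExistsError if it already exists
    queue.send_message("hello from Azurite")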

  • @srikrishnabhargavgollapudi5340
    @srikrishnabhargavgollapudi5340 1 month ago

    I have encountered an error: "No job function found". Can I get some help in this regard, please?

    • @pytalista
      @pytalista 1 month ago

      Make sure you follow all the prerequisites: installations, etc.

  • @arunkrishna1036
    @arunkrishna1036 1 month ago

    Thank you so much!!! It worked for me.

  • @EmilianoEmanuelSosa
    @EmilianoEmanuelSosa 1 month ago

    Thankkkks, you helped me a lot in my first job as a data engineer! Greetings from Mendoza, Arg

  • @pranathivinnakota7046
    @pranathivinnakota7046 1 month ago

    Thank you! It helped me so much.

  • @harshitgupta3706
    @harshitgupta3706 1 month ago

    Great tutorial! You show the exact process used in real companies, which is super helpful. Thanks for the effort! Looking forward to more such tutorials.

    • @pytalista
      @pytalista 1 month ago

      Glad it was helpful! 😀

  • @harshitgupta3706
    @harshitgupta3706 2 months ago

    Truly amazing!! This is what I was looking for. Please make a video on DAB deployment through an Azure DevOps CI/CD pipeline with a YAML file. Looking forward to it.

  • @harshitgupta3706
    @harshitgupta3706 2 months ago

    Really amazing! Great demonstration. Can you please create a video on deploying an MLOps project using DAB?

    • @pytalista
      @pytalista 2 months ago

      I'm not a data scientist 😀. But I am working on one about building a Python wheel and another on GitHub Actions CI/CD.

  • @Vatsav04
    @Vatsav04 2 months ago

    Hey, I am getting an error when I try to run the dbt commands. Could you please help me through this?

    • @pytalista
      @pytalista 2 months ago

      What is the error?

    • @Vatsav04
      @Vatsav04 2 months ago

      @@pytalista When I try to activate the virtual environment using source, it says source is not a cmd command.

    • @pytalista
      @pytalista 2 months ago

      @Vatsav04 In my video I am using Linux. It looks like you are on a Windows machine; it is a bit different: myenv\Scripts\activate

  • @rajaspawaskar1644
    @rajaspawaskar1644 2 months ago

    I installed a third-party library and was able to test locally, where it worked successfully. When I deployed to Azure, I received the API URL, but when testing this URL I get a '500 internal server error'. I also tried adding the libraries to requirements.txt using pip freeze, and tried both the v1 and v2 models, but I am still facing the same error. Could you please guide me on where I could have gone wrong or missed something?

    • @pytalista
      @pytalista 2 months ago

      Hard to tell without seeing the error. One possibility is that you are using environment variables; you need to set them in your cloud environment for it to work.

  • @szymondybczak7336
    @szymondybczak7336 2 months ago

    Really cool video. It gave me an overall understanding of DABs. Thanks!

    • @pytalista
      @pytalista 2 months ago

      Glad you liked it 👍🏻

  • @rudrasingh2850
    @rudrasingh2850 2 months ago

    Hey, it's a really great video. Can we achieve a blob trigger using a service principal? If data gets loaded into blob storage, our blob trigger will run.

    • @pytalista
      @pytalista 2 months ago

      A blob trigger is an event-driven action, so whenever a blob arrives at a particular location a function will be triggered. Service principals are for granting access to resources, so a blob trigger is independent of a service principal. If you have a service principal that you want to give access to your storage account, you can do that for sure. 👍🏻

  • @GenAIML_senseNsimplicity
    @GenAIML_senseNsimplicity 2 months ago

    The tutorial is very good, but the background music is pretty loud and annoying.

    • @pytalista
      @pytalista 2 months ago

      Sure 👍🏻 I am working on a version without music.

    • @pytalista
      @pytalista 2 months ago

      No problem, here is a version without music. Please like and subscribe. ua-cam.com/video/wUqohFhYHl0/v-deo.html

  • @meeta890
    @meeta890 2 months ago

    This tutorial is exactly what I was looking for. Thanks for sharing!

    • @pytalista
      @pytalista 2 months ago

      Glad it helped.

    • @meeta890
      @meeta890 2 months ago

      @@pytalista Please also upload a video on how to store API keys and secrets in a secret manager and use them in a Databricks notebook!

    • @pytalista
      @pytalista 2 months ago

      If you have an Azure account you can use this trick: ua-cam.com/video/alZWTehsTcg/v-deo.htmlsi=KGdbe5Y_8igGtDKS
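
In Databricks itself, once a Key Vault-backed secret scope exists, reading a secret in a notebook is a single call (the scope and key names here are placeholders):

    api_key = dbutils.secrets.get(scope="my-keyvault-scope", key="my-api-key")  # placeholders
    # the value is redacted if you try to print it in notebook output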

  • @vladimirshapran7286
    @vladimirshapran7286 2 months ago

    Great video. The only problem I have is that after deploying my app it doesn't respond to new files uploaded to the blob.

    • @pytalista
      @pytalista 2 months ago

      Have you tested locally? Did it work?

  • @nilesh8595
    @nilesh8595 2 months ago

    Hello, how do I pass a parameterized value rather than the static one you passed?

    • @pytalista
      @pytalista 2 months ago

      Create a parameter in DF

  • @smiley3239
    @smiley3239 2 months ago

    love this project!!! thanks for sharing 👍 😊

    • @pytalista
      @pytalista 2 months ago

      Welcome 🙏🏻

  • @smiley3239
    @smiley3239 2 months ago

    love this project!!! thanks for sharing

    • @pytalista
      @pytalista 2 months ago

      Glad you liked 😀

  • @gcunha
    @gcunha 2 months ago

    thanks!

  • @fitchmultz
    @fitchmultz 2 months ago

    The music is too loud in the background

    • @pytalista
      @pytalista 2 months ago

      Thanks for the feedback. Please like and subscribe

  • @marimuthukalyanasundram3151
    @marimuthukalyanasundram3151 2 months ago

    It's a nice explanation.

    • @pytalista
      @pytalista 2 months ago

      Thanks 🙏🏻

  • @ShawnEary
    @ShawnEary 3 months ago

    Cool. In the future, please create a post on how to run/debug .ipynb files cell by cell using the Databricks Connect V2 plugin.

    • @pytalista
      @pytalista 2 months ago

      Cool. Thanks for the tip 😉

  • @SAHITHTHATIPALLI
    @SAHITHTHATIPALLI 3 months ago

    Hi, I am facing a small issue. I do want to explicitly mention the container name in the code. Could you please help me with that?

    • @pytalista
      @pytalista 3 months ago

      Hi, if you want it explicit, it is easy: just put the full blob name. I uploaded the code to the repository: github.com/pedrojunqueira/PytalistaYT/blob/master/Python/storage_trigger/function_app.py

    • @SAHITHTHATIPALLI
      @SAHITHTHATIPALLI 3 months ago

      @@pytalista I do not want to explicitly mention it. Currently I am using environment variables.

    • @pytalista
      @pytalista 3 months ago

      @@SAHITHTHATIPALLI OK, good.

  • @roxanedebruyker2591
    @roxanedebruyker2591 3 months ago

    Thanks for the video, very nicely done! It did not work at first for me because I had to specify the connection string for AzureWebJobsStorage in my local.settings.json. Also, the path in function_app did not require the storage account name, only the blob container name, which ended up being "blob_container_name/{name}" (I wanted my function to be triggered whether or not I received a CSV; otherwise it would have been "blob_container_name/{name}.csv").

    • @pytalista
      @pytalista 3 months ago

      Well done, Roxane. Great it worked in the end. Life of a programmer 🧑‍💻😀