06. Azure using Python SDK : Azure Blob Trigger Function in Action

COMMENTS • 19

  • @TechyTacos
    @TechyTacos  10 months ago +1

    Note: when your function app runs on the default Consumption plan, there can be a delay of up to several minutes between the blob being added or updated and the function being triggered. If you need low latency in your blob-triggered functions, consider running your function app on an App Service plan.
    Please refer to this: learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function

  • @1982Dibya
    @1982Dibya 2 months ago

    Awesome video, you explained it very well. Hats off!

  • @ammadkhan4687
    @ammadkhan4687 7 months ago +1

    Your whole playlist is very useful and comprehensive. I am new to Azure infrastructure and trying to learn it. I have two questions; if you could answer them I would really appreciate it. 1. I have a complete program written in Python that generates PDFs at the end. For this I use one regex reference file, which is basically a JSON file. My idea is to upload my script to a Function App along with the reference JSON file and call the Function App whenever a queue item is listed. I later want to use the function's output in a Logic App. 2. Do I have to create a service principal app? And can I use one service principal app for all the functionalities I need, for example for the queue, blob, and Function App? I thank you once again for your effort; I know it takes real effort to create such content. Best regards, Ammad

    • @TechyTacos
      @TechyTacos  7 months ago

      Thank you for your kind words!
      1) You can definitely upload your Python script along with the JSON reference file to an Azure Function App. The function can be triggered whenever there is a new item in the queue. To read the JSON file in your function, you can use Python's built-in json module. Once the function is triggered and the PDFs are generated, you can use the output in a Logic App.
      2) A service principal is an application within Azure Active Directory whose authentication tokens can be used as the client_id, client_secret, and tenant_id during Azure service authentication. You can certainly use one, but it's not mandatory, as there are other authentication/authorization mechanisms as well, such as DefaultAzureCredential. You can use a single service principal for multiple functionalities like the queue, blob, and Function App, as long as it has the appropriate permissions. The key thing is that your service principal must be assigned the appropriate RBAC roles.
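A minimal sketch of point 1, loading the regex reference file with the built-in json module. The file name `patterns.json` and the patterns themselves are hypothetical; in a deployed function app the JSON would ship next to the function code and be opened relative to `__file__`:

```python
import json
import re
import tempfile
from pathlib import Path

# Hypothetical regex reference file; here it is written to a temp
# directory so the sketch is self-contained and runnable.
ref = {"invoice_id": r"INV-\d{6}", "total": r"Total:\s*\$?(\d+\.\d{2})"}
ref_path = Path(tempfile.mkdtemp()) / "patterns.json"
ref_path.write_text(json.dumps(ref))

def load_patterns(path):
    """Load the reference file once and pre-compile every pattern."""
    with open(path) as f:
        return {name: re.compile(pattern) for name, pattern in json.load(f).items()}

patterns = load_patterns(ref_path)
match = patterns["total"].search("Total: $42.50")
print(match.group(1))  # 42.50
```

Loading and compiling the patterns once per function instance (rather than per invocation) avoids re-reading the file on every queue message.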

  • @rudrasingh2850
    @rudrasingh2850 2 months ago

    Hi, can we make the blob trigger fire in an Azure Function when data is added to Blob Storage, using a service principal instead of a connection string?
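For what it's worth, newer versions of the storage extension support identity-based connections, so a blob trigger can authenticate with a service principal instead of a connection string. A sketch of the app settings, shown here as a local.settings.json fragment; the connection name `MyStorageConn`, the storage account name, and the placeholder values are all hypothetical, and the identity still needs the appropriate Storage Blob Data RBAC roles:

```json
{
  "Values": {
    "MyStorageConn__blobServiceUri": "https://mystorageaccount.blob.core.windows.net",
    "MyStorageConn__queueServiceUri": "https://mystorageaccount.queue.core.windows.net",
    "MyStorageConn__tenantId": "<tenant-id>",
    "MyStorageConn__clientId": "<client-id>",
    "MyStorageConn__clientSecret": "<client-secret>"
  }
}
```

The trigger binding then references the prefix via `"connection": "MyStorageConn"` instead of a connection-string setting.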

  • @thriller718
    @thriller718 20 days ago

    Hey, nice video. However, I am getting a "stream too long" error when I upload a 2.3 GiB file. Do you have any idea whether blob-triggered functions have a size limit for the files?

    • @TechyTacos
      @TechyTacos  20 days ago +1

      The "stream too long" error in Azure blob functions typically occurs when the file size exceeds the memory limits of the function. Azure blob-triggered functions do not have a specific size limit for the files they can process, but they do have memory constraints that can cause issues with very large files.
      Possible options:
      - Chunk the file.
      - Use a Durable Azure Function, which can handle long-running processes.
      - Try increasing the timeout and see if that works (learn.microsoft.com/en-gb/answers/questions/1433195/how-to-resolve-azure-function-stream-too-long-erro).
      - Instead of a blob trigger, use Azure Event Grid to trigger the function and then use the Azure Storage SDK to read the file in chunks.
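The chunked-read idea can be sketched without Azure at all. The pattern below uses an in-memory stream as a stand-in for the blob; with azure-storage-blob, the same loop body would consume `blob_client.download_blob().chunks()` instead:

```python
import hashlib
import io

CHUNK_SIZE = 4 * 1024 * 1024  # process 4 MiB at a time instead of the whole file

def process_in_chunks(stream, chunk_size=CHUNK_SIZE):
    """Hash a file-like stream chunk by chunk so the full blob never
    sits in memory at once."""
    digest = hashlib.sha256()
    total_bytes = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        digest.update(chunk)
        total_bytes += len(chunk)
    return total_bytes, digest.hexdigest()

# Stand-in for a 10 MiB blob download:
size, _ = process_in_chunks(io.BytesIO(b"x" * (10 * 1024 * 1024)), chunk_size=1024 * 1024)
print(size)  # 10485760
```

Peak memory stays around one chunk regardless of blob size, which is what sidesteps the "stream too long" failure mode.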

    • @thriller718
      @thriller718 19 days ago

      @TechyTacos Thanks for the reply. It looks like it tries to load the whole file into memory, and because of the size constraints it crashes before I can do anything with the file. What I ended up doing is: upload the file to blob storage from the frontend, then trigger an HTTP function, passing the filename in the body. The HTTP function downloads the file from the blob using the SDK and does the further processing. Is there anything else I should keep in mind with this approach? Cheers

  • @shreyaroraa2234
    @shreyaroraa2234 8 months ago +1

    Future video idea: create a video on v2 as well.

    • @TechyTacos
      @TechyTacos  8 months ago

      Sure. Thank you for the suggestion!

    • @TechyTacos
      @TechyTacos  6 months ago

      The v2 video is live now! ua-cam.com/video/OIk3NXxIg9E/v-deo.html

  • @saurabhjain507
    @saurabhjain507 8 months ago

    Why the v1 programming model and not the more recent v2?

    • @TechyTacos
      @TechyTacos  8 months ago

      No problem. I personally haven't explored v2 much.

    • @TechyTacos
      @TechyTacos  6 months ago

      v2 video is live now. ua-cam.com/video/OIk3NXxIg9E/v-deo.html

  • @xyx4641
    @xyx4641 6 months ago

    How did you make function.json?

    • @TechyTacos
      @TechyTacos  6 months ago

      function.json is created automatically as soon as you complete all the config steps; you don't need to create it explicitly.
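For reference, a typical v1-model function.json that the tooling generates for a blob trigger looks like this; the binding name `myblob` and the container path `samples-workitems/{name}` are the scaffold defaults, not requirements:

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

If the file is missing, check that the function was created through the blob-trigger template rather than as an empty folder.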

    • @xyx4641
      @xyx4641 6 months ago

      @TechyTacos It's not getting created.

    • @xyx4641
      @xyx4641 6 months ago

      Can you help me? I need urgent help, it's a prod thing.