Simple File API Using Azure Blob Storage (running locally)

  • Published 29 Sep 2024
  • ☄️ Master the Modular Monolith Architecture: bit.ly/3SXlzSt
    📌 Accelerate your Clean Architecture skills: bit.ly/3PupkOJ
    🚀 Support me on Patreon to access the source code: / milanjovanovic
    How can you build a simple file API on Azure? You can use the Azure Blob Storage service. It comes with a pre-built SDK, which is available as a NuGet package. In this video, I'll show you how to run Azure Blob Storage locally in a Docker container. Then, we will build a simple file API to upload, download, and delete files from Azure Blob Storage. (See the sketch below the description.)
    Check out my courses: bit.ly/3PupkOJ
    Join my weekly .NET newsletter:
    www.milanjovan...
    Read my Blog here:
    www.milanjovan...
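
    Not the video's exact source, but a minimal sketch of the kind of file API described above, assuming the Azure.Storage.Blobs NuGet package, .NET 8 minimal APIs, and Azurite running locally (for example via the mcr.microsoft.com/azure-storage/azurite Docker image). The route names and container name are hypothetical.

        // Program.cs: upload, download, and delete endpoints over Azure Blob Storage.
        // "UseDevelopmentStorage=true" is the shortcut connection string for the
        // local Azurite emulator.
        using Azure.Storage.Blobs;

        var builder = WebApplication.CreateBuilder(args);
        builder.Services.AddSingleton(new BlobContainerClient(
            connectionString: "UseDevelopmentStorage=true",
            blobContainerName: "files"));

        var app = builder.Build();

        app.MapPost("/files", async (IFormFile file, BlobContainerClient container) =>
        {
            await container.CreateIfNotExistsAsync();
            BlobClient blob = container.GetBlobClient(file.FileName);
            await blob.UploadAsync(file.OpenReadStream(), overwrite: true);
            return Results.Created($"/files/{file.FileName}", blob.Uri);
        }).DisableAntiforgery();

        app.MapGet("/files/{name}", async (string name, BlobContainerClient container) =>
        {
            BlobClient blob = container.GetBlobClient(name);
            if (!await blob.ExistsAsync()) return Results.NotFound();
            var download = await blob.DownloadContentAsync();
            return Results.File(download.Value.Content.ToArray(),
                download.Value.Details.ContentType, name);
        });

        app.MapDelete("/files/{name}", async (string name, BlobContainerClient container) =>
        {
            await container.GetBlobClient(name).DeleteIfExistsAsync();
            return Results.NoContent();
        });

        app.Run();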

COMMENTS • 37

  • @MilanJovanovicTech
    @MilanJovanovicTech  5 months ago +2

    Want to master Clean Architecture? Go here: bit.ly/3PupkOJ
    Want to unlock Modular Monoliths? Go here: bit.ly/3SXlzSt

  • @user-re6bu7dy1l
    @user-re6bu7dy1l 5 months ago +3

    Decent video, but it doesn't stand out, except for the local emulator. What's really interesting, and what I was looking for when I came across the video, is how to deal with large file uploads/downloads, and how we can make the blob upload and the DB write (e.g. persisting the file info) atomic. Maybe you could cover working with files and Blob Storage more in depth?

    • @MilanJovanovicTech
      @MilanJovanovicTech  5 months ago

      What would you consider a large file?
      You can't make it atomic, theoretically. Any one service in a distributed system can fail.
      What you can do is implement appropriate compensating actions, based on what failed.
      Another option I would consider is being "eventually consistent". Let's say, upload to Blob Storage and subscribe to the upload event to write the file info to the DB (sketched below).
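
      One way to realize the "eventually consistent" option above is an Azure Functions blob trigger. A hedged sketch, assuming the isolated worker model; IFileRepository and SaveFileInfoAsync are hypothetical names, not anything from the video.

          using Azure.Storage.Blobs;
          using Microsoft.Azure.Functions.Worker;

          public class OnFileUploaded
          {
              private readonly IFileRepository _repository;
              private readonly BlobContainerClient _container;

              public OnFileUploaded(IFileRepository repository, BlobContainerClient container)
              {
                  _repository = repository;
                  _container = container;
              }

              [Function("OnFileUploaded")]
              public async Task Run(
                  [BlobTrigger("files/{name}", Connection = "AzureWebJobsStorage")] Stream blob,
                  string name)
              {
                  try
                  {
                      // The DB write runs only after the upload has already succeeded.
                      await _repository.SaveFileInfoAsync(name, blob.Length);
                  }
                  catch
                  {
                      // Compensating action: the blob exists but the DB write failed,
                      // so delete the blob to bring the two stores back in line.
                      await _container.GetBlobClient(name).DeleteIfExistsAsync();
                      throw;
                  }
              }
          }

          // Hypothetical abstraction for persisting file metadata.
          public interface IFileRepository
          {
              Task SaveFileInfoAsync(string name, long sizeInBytes);
          }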

    • @user-re6bu7dy1l
      @user-re6bu7dy1l 4 months ago

      @@MilanJovanovicTech
      - What would you consider a large file?
      I would consider a file larger than the default MaxAllowedContentLength a large file, or one that exceeds the maximum block size for a single upload operation (which would require a multipart upload).
      - What you can do is implement appropriate compensating actions, based on what failed.
      This is the advanced-level content I'm looking for (see the sketch below).
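
      For the multipart upload mentioned in this thread, a minimal sketch using BlockBlobClient from Azure.Storage.Blobs.Specialized: stage the file in chunks, then commit the block list. The 4 MB chunk size is an arbitrary choice for illustration, not a recommendation from the video.

          using Azure.Storage.Blobs.Specialized;

          static async Task UploadInBlocksAsync(BlockBlobClient blob, Stream content,
              int blockSize = 4 * 1024 * 1024)
          {
              var blockIds = new List<string>();
              var buffer = new byte[blockSize];
              int bytesRead;
              int blockNumber = 0;

              while ((bytesRead = await content.ReadAsync(buffer)) > 0)
              {
                  // Block IDs must be Base64-encoded and equal-length across blocks.
                  string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber++));
                  blockIds.Add(blockId);

                  using var block = new MemoryStream(buffer, 0, bytesRead);
                  await blob.StageBlockAsync(blockId, block);
              }

              // The blob only becomes visible once the block list is committed.
              await blob.CommitBlockListAsync(blockIds);
          }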

  • @OgnjenBokic-c7e
    @OgnjenBokic-c7e 4 months ago

    Great content.
    Can this be implemented in a Dockerized application on a Linux server?

    • @MilanJovanovicTech
      @MilanJovanovicTech  4 months ago

      Yes - but I don't think you want to run your own instance of Azurite.

  • @tbalakpm
    @tbalakpm 5 months ago +3

    Blog -> Blob

    • @MilanJovanovicTech
      @MilanJovanovicTech  5 months ago +1

      Rofl, that autocorrect 🥲😂 Thanks for saving the day!

  • @sunzhang-d9v
    @sunzhang-d9v 5 months ago

    Unhappy. Why not modular content?

  • @i3looi2
    @i3looi2 2 months ago

    This is bad practice for big files. You simply proxy the file through the SERVER/API instead of going directly to Blob Storage.
    If you need to upload GB++ files, you bottleneck bandwidth and CPU time for no reason.

    • @MilanJovanovicTech
      @MilanJovanovicTech  2 months ago

      There are also pre-signed URLs, so you can hit Blob Storage directly (see the sketch below).
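
      A hedged sketch of the pre-signed URL approach with a SAS (shared access signature): the API hands out a short-lived URL and the client uploads against Blob Storage directly, so file bytes never flow through the server. Assumes the client was created with a credential that can sign, e.g. a connection string with an account key; the method name is hypothetical.

          using Azure.Storage.Blobs;
          using Azure.Storage.Sas;

          static Uri CreateUploadUri(BlobContainerClient container, string fileName)
          {
              BlobClient blob = container.GetBlobClient(fileName);

              // Create/write permission, valid for 15 minutes; the client PUTs the bytes itself.
              return blob.GenerateSasUri(
                  BlobSasPermissions.Create | BlobSasPermissions.Write,
                  DateTimeOffset.UtcNow.AddMinutes(15));
          }

      Because the URL is issued by your API, that endpoint is also where per-user checks (allowed upload size, permissions) can run before the client ever touches storage.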

  • @sreenidhisalugu6498
    @sreenidhisalugu6498 2 months ago

    Could you please make a video on sending a file from Angular and storing it locally in Azure Storage?

  • @harkiratsingh358
    @harkiratsingh358 4 months ago

    No need for Docker, it now runs within Visual Studio.

  • @AcapellaNutella6
    @AcapellaNutella6 4 months ago

    I did all this, but I do not see "Files" in my Swagger document???

    • @MilanJovanovicTech
      @MilanJovanovicTech  4 months ago +1

      No idea, it should just "work".

    • @AcapellaNutella6
      @AcapellaNutella6 4 months ago

      @@MilanJovanovicTech I just used the classic Azure Functions pattern and did it that way. Kinda gave up on the minimalist approach. Good video though, it showed me how to do what I was trying to do.

  • @Ajmal_Yazdani
    @Ajmal_Yazdani 5 months ago +1

    Thanks for sharing, @Milan Jovanović. What should I consider if I have to upload very large (up to 10 GB) files?

    • @MilanJovanovicTech
      @MilanJovanovicTech  5 months ago +1

      Consider uploading directly to Blob Storage with pre-signed URLs

    • @Ajmal_Yazdani
      @Ajmal_Yazdani 4 months ago

      @@MilanJovanovicTech Could you please explain more? I believe we need to chunk the file into smaller pieces and upload them in parallel, but it doesn't look simple. Thoughts?

    • @user-re6bu7dy1l
      @user-re6bu7dy1l 4 months ago

      @@MilanJovanovicTech What about validation and running business logic? What if we need to check the allowed upload size for the user?

  • @Dragonet17
    @Dragonet17 5 months ago +1

    Are you going to create a course about Azure?

  • @investycoon-app
    @investycoon-app 4 months ago

    Can Azure Blob Storage handle access rights for files, or is that something we have to implement in the web application?

  • @gorgestv6340
    @gorgestv6340 5 months ago

    In my practice project, I use Cloudinary (I think it's an equivalent of Blob Storage). I save files in that storage, and in the database I save only the paths to these files. What do you think about this approach?

  • @regestea
    @regestea 5 months ago

    Thanks for your great content. I think we need a video on "How do I write a test for the Azure Blob service".

  • @ferventurart
    @ferventurart 4 months ago

    Great video Milan!

  • @yunietpiloto4425
    @yunietpiloto4425 5 months ago

    Great content as always. Thanks for sharing!

  • @itirush2701
    @itirush2701 5 months ago

    Please make a video on microservices 🙏