Microsoft Fabric End to End Ecommerce Project - Building Medallion Architecture in Lakehouse

  • Published 25 Dec 2024

COMMENTS • 71

  • @hariharanduraisamy4968 · 12 days ago

    Great video, please continue with more!

  • @kiranj8769 · 3 months ago

    Excellent, great video..🎉

  • @RiseHigh1988 · 10 months ago +1

    Good explanation 🙏🙏.

  • @narendrareddy3576 · 10 months ago

    Great one bro, keep uploading real-time scenarios.

    • @DataVerse_Academy · 10 months ago

      Sure, thank you!

    • @narendrareddy3576 · 10 months ago +1

      Thanks for the reply @DataVerse_Academy. How do we get the input file name in Fabric? I am trying the input_file_name() function but it's writing blank values.

  • @yveshermann · 7 months ago

    This guy is a champion!! Thanks so much :):)

  • @sonyvijai · 8 months ago

    Great Video. Crisp and clear

  • @gopalammanikantarao593 · 3 months ago

    Nice video, it helped me understand the Microsoft Fabric flow.

  • @Shreekanthsharma-t6x · 6 months ago

    This is a great video, thanks.

  • @ROHITKUMARGUJAR-k8t · 7 months ago

    Very good explanation.

  • @priyanthakarunathilake8030 · 4 months ago

    Really helpful. Thanks.

  • @KaraokeVN2005 · 8 months ago

    Great video, Thank you so much!!!!!!!!!!

  • @charles-sambo · 18 days ago +1

    Hi, I seem to get this error when I run the query after creating the bronze view: "Max iterations (20000) reached for batch Resolution, please set 'spark.sql.analyzer.maxIterations' to a larger value." Any help?
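The error above can usually be worked around by raising Spark's analyzer iteration limit before re-running the query. The config key and its default (20000) come straight from the error message; whether a larger value actually helps depends on how deeply nested the view's plan is, so treat this as a sketch, not a guaranteed fix:

```python
# Raise the analyzer iteration limit for the current Spark session before
# re-running the query against the bronze view. The config key is taken from
# the error message itself; 40000 is an arbitrary larger value.
spark.conf.set("spark.sql.analyzer.maxIterations", "40000")
```

If the error persists even with a much larger value, simplifying the view (fewer chained CTEs or views built on views) is the more robust fix.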

  • @rahuldhankhar5346 · 9 months ago +2

    Hey Vishnu, this was a great explanation of a real-time scenario. I have a scenario where I want to recursively read multiple files and create a combined data frame before the bronze load. I have tried different Spark examples but have not been successful so far. Please let me know if you come across a solution for this. Thanks.

    • @DataVerse_Academy · 9 months ago

      Sure, I will have a look.

    • @rahuldhankhar5346 · 9 months ago

      @DataVerse_Academy Btw, this problem applies to Excel as input with multiple sheets. For other file types such as JSON, Parquet, CSV etc. we can use the solution below:

      df = spark.read.load(
          'abfss path',          # placeholder path
          format='csv',
          header=True,
          recursiveFileLookup=True,
      )
      display(df.limit(10))
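For the Excel case raised above, recursiveFileLookup will not merge sheets, since each workbook is a single file containing many sheets. One possible approach (assuming pandas and openpyxl are available in the Fabric Spark runtime; combine_sheets is an illustrative helper, not from the video) is to read every sheet with pandas and union them before handing the result to Spark:

```python
import pandas as pd

def combine_sheets(sheets: dict) -> pd.DataFrame:
    """Union a dict of {sheet_name: DataFrame} (the shape returned by
    pd.read_excel(..., sheet_name=None)) into one frame, tagging each
    row with the sheet it came from."""
    frames = []
    for name, frame in sheets.items():
        frame = frame.copy()
        frame["source_sheet"] = name   # keep lineage per row
        frames.append(frame)
    return pd.concat(frames, ignore_index=True)

# In a notebook (the path is a placeholder):
# sheets = pd.read_excel("Files/current/sales.xlsx", sheet_name=None)
# df = spark.createDataFrame(combine_sheets(sheets))
```

pd.read_excel(..., sheet_name=None) returns a {sheet_name: DataFrame} dict, which is exactly the shape combine_sheets expects, so the same helper also works for a loop over multiple workbooks.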

  • @IEYdel · 7 months ago +1

    Super helpful! Do you have a video that shows the silver layer with an example of joining related data from heterogeneous data sources, with data cleansing and deduplication? :D Still, you are my hero Vishnu! Thank you for this video!

  • @Mediatube_Tech · 16 days ago

    Sir, when I have ecommerce data in different domains and different systems, how can I import it?

  • @sanishthomas2858 · 7 months ago

    Nice. Quick question: is the architecture presentation slide made in PowerPoint or some other software?

  • @Shreekanthsharma-t6x · 6 months ago

    Hi,
    I have some complex scalar user-defined functions defined in MySQL and I have to migrate them to Fabric, but as of now Fabric doesn't support creating scalar user-defined functions in the warehouse. In this scenario, please let me know what alternative options I can use.
    Thanks

    • @DataVerse_Academy · 6 months ago

      You can build that logic inside a procedure. I know you will not be able to return a value the way a function does, but you can build whatever logic you are trying to build.
      If you can give me the context, I will provide you the code as well.

  • @omkarkulkarni3346 · 10 months ago

    Nice video. My question: everything you showcased here can be done in Azure Synapse, so why is Fabric considered here? Is there anything Synapse can't do? What is the striking difference that should make businesses consider Fabric the front runner ahead of Synapse in the future?

    • @DataVerse_Academy · 10 months ago +2

      In Microsoft Fabric, everything is available in one place; you don't need to create separate services. One of the most important things is OneLake, where everything is integrated. If you have multiple departments, you don't need to create pipelines to move data from one department to another; just by granting access you can get the data.

    • @omkarkulkarni3346 · 10 months ago

      Can you elaborate a little more on the comparison with Azure Synapse?

  • @azobensadio260 · 10 months ago

    Thanks for the real-time scenarios, but at no point did you present Gold_Product, and the file isn't in the shared folder. Thanks for your feedback.

    • @DataVerse_Academy · 10 months ago +2

      For Gold_Product, check the video at 54:34, "Loading Product Dimension - Gold Layer".

    • @DataVerse_Academy · 10 months ago

      For files, please check the link in the description. Download the whole folder.

  • @tv.TheDogFather · 5 months ago

    Thanks for the video...
    Gold_Product is still not included in the code zip file. Can you please include it?
    Not as important, but at the same time, can you include the Run_Load notebook?

  • @priyankaparida321 · 5 months ago

    Hello Sir,
    After line no. 23 it jumps directly to line no. 77; the middle part is skipped, so I am not getting the code in between. Can you help with it?

  • @AnisurRahman-wm2ys · 7 months ago

    Excellent! Do you have this type of video for SCD2?

    • @tv.TheDogFather · 5 months ago

      I think for the DIM merges you can just wrap the MERGE inside an INSERT INTO and change the UPDATE branch of the MERGE accordingly.
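The SCD2 behaviour being discussed (close the current row, insert a new version) is easier to see in plain Python than in MERGE syntax. The sketch below is illustrative only; apply_scd2 and the column names are not from the video:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, today=None):
    """SCD2 sketch: dim_rows is a list of dicts carrying 'is_current',
    'valid_from' and 'valid_to'; incoming is a list of attribute dicts.
    A changed key gets its current row closed and a new version inserted;
    an unchanged key is left alone."""
    today = today or date.today().isoformat()
    by_key = {r[key]: r for r in dim_rows if r["is_current"]}
    for new in incoming:
        cur = by_key.get(new[key])
        attrs = {k: v for k, v in new.items() if k != key}
        if cur is not None and all(cur.get(k) == v for k, v in attrs.items()):
            continue  # no attribute change: nothing to do
        if cur is not None:  # close the existing version
            cur["is_current"] = False
            cur["valid_to"] = today
        dim_rows.append({key: new[key], **attrs, "is_current": True,
                         "valid_from": today, "valid_to": None})
    return dim_rows
```

In Spark SQL the same effect is typically achieved with a MERGE that updates the old row's flag plus an insert of the new versions, which matches the wrap-the-MERGE idea in the reply above.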

  • @John.Wick.221 · 7 months ago

    Where can I get more data sources like this?

  • @alwaysbehappy1337 · 17 days ago

    Hi Vishnu, this was a great video. I am getting an error while using * after Sales:
    FileNotFoundError: [Errno 2] No such file or directory
    Please help me with this.

  • @zohadaditto1307 · 2 months ago

    Sir, can't this project be done in the free version of Microsoft Fabric?

  • @ADhuidv · 6 months ago

    Sir,
    How can we build a JDBC/pyodbc connection between a Fabric data warehouse and a Fabric notebook?
    I have been looking for this for a long time, but without success.

    • @DataVerse_Academy · 6 months ago

      But why do you need it? What is the use case you are trying to implement?

    • @ADhuidv · 6 months ago

      1. Initially, we get data from multiple sources and sink it into one warehouse (raw data).
      2. Now we want to extract data from this warehouse (raw data) into another warehouse (transformed data) through a notebook, where we will perform our transformation logic.
      Hence, I want to build the connection between the warehouse and the notebook using only JDBC or pyodbc.
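For the warehouse-to-notebook connection discussed above, one hedged option is pyodbc against the warehouse's SQL (TDS) endpoint. The driver version, authentication mode and host name below are assumptions to adapt, not a confirmed recipe; only the connection-string builder is concrete:

```python
def fabric_odbc_conn_str(server: str, database: str) -> str:
    """Build a pyodbc connection string for a Fabric warehouse SQL endpoint.
    Driver version and auth mode are assumptions; adjust to your tenant."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
    )

# Hypothetical usage in a notebook (endpoint copied from warehouse settings):
# import pyodbc
# conn = pyodbc.connect(
#     fabric_odbc_conn_str("<workspace-endpoint>.datawarehouse.fabric.microsoft.com",
#                          "RawData"))
# rows = conn.cursor().execute("SELECT TOP 10 * FROM dbo.sales").fetchall()
```

Inside a Fabric Spark notebook, the built-in warehouse connector (spark.read.synapsesql) may cover the same raw-to-transformed copy without any ODBC setup at all.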

  • @vinaypratapsingh5815 · 4 months ago

    When I click on New Semantic Model, I am not able to see the tables, so I cannot select one table or all tables.
    Because of that I am not able to create the semantic model.
    Could you please help me here?
    Thanks

    • @DataVerse_Academy · 4 months ago

      What's the error you are getting?

    • @vinaypratapsingh5815 · 4 months ago

      @DataVerse_Academy Thanks for your response.
      I am not getting any error, but I am not able to select any table to create my semantic model. Under "Select all", it doesn't give me any table names to select.

    • @DataVerse_Academy · 4 months ago

      Please try the below once:
      Settings -> Admin portal -> Tenant settings -> Information protection -> "Allow users to apply sensitivity labels for content" - enable this.
      Then you will be able to create a semantic model through the lakehouse.

    • @julies5085 · 2 months ago

      @vinaypratapsingh5815 Hi, I am facing the same issue: only 3 tables are shown for selection, and the others are not available for semantic model creation. Were you able to solve this problem?

  • @ADhuidv · 5 months ago

    Hello sir,
    Thank you so much for providing these productive videos.
    Today I faced a challenge whose solution I couldn't find elsewhere:
    how to extract data from SAP HANA Cloud to Microsoft Fabric (cloud-to-cloud connectivity). Could you please help me here?

  • @pragatisharma6036 · 5 months ago

    The product script is missing from the data code file; please upload it.

  • @sularaperera2719 · 8 months ago

    Hi Vishnu, when creating a semantic model, Fabric gives an error saying "Unexpected error dispatching create semantic model to portal". Do you have any idea why? Thanks

    • @DataVerse_Academy · 8 months ago

      Do you have all the required access to create a semantic model in the workspace?

    • @sularaperera2719 · 8 months ago

      @DataVerse_Academy Yes, I have a free trial account. I had two workspaces for PySpark training sessions and one for your project. I had created a semantic model before when I did tutorials, but no luck creating a semantic model with your project. 😒

    • @DataVerse_Academy · 8 months ago

      The solution for this is to open the SQL analytics endpoint instead of the lakehouse; then you will be able to create the model.
      Microsoft has just recently changed some settings.

    • @DataVerse_Academy · 8 months ago

      Another solution:
      Settings -> Admin portal -> Tenant settings -> Information protection -> "Allow users to apply sensitivity labels for content" - enable this.
      Then you will be able to create a semantic model through the lakehouse.

  • @Shreekanthsharma-t6x · 6 months ago

    I have a SQL Server stored procedure which updates, deletes and merges data into a table. How do I convert the stored procedure to a PySpark job? Is it possible to update a table in Fabric using PySpark? Please make a video on this topic.

    • @DataVerse_Academy · 6 months ago +1

      It's very easy to do the same thing in PySpark; we can do all the things you mentioned. I am on a break for a couple of months, but I am going to start creating videos again very soon.

    • @Shreekanthsharma-t6x · 6 months ago

      @DataVerse_Academy Please do create a video when you are back from your break. Thanks

  • @longphamminh5804 · 7 months ago

    Why did you create the two folders "current" and "archive" in Files?

    • @DataVerse_Academy · 7 months ago +1

      To archive each processed file by moving it from the current folder to the archive folder.
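The archiving step described above boils down to computing a destination path and moving the file. The path helper below is plain Python; the actual move is sketched in a comment because mssparkutils only runs inside a Fabric/Synapse notebook, and the folder names are taken from the video's layout:

```python
def archive_path(src: str,
                 current_dir: str = "Files/current",
                 archive_dir: str = "Files/archive") -> str:
    """Map a file path under the current folder to its archive destination.
    Folder names match the video's lakehouse layout; adjust as needed."""
    if not src.startswith(current_dir + "/"):
        raise ValueError(f"{src!r} is not under {current_dir!r}")
    return archive_dir + src[len(current_dir):]

# In a Fabric notebook the move itself would be something like (hedged):
# from notebookutils import mssparkutils
# mssparkutils.fs.mv(src, archive_path(src))
```

Running the move after a successful bronze load means a re-run of the pipeline only picks up files that have not been processed yet.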

    • @longphamminh5804 · 7 months ago

      Thank you for the answer.

    • @longphamminh5804 · 7 months ago

      @DataVerse_Academy I have one more question: when should we use spark.read.table() and when spark.sql()?

  • @cargouvu · 10 months ago

    Where can I get the files to follow along?

  • @meradbhima3951 · 8 months ago

    Hey, this is a great video, but I am not able to open the code files you have given, so I am missing the code you used.

  • @CapitanFlintt · 9 months ago

    Hello, thanks a lot for the tutorial!
    But you forgot to upload the Gold_Product code in the zip file; can you upload it? Thanks

  • @lakshminathreddyyaddula · 8 months ago

    Is this a completely free course?