DataVerse Academy
India
Joined 27 Jul 2023
Welcome to DataVerse Academy, your ultimate destination for everything related to Microsoft Fabric, SQL, Power BI, Azure, Spark, Python, and more! If you're passionate about data engineering and data science, or eager to embark on a journey of data exploration, you've come to the right place.
What to Expect:
On this channel, we delve into the fascinating world of data engineering, where we bring you in-depth tutorials, insightful discussions, and hands-on demonstrations. Whether you're a beginner looking to grasp the fundamentals or an experienced data enthusiast seeking advanced insights, our content caters to all skill levels.
About the Host:
I am Vishnu Chaudhary, a seasoned data engineer with a passion for teaching and sharing knowledge. With 7+ years of hands-on experience in the industry, I'm excited to guide you through the intricacies of data engineering and data science concepts.
Microsoft Fabric Spark Notebook - Learn PySpark and SparkSQL in 2 hr (Beginners Course) #microsoft
Microsoft Fabric End to End Solution - PySpark and SparkSQL Full Course For Beginners
Note: Download the whole folder.
Dataset and code link: 1drv.ms/f/s!Ak57yG1_Qg0iey_2DKTSnObu6Ts?e=QyMSNr
----- Video Outline -----
0:01 Introduction
0:15 Loading a File From Your Local System to the Lakehouse
1:40 Reading a CSV File Using PySpark in a Microsoft Fabric Notebook
4:02 Select and SelectExpr Functions in PySpark
7:30 WithColumn, WithColumns and WithColumnRenamed Functions in PySpark
13:02 Where and Filter Functions in PySpark
15:38 GroupBy and Aggregation Functions in PySpark
19:45 Joining Two DataFrames Using PySpark
25:34 Union and UnionByName in PySpark
32:06 Drop, DropDuplicates and Distinct Functions in PySpark
40:17 Write/Load Data to a Delta Table and Parquet File Using PySpark
45:16 Overwrite, Append, Ignore and Error Write Modes in PySpark
53:40 Create a Delta Table and Load Data Through a DataFrame in PySpark
01:00:34 Create a Delta Table with a Partition and Load Data into It Using a DataFrame in PySpark
01:10:57 Difference Between Managed and External Tables
01:19:34 Create a Delta Table Using SparkSQL
01:23:15 Create a Delta Table with a Partition Using SparkSQL and Insert Data from a View
01:28:48 DML Operations over a Delta Table Using SparkSQL
01:34:17 Temporary and Global Temporary Views in PySpark
01:38:32 Data Transformation over a Delta Table Using SparkSQL
01:50:01 Reading a Delta Table Using PySpark and SparkSQL
Microsoft Fabric data pipeline Script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html
Microsoft Fabric Data Lakehouse end-to-end project: ua-cam.com/video/gKdlsHm7QgU/v-deo.html
If you want to know more about the Microsoft Fabric Data Warehouse, please watch the videos below:
1. Create a Microsoft Fabric Data Warehouse: ua-cam.com/video/eFSrun_IhtA/v-deo.html
2. Load data into a Microsoft Fabric Data Warehouse using a Data Pipeline: ua-cam.com/video/Lk_Tt6qrDH0/v-deo.html
3. Create a table in a Microsoft Fabric Data Warehouse: ua-cam.com/video/WkVbKcZjgHU/v-deo.html
4. Load data into a Microsoft Fabric Data Warehouse using the COPY INTO command: ua-cam.com/video/WkVbKcZjgHU/v-deo.html
5. Transform and load data into a Microsoft Fabric Data Warehouse using a T-SQL procedure: ua-cam.com/video/Ew86sP4PZWw/v-deo.html
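To make the outline above concrete, here is a minimal PySpark sketch of the notebook flow it describes (read a CSV from the lakehouse Files area, transform, aggregate, write a delta table, query it with SparkSQL). It assumes the pre-created spark session that Fabric notebooks provide; the file path, column names, and table name are placeholders, not the exact ones used in the course.

# Minimal sketch of the notebook flow described in the outline above.
# File path, column names, and table names are placeholders.
from pyspark.sql import functions as F

# Read a CSV file uploaded to the lakehouse Files area
df = (spark.read
      .format("csv")
      .option("header", "true")
      .option("inferSchema", "true")
      .load("Files/sales.csv"))

# select / withColumn / filter
df = (df.select("OrderID", "Country", "Amount")
        .withColumn("AmountWithTax", F.col("Amount") * 1.18)
        .filter(F.col("Country") == "India"))

# groupBy and aggregation
summary = df.groupBy("Country").agg(F.sum("Amount").alias("TotalAmount"))

# Write to a managed delta table; mode can be overwrite, append, ignore or error
summary.write.format("delta").mode("overwrite").saveAsTable("sales_summary")

# The same table can then be read back with SparkSQL
spark.sql("SELECT * FROM sales_summary").show()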
Views: 5,599
Videos
Microsoft Fabric End to End Ecommerce Project - Building Medallion Architecture in Lakehouse #ai
Views: 27K · 10 months ago
Microsoft Fabric End to End Solution - Building Medallion Architecture in Lakehouse #microsoft #microsoftfabric #azure #lakehouse #dataengineering #modelling #facts #dimensions #technology #ai #chatgpt #india Note: Download the whole folder. Dataset and code link: 1drv.ms/f/s!Ak57yG1_Qg0iey_2DKTSnObu6Ts?e=QyMSNr Project chapters 0:01 Project Overview - Microsoft Fabric lakehouse ...
Microsoft Fabric Lakehouse: What are semantic models and how to connect to them through Power BI
Views: 12K · 10 months ago
Semantic Model in Microsoft Fabric Lakehouse: What is the SQL endpoint in the lakehouse? What is the default semantic model in the lakehouse? Microsoft Fabric data pipeline Script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric Data Lakehouse end-to-end project: ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about the Microsoft Fabric Data Warehouse, please watch the videos below: 1...
Microsoft Fabric Notebook: Difference Between External and Managed Tables #microsoftfabric #azure
Views: 1.9K · 10 months ago
Microsoft Fabric PowerBI | Lakehouse: How to create External Table, Managed Table and Parquet Table Microsoft fabric data pipeline script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric data Lakehouse end to end project - ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about Microsoft Fabric data Warehouse, Please watch below videos- 1. Create Microsoft Fabri...
Microsoft Fabric Notebook: What is the difference between Save and SaveAsTable #Microsoft #azure
Views: 1.1K · 10 months ago
Microsoft Fabric Lakehouse: How to create External Table, Managed Table and Parquet Table Microsoft fabric data pipeline script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric data Lakehouse end to end project - ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about Microsoft Fabric data Warehouse, Please watch below videos- 1. Create Microsoft Fabric Data war...
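As a rough illustration of the save() vs. saveAsTable() distinction this video covers (a sketch only; the sample dataframe, path, and table name are made up): save() writes delta files to a path without registering anything in the lakehouse catalog, while saveAsTable() also registers a table you can query by name.

# Sketch of save() vs. saveAsTable(); the sample dataframe, path and table
# name below are placeholders.
df = spark.createDataFrame([(1, "Alice"), (2, "Bob")], ["CustomerID", "Name"])

# save(): writes delta files to the given path only; nothing appears under
# Tables and the data cannot be queried by table name.
df.write.format("delta").mode("overwrite").save("Files/exports/customers_delta")

# saveAsTable(): writes the data AND registers a managed table in the
# lakehouse, so it shows up under Tables and can be queried by name.
df.write.format("delta").mode("overwrite").saveAsTable("customers")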
Microsoft Fabric Lakehouse: How to Connect to ADLS Gen2 Using Shortcut in Lakehouse
Views: 2.5K · 10 months ago
Microsoft Fabric Shortcut: How to Connect to ADLS Gen2 Using Shortcut in Lakehouse Microsoft fabric data pipeline script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric data Lakehouse end to end project - ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about Microsoft Fabric data Warehouse, Please watch below videos- 1. Create Microsoft Fabric Data warehouse:...
2 Microsoft Fabric DataflowGen2: Incremental load from Azure SQL DB to Warehouse using DataflowGen2
Views: 2.1K · 1 year ago
Microsoft Fabric Tutorial Series - How to do an Incremental Load from an Azure SQL database to the Warehouse using Dataflow Gen2. Microsoft Fabric data pipeline Script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric Data Lakehouse end-to-end project: ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about the Microsoft Fabric Data Warehouse, please watch the videos below: 1...
11 Microsoft Fabric Data Pipeline: How to do Incremental Load Using Copy Activity in data pipeline
Views: 7K · 1 year ago
Microsoft Fabric Tutorial Series- How to do Incremental Load Using Copy Activity in Microsoft Fabric data pipeline. Microsoft fabric data pipeline script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric data Lakehouse end to end project - ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about Microsoft Fabric data Warehouse, Please watch below videos- 1. Create...
10 Microsoft Fabric Data Pipeline: Load multiple tables from azure SQL database to Warehouse
Views: 3.1K · 1 year ago
Microsoft Fabric Tutorial Series- How to load multiple tables from azure SQL database to Warehouse using single copy activity. Microsoft fabric data pipeline script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric data Lakehouse end to end project - ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about Microsoft Fabric data Warehouse, Please watch below videos...
03 Data Modelling: OLTP vs OLAP (Data Warehouse Architecture) System in detail #datawarehouse
Views: 408 · 1 year ago
Difference between OLTP vs OLAP (Data Warehouse) System | Online Transaction Processing | Online Analytical Processing | Data Warehouse Architecture | OLAP Cube Data Modelling Tutorial: ER Diagram in DBMS | Entity Relationship Diagram | ER | How to draw an ER Diagram in DBMS | How to make an ER Diagram Description: Welcome to the "Mastering Data Modeling" playlist! In this series, we dive deep ...
02 Data Modelling: What is Conceptual, Logical and Physical Data Model in Detail?
Views: 1.7K · 1 year ago
Data Modelling Tutorial: What is Conceptual, Logical and Physical Data Model in Detail? | Data Mart |Data Modelling |Data Warehouse |OLAP and OLTP System |ER Diagram in DBMS | Entity Relationship Diagram | ER | How to draw an ER Diagram in DBMS | How to make an ER Diagram Description: Welcome to the "Mastering Data Modeling" playlist! In this series, we dive deep into the world of data modeling...
01 Data Modelling Tutorial: ER (Entity Relationship) Diagram in detail
Views: 1K · 1 year ago
Data Modelling Tutorial: ER Diagram in DBMS | Entity Relationship Diagram | ER | How to draw an ER Diagram in DBMS | How to make an ER Diagram Description: Welcome to the "Mastering Data Modeling" playlist! In this series, we dive deep into the world of data modeling, exploring everything from the fundamentals to advanced techniques. Whether you're a data analyst, data scientist, database admin...
Microsoft Fabric KQL: How to read data from a Kafka real-time stream and load data into the KQL Database
Views: 1.3K · 1 year ago
PowerBI Microsoft Fabric Tutorial: How to read data from a Kafka real-time stream and load data into the KQL Database. | Microsoft Fabric Real-Time Analytics | Microsoft Fabric KQL Database | Microsoft Fabric KQL Query set | Microsoft Fabric Event Streaming. Microsoft Fabric tutorial series: How to read data from an event stream and load data into the KQL Database. Microsoft Fabric data pipeline scri...
1 Microsoft Fabric Dataflow Gen2: How to read data from SharePoint Site and load data into Lakehouse
Views: 6K · 1 year ago
dataflow gen2 | Microsoft Fabric Data Pipeline - How to read data from SharePoint Site using Microsoft Fabric Dataflow Gen2 and load data into the Lakehouse Microsoft fabric data pipeline script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric data Lakehouse end to end project - ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about Microsoft Fabric data Wareho...
09 Microsoft Fabric Data Pipeline: How to use procedure activity in Microsoft Fabric Data Pipeline
Views: 1.4K · 1 year ago
Microsoft Fabric Tutorial Series- How to use procedure activity in Microsoft Fabric Data Pipeline to drop table in Microsoft Fabric Data warehouse. Microsoft fabric data pipeline script activity: ua-cam.com/video/gxoAWCtP07I/v-deo.html Microsoft Fabric data Lakehouse end to end project - ua-cam.com/video/gKdlsHm7QgU/v-deo.html If you want to know more about Microsoft Fabric data Warehouse, Plea...
005 Microsoft Fabric KQL: How to read data from an event stream and load data into the lakehouse
Views: 1.7K · 1 year ago
004 Microsoft Fabric KQL: How to read data from an event stream and load data into the KQL Database
Views: 748 · 1 year ago
003 Microsoft Fabric KQL Database: What is KQL Query Set and Create PowerBI Report from KQL Query
Views: 953 · 1 year ago
Microsoft Fabric KQL Database: How to Create a KQL database and upload a local file into the KQL Database
Views: 709 · 1 year ago
001 Microsoft Fabric KQL Database: Realtime Analytics, Event Streaming and KQL Database Overview
Views: 964 · 1 year ago
08 Microsoft Fabric Data Pipeline: How to Use Lookup and Foreach Activity in Data Pipeline
Views: 4.2K · 1 year ago
07 Microsoft Fabric Data Pipeline: How to use Dataflow gen2 in Microsoft Fabric Data Pipeline
Views: 1.3K · 1 year ago
002 Microsoft Fabric Tutorial: How to login to Microsoft fabric account using Personal Email
Views: 2.3K · 1 year ago
009 Microsoft Fabric Data warehouse: How to connect to warehouse through PowerBI Desktop #microsoft
Views: 1.6K · 1 year ago
008 Microsoft Fabric Lakehouse: How to Connect to Lakehouse through PowerBI Desktop #microsoftfabric
Views: 1.5K · 1 year ago
06 Microsoft Fabric Data Pipeline: How to use Teams activity in data pipeline #microsoftfabric
Views: 710 · 1 year ago
05 Microsoft Fabric Data Pipeline: How to use Outlook activity in data pipeline #microsoftfabric
Views: 1.2K · 1 year ago
04 Microsoft Fabric Data Pipeline: How to use Get metadata activity of data pipeline #microsoft
Views: 1.8K · 1 year ago
03 Microsoft Fabric Data Pipeline: Create schema and table using Script activity in data warehouse
Views: 2.9K · 1 year ago
03 Microsoft Fabric Dataflow gen2: Load/Get Data from Rest API into Lakehouse Using Dataflow gen2
Views: 1.7K · 1 year ago
GREAT VIDEO PLEASE CONTINUE MORE
df_sales = pd.read_excel("abfss://****/Files/Current/Sales*.xlsx", sheet_name="Sales") is generating an error. I am using the trial version of Fabric. It works if I mention the complete filename (without any *).
Sir, when I have ecommerce data in different domains and different systems, how can I import it?
Hi Vishnu, this was a great video. I am getting an error while using * after Sales: FileNotFoundError: [Errno 2] No such file or directory. Please help me with this.
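For the two wildcard errors above: pandas' read_excel does not expand glob patterns, so one workaround is to list the matching files yourself and concatenate them. A sketch, assuming the mssparkutils file-system helper available in Fabric notebooks and a placeholder folder path:

# Workaround sketch: list the Sales*.xlsx files explicitly, then concatenate.
# The folder path is a placeholder; mssparkutils is assumed to be available
# in the Fabric notebook runtime.
import fnmatch
import pandas as pd

folder = "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/Current"

files = [f.path for f in mssparkutils.fs.ls(folder)
         if fnmatch.fnmatch(f.name, "Sales*.xlsx")]

df_sales = pd.concat(
    (pd.read_excel(path, sheet_name="Sales") for path in files),
    ignore_index=True,
)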
Hi, I seem to get this error when I run the query after creating the bronze view: "Max iterations (20000) reached for batch Resolution, please set 'spark.sql.analyzer.maxIterations' to a larger value." Any help?
If I want to change the column format, what can I do? It doesn't have a Table View
well explained thanks
Well explained! Thank you!
Quick question: in case I decide to use an external table, will I be able to see the table created in the SQL analytics endpoint, or only when the table is managed?
I have followed your steps; the pipeline is successful but no data is getting loaded. I have additional columns configured in the source of the copy activity. Only those columns are loaded; all other columns coming from the source are not getting loaded.
Sir, can't this project be done in the free version of Microsoft Fabric?
It is done only in the free version.
The Next button is disabled.
🎉amazing
If my file gets new data, will it also work for the updated data, or do I need to create the dataflow again?
You can implement the incremental load inside Dataflow Gen2 as well.
Very informative! thank you
Excellent, great video..🎉
I just finished watching this video on using PySpark in Microsoft Fabric, and it was incredibly helpful! The explanations were clear, and the step-by-step walkthrough made complex concepts easy to understand. This is definitely a must-watch for anyone looking to get started with PySpark in Microsoft Fabric. Great job, and thank you for sharing this valuable content! I highly recommend subscribing to this channel. The content is consistently high-quality, with tutorials that are not only informative but also easy to follow. Whether you're a beginner or an experienced professional, this channel offers valuable insights and tips that can help you stay ahead in the data world. Don't miss out on the great content they’re putting out!
Thank you! 🙏
Nice video, it helped me to understand the Microsoft Fabric flow.
Thank you! 😊
Can we pass parameters in dataflow gen2
No, it’s not supported yet.
Very very good explanation Thank you
Will this work if we delete or update data in the source table?
It will work in the case of an update, not a delete, as it's an incremental approach, not CDC.
Hi, how can i edit a table from a semantic model in power bi's power query ?
Can you please explain how you get the connection path for the source here, like we did in ADF?
Thanks for your video, very intuitive and impressive. I have a question: if your Table_List table is saved as a delta table from a Python script and I would like to have the Max_value updated... I believe I will not be able to keep the stored procedure, right? If not, should I replace it with a notebook activity? Thanks.
Yes, you are right: you will not be able to use a stored procedure to update a delta table. You need to create a notebook for that.
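A sketch of what that notebook step could look like, using the Delta Lake Python API. Table_List and Max_value come from the question above; the other table and column names are assumptions.

# Replace the stored-procedure update with a notebook cell.
# Table_List and Max_value come from the question; other names are assumptions.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

# New high-water mark taken from the freshly loaded table (hypothetical name)
new_max = spark.table("sales").agg(F.max("ModifiedDate")).collect()[0][0]

# Equivalent of: UPDATE Table_List SET Max_value = <new_max> WHERE Table_Name = 'sales'
table_list = DeltaTable.forName(spark, "Table_List")
table_list.update(
    condition=F.col("Table_Name") == "sales",
    set={"Max_value": F.lit(new_max)},
)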
How do I create new columns and a dim date table in this model?
When I click on New Semantic Model, I am not able to see all those tables to select a table or all tables. Because of that, I am not able to create the semantic model. Could you please help me here? Thanks.
What's the error you are getting?
@DataVerse_Academy Thanks for your response. I am not getting any error, but I am not able to select any table to create my semantic model. Under 'Select all', it's not giving me the table names to select.
Please try the following once: Settings -> Admin portal -> Tenant settings -> Information protection -> 'Allow users to apply sensitivity labels for content' -> enable this. Then you will be able to create a semantic model through the lakehouse.
@vinaypratapsingh5815 Hi, I am facing the same issue: only 3 tables are shown for selection, the others are not shown for semantic model creation. Were you able to solve this problem?
Really helpful. Thanks.
Great explanation, thank you!
For Microsoft Fabric, how is the job market? Are there any calls?
Not right now, but you will see a lot of movement towards Fabric in the upcoming 1-2 years.
Thanks for the video... Gold_Product is still not included in the Code zip file. Can you please include it? Not as Important, but at the same time, can you include the Run_Load notebook?
Not able to access the azure blob storage link you have shared
The product script is missing in the data/code file, please upload it.
Nice one. Useful for me. Thanks.
Thank you, help me a lot with this short video.
how to read data from a REST API that has authentication tokens (Refresh tokens) into a Fabric lakehouse?
Hello sir, after line no. 23 it directly shows line no. 77; the middle part is skipped, so I'm not getting the code in between. Can you help with it?
Thank you so much!
Hello sir, thank you so much for providing these productive videos. Today I faced a challenge whose solution I couldn't find elsewhere: how to extract data from SAP HANA Cloud to Microsoft Fabric (cloud-to-cloud connectivity). Could you please help me here?
So I did the same with an on-prem SQL Server to the warehouse, and checked that every connection preview was working, but when I validate the activity I get this error: "Copying data from on-premises connection with staging storage in workspace is not supported. Please use external staging storage instead." What could be the solution?
Is there a way to connect to ADLS directly without creating Shortcut in Lakehouse
Yes, you can use the service principal method.
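A sketch of that service principal approach, using the standard Hadoop ABFS OAuth settings. All account, tenant, client, and path values below are placeholders; in practice the client secret would come from a key vault rather than plain text.

# Read ADLS Gen2 directly with a service principal, without a lakehouse shortcut.
# All values are placeholders.
account = "<storageaccount>"

spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{account}.dfs.core.windows.net", "<app-client-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{account}.dfs.core.windows.net", "<client-secret>")
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{account}.dfs.core.windows.net",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)

df = spark.read.parquet(f"abfss://<container>@{account}.dfs.core.windows.net/raw/sales/")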
Sir, how can we build a JDBC/pyodbc connection between a Fabric Data Warehouse and a Fabric notebook? I have been looking for this for a long time, but unsuccessfully.
But why do you need it, what is the use case which you are trying to implement?
1. Initially, we are getting data from multiple sources and sinking it into one warehouse (raw data). 2. Now we want to extract data from this warehouse (raw data) into another warehouse (transformed data) through a notebook, where we will perform our transformation logic. Hence, I want to build the connection between the warehouse and the notebook using only JDBC or pyodbc.
Hi, I have some complex scalar user-defined functions defined in MySQL and I have to migrate them to Fabric, but as of now Fabric doesn't support the creation of scalar user-defined functions in the warehouse. In this scenario, please let me know which alternative options I can use. Thanks.
You can build that logic inside a procedure. I know you will not be able to return a value as you would with a function, but you can build whatever logic you are trying to build. If you can give me the context, then I will provide you the code as well.
Sir, Can we extract the data directly from warehouse to the notebook, then transform it, and then finally save it to the same warehouse??
You can read and transform the warehouse data inside a notebook, but you can't write the data back into the warehouse.
Hi, good morning! I have to convert existing SQL Server stored procedures to the Fabric environment. In my stored procedures there are CURSOR commands, but Fabric doesn't support CURSOR commands. In this case, how do I proceed? Is there any alternative?
You can use a WHILE loop for that.
Hi sir, can we use the Get Metadata activity instead of the Lookup activity to perform the same operation and get the same result? Can the Get Metadata activity do the same work as the Lookup activity does?
No, you will not be able to write a query inside the Get Metadata activity.
Thanks for the great explanation, but I didn't get one thing: where is the metadata for a managed table getting created? I can't see any files created in the Files folder, while for an external table we can see them in the path provided.
For managed tables, the metadata and data are inside the Tables folder itself.
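A small sketch of that distinction in code (the sample dataframe, path, and table names are placeholders): a managed table keeps data and metadata under the lakehouse Tables folder and dropping it removes the data, while an external table points at a path you control, for example under Files, and dropping it leaves the files in place.

# Managed vs. external table sketch; dataframe, path and table names are placeholders.
df = spark.createDataFrame([(1, "Alice"), (2, "Bob")], ["CustomerID", "Name"])

# Managed: data and metadata live under the lakehouse Tables folder;
# DROP TABLE removes the data as well.
df.write.format("delta").mode("overwrite").saveAsTable("customers_managed")

# External: metadata points at a path you choose (here under Files);
# DROP TABLE leaves the underlying files in place.
(df.write.format("delta")
   .mode("overwrite")
   .option("path", "Files/external/customers")
   .saveAsTable("customers_external"))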
this is great video. thanks
I have a SQL Server stored procedure which updates, deletes and merges data into a table. How do I convert the stored procedure to a PySpark job? Is it possible to update a table in Fabric using PySpark? Please make a video on this topic.
It's very easy to do the same thing in PySpark; we can do all the things you mentioned. I am on a break for a couple of months, but I am going to start creating videos again very soon.
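A sketch of that stored-procedure style DML in PySpark with the Delta Lake API; the table and column names here are hypothetical, not taken from the question.

# UPDATE / DELETE / MERGE on a delta table from PySpark; table and column
# names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

orders = DeltaTable.forName(spark, "orders")

# UPDATE orders SET Status = 'Shipped' WHERE Status = 'Packed'
orders.update(
    condition=F.col("Status") == "Packed",
    set={"Status": F.lit("Shipped")},
)

# DELETE FROM orders WHERE OrderDate < '2020-01-01'
orders.delete(F.col("OrderDate") < "2020-01-01")

# MERGE: upsert the staged rows on the key column
incoming = spark.table("orders_staging")
(orders.alias("t")
   .merge(incoming.alias("s"), "t.OrderID = s.OrderID")
   .whenMatchedUpdateAll()
   .whenNotMatchedInsertAll()
   .execute())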
@DataVerse_Academy Please do create a video when you are back from your break. Thanks.
Excellent !!!!! Do you have this type of video for SCD2 ?
I think for the DIM merges, just wrap the MERGE inside an INSERT INTO and change the UPDATE of the MERGE accordingly.
this guy is a champion!! Thanks so much :):)