Infoloomia
India
Joined September 4, 2018
This channel is created for Learning with Laughter. It includes fashion, fun, adventure, and learning video series, with no age restriction.
Challenges before Microsoft Fabric!
Before the introduction of unified platforms like Microsoft Fabric, traditional data analysis typically involved a more fragmented and siloed approach. Different tools and technologies were used for various parts of the data pipeline, often leading to inefficiencies, complexity, and delays in getting actionable insights. Here’s how traditional data analysis looked before platforms like Microsoft Fabric streamlined the process:
1. Data Collection and Ingestion
Multiple Tools: Different tools were used to collect and ingest data from various sources.
Manual Processes: Data was often manually extracted from databases, external sources (e.g., Excel, CSV files), or APIs.
Batch Processing: Data ingestion usually happened in batches (e.g., daily or weekly), as in the sketch below.
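To make the manual flavor of this concrete, here is a minimal sketch of a hand-run batch load, assuming a hypothetical daily CSV export and a SQL Server staging table (the file name, table, and connection string are illustrative, not taken from the video):

import csv
import pyodbc  # pip install pyodbc

# Illustrative connection string; a real one pointed at an on-premises server
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=StagingDB;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Read yesterday's export (hypothetical file) and collect the rows
with open("sales_export.csv", newline="") as f:
    reader = csv.DictReader(f)
    rows = [(r["OrderID"], r["Amount"], r["OrderDate"]) for r in reader]

# Load the batch into a staging table (hypothetical schema)
cursor.executemany(
    "INSERT INTO dbo.SalesStaging (OrderID, Amount, OrderDate) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
conn.close()

A script like this was typically run on a nightly schedule, so any report built on the table was at best a day old.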
2. Data Storage
Relational Databases: The primary method of data storage was relational databases (RDBMS), such as SQL Server, MySQL, or Oracle.
Data Warehouses: Organizations also used data warehouses like Teradata, Amazon Redshift, or Snowflake for storing large volumes of structured data optimized for reporting and analytics.
Data Silos: Data was stored in isolated silos, spread across multiple databases or warehouses, which made it difficult to unify the data for comprehensive analysis. There was often no central repository (like OneLake in Microsoft Fabric), making data integration challenging; the sketch after this list shows the kind of client-side stitching this forced.
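As an illustration of the silo problem, unifying two sources often meant pulling each extract to the analyst's machine and joining there. The DSN names, tables, and columns below are hypothetical:

import pandas as pd  # pip install pandas pyodbc
import pyodbc

# Two separate connections, because the data lives in two separate silos
sales_conn = pyodbc.connect("DSN=SalesWarehouse")  # hypothetical DSN
crm_conn = pyodbc.connect("DSN=CrmDatabase")       # hypothetical DSN

sales = pd.read_sql("SELECT CustomerID, Amount FROM dbo.Sales", sales_conn)
customers = pd.read_sql("SELECT CustomerID, Region FROM dbo.Customers", crm_conn)

# The join happens on the analyst's machine, not in either source system
report = sales.merge(customers, on="CustomerID").groupby("Region")["Amount"].sum()
print(report)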
3. Data Transformation and Preparation
ETL/ELT Tools: Data transformation was handled by separate ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) tools. These tools transformed raw data into a format suitable for analysis, often requiring technical expertise to build and maintain complex pipelines.
Custom Coding: In many cases, organizations relied on custom scripts written in Python, SQL, or Java to clean and transform data. This required highly skilled data engineers, and even small changes could break workflows or trigger lengthy updates (see the sketch after this list).
Time-Consuming Processes: Data preparation often took a significant amount of time, causing delays in analysis. Teams would spend weeks or months cleaning and transforming data before they could start analyzing it.
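A typical hand-rolled cleaning script looked roughly like the pandas sketch below; the file name, column names, and cleaning rules are assumptions for illustration:

import pandas as pd  # pip install pandas

# Load a raw extract (hypothetical file and columns)
df = pd.read_csv("raw_orders.csv")

# Hand-written cleaning rules, each one a potential breaking point
df["OrderDate"] = pd.to_datetime(df["OrderDate"], errors="coerce")
df = df.dropna(subset=["OrderID", "OrderDate"])        # drop unusable rows
df["Amount"] = df["Amount"].fillna(0).astype(float)    # normalize amounts
df["Country"] = df["Country"].str.strip().str.upper()  # standardize casing

# Hand the cleaned file to the next stage of the pipeline
df.to_csv("clean_orders.csv", index=False)

If an upstream system renamed a column, every script like this had to be found and updated by hand.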
4. Data Analytics and Reporting
Multiple Analytics Tools: Analysts used different tools for different types of analysis. For example:
Excel: Widely used for basic data analysis and reporting.
SQL Queries: Analysts ran manual SQL queries against databases to extract insights.
Business Intelligence (BI) Tools: Tools like Tableau, Power BI, QlikView, or SAP BusinessObjects were used for creating visual reports and dashboards. These tools were often disconnected from the data engineering workflow, leading to manual data extraction steps (an example of this hand-off is sketched after this list).
Advanced Analytics: For advanced analytics, data scientists might use R, Python, or MATLAB to run complex statistical analyses or machine learning models. However, this required moving data between systems, increasing complexity.
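For instance, feeding a dashboard or slide deck often meant manually running a query and exporting the result to Excel, along these lines (the DSN, query, and file name are assumed for the example):

import pandas as pd  # pip install pandas pyodbc openpyxl
import pyodbc

conn = pyodbc.connect("DSN=ReportingDB")  # hypothetical DSN

# The analyst runs an ad hoc query by hand...
monthly = pd.read_sql(
    """
    SELECT YEAR(OrderDate) AS Yr, MONTH(OrderDate) AS Mo,
           SUM(Amount) AS Revenue
    FROM dbo.Orders
    GROUP BY YEAR(OrderDate), MONTH(OrderDate)
    ORDER BY Yr, Mo
    """,
    conn,
)

# ...then exports the result to Excel to paste into a report or dashboard
monthly.to_excel("monthly_revenue.xlsx", index=False)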
5. Data Sharing and Collaboration
Isolated Workflows: Analysts, data scientists, and business users often worked in silos. Each team would work with their own set of tools, leading to fragmented workflows and duplication of effort.
Manual Reporting: Reports were typically shared via email, PDFs, or PowerPoint slides, making collaboration difficult and prone to versioning issues.
Slow Time-to-Insight: Because the processes were fragmented, from data ingestion to reporting, organizations experienced a long time-to-insight, slowing decision-making.
6. Governance and Security
Disparate Security Models: Since organizations used multiple tools and databases, managing access control and data security was cumbersome. Each system had its own security configurations, increasing the risk of data breaches or unauthorized access.
Manual Data Governance: Data lineage, cataloging, and governance were often manual processes or spread across different tools. This made it hard to track how data was used, where it originated, and whether it complied with regulations like GDPR or HIPAA.
7. Scalability and Maintenance
Infrastructure Constraints: Traditional data platforms were often on-premises, meaning scalability was limited by physical hardware. Scaling up required significant investment in infrastructure, making it difficult for organizations to quickly adapt to growing data needs.
Complex Maintenance: Maintaining the systems (updating software, ensuring hardware capacity, optimizing performance) required dedicated IT teams and added to operational costs.
8. Real-Time Processing
Lack of Real-Time Capabilities: In traditional systems, real-time data processing was rare or difficult to achieve. Most organizations relied on batch processing, which meant reports were often based on outdated data.
Streaming Data Complexity: Handling real-time or streaming data required specialized tools like Apache Kafka or Azure Stream Analytics, adding another layer of complexity to the architecture; the sketch below gives a flavor of that extra layer.
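Even a bare-bones consumer (sketched here with the kafka-python package; the topic name, broker address, and message fields are assumptions) lived entirely outside the batch pipeline:

import json
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic and broker; production setups also needed consumer groups,
# offset management, error handling, and a separate serving layer
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    order = message.value
    print(f"Streaming order {order['OrderID']}: {order['Amount']}")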
Views: 4
Videos
Microsoft Fabric | The Future of Data Analytics | DP-600 | DP600
39 views · 2 months ago
Microsoft Fabric | The Future of Data Analytics | DP-600 | DP600 Infoloomia
Salesforce Admin Complete Training | Salesforce Administration | Salesforce Admin Tutorial| Best CRM
47K views · 2 years ago
This is complete Salesforce Admin training material where we will cover all Salesforce Administration topics (Salesforce Admin Tutorial). This is the best CRM in the industry. Here you can learn how to become a Salesforce Administrator. Timeline: 00:00:00 Salesforce Introduction 00:00:00 Getting Prepared 00:04:57 Salesforce Clouds 00:08:59 Salesforce Editions 00:11:48 Planning the Transition 00:15:38 ...
DP900 | DP-900 | DP 900 | Azure Data Fundamentals | How to Pass DP900 | DP900 Certification/Exam
379 views · 2 years ago
DP900 | DP-900 | DP 900 | Azure Data Fundamentals | How to Pass DP900 | DP900 Certification/Exam In this Video we will be learning about Microsoft Azure Data Fundamentals Certification | DP-900 | How to Easily Pass DP-900 Exam dp900,dp-900,DP 900,How to Pass DP900,DP900 Certification,microsoft azure data, DP900 Complete Course Content, DP 900 Complete Course Content, DP-900 Complete Course Cont...
Top 10 Certifications For 2022 | Top 10 IT Skills | Best IT Certification | Most Demanding IT JOB
179 views · 2 years ago
Top 10 Certifications For 2022 Highest Paying Jobs Best IT Certification Top 10 it skills in-demand for 2022, Top 10 it skills in-demand for 2022 in india, Top 10 it skills in 2022, Top 10 it skills in demand, Top 10 it skills for future Most Demanding IT JOBs Highest Paying Certifications Best IT Certifications Best Certification Courses Top 10 Certifications Certifications with highest salary...
How to Install Python on Windows | Python Installation | Latest Python Version | Python 3.10 | Infoloomia
60 views · 2 years ago
How to Install Python on Windows | Python Installation | Latest Python Version | Python 3.10 | Infoloomia Here we are going to show how to download and install Python 3.10 on Windows 10. Setting up your Windows 10 system for Python is relatively easy; we just need to follow some important steps to install Python on Windows 10. This tutorial is for beginners who want to learn How to Install Pyt...
Files Handling in Python | Python Files Handling | How to Read files in Python | Python Files IO
103 views · 2 years ago
Python is a popular programming language that can be used for a wide variety of domains like data science, machine learning, big data analytics, web & app development, and game development. What are files? A file is a storage unit, just like paper or documents in the real world. There are various types of files, like data files, text files, program files, media files (images and videos) an...
Install SQL Server 2022 | How to download SQL Server | Install SQL Server | SQL Server Installation
914 views · 2 years ago
This Video is created to show: Install SQL Server 2022, How to Install SQL Server, Install SQL Server, SQL Server Management Studio, SSMS, Download SQL Server 2022, Latest SQL Server Version, Latest SQL Version, SQL Server Installation SQL Server 2022 Preview builds on previous releases to grow SQL Server as a platform that gives you choices of development languages, data types, on-premises or ...
What is Azure Data Studio | Backup and Restore database using Azure Data Studio | Create Dashboard
380 views · 2 years ago
What is Azure Data Studio How to Install Azure Data Studio Backup and Restore Database ADS Azure Data Studio is a free Microsoft desktop tool, initially called SQL Operations Studio, that can be used to manage SQL Server databases and cloud-based Azure SQL Database and Azure SQL Data Warehouse systems. The lightweight software is designed to make routine database development, querying and admin...
Power Pivot in Excel | Power Pivot Connections | Power Pivot Data Models | Power Pivot | Power Query
564 views · 2 years ago
This course introduces Power Pivot and Power Query in Excel to any user who knows Excel and wants to create reports with more complex and larger data structures than a single table made of a few thousand rows. Power Pivot is available in Excel 2010 onwards, and it is at the center of Power BI, the Microsoft offering for self-service analytics. The course includes almost 2.45 hours of material. You do not ne...
Azure DP-203 Data Engineering | DP-203 Certification | DP203 | Azure DP203 | DP 203 | Azure Data
13K views · 2 years ago
Azure Data Engineering, Azure DP 203, DP203, DP-203, Azure Data Engineer, Azure DP 203 certifications, Azure DP-203 Complete Course, Azure Data Engineering Training, Azure Data Factory, how to pass DP-203, DP-203 videos, How to prepare for Azure data Engineer certification, Azure Data Engineer videos, Azure certification videos In this video, everything is explained - You need to know about the...
What is EDI in eCommerce ? (Electronic Data Interchange) | What is EDI
110 views · 2 years ago
Electronic data interchange tutorial, EDI, electronic data interchange,electronic data interchange in hindi,electronic data interchange example,electronic data interchange system,electronic data interchange in e commerce,what is the impact of ecommerce on electronic data interchange (edi)? ,basics of edi,benefits of edi, Infoloomia
BACK PAIN RELIEF | 10-15 MINUTES DAILY EXERCISE | BACKACHE | GET RID OF BACK PAIN
39 views · 2 years ago
BACK PAIN RELIEF | 10-15 MINUTES DAILY EXERCISE | BACKACHE | GET RID OF BACK PAIN
How to Pass Microsoft Azure DP203 Exam | Pass in the first attempt | Exam pattern | Topics in DP-203
336 views · 2 years ago
How to Pass Microsoft Azure DP203 Exam | Pass in the first attempt | Exam pattern | Topics in DP-203
How to pass Microsoft Azure DP 203 Exam? | Pass in the first attempt | Exam pattern & much more
1K views · 2 years ago
How to pass Microsoft Azure DP 203 Exam? | Pass in the first attempt | Exam pattern & much more
How to Create Tables in Microsoft SQL Server (SQL Server Management Studio) | SQL Server 2019
36 views · 2 years ago
How to Create Tables in Microsoft SQL Server (SQL Server Management Studio) | SQL Server 2019
Create Database in Microsoft SQL Server | Create SQL Server Database | Create User Database | SSMS
80 views · 2 years ago
Create Database in Microsoft SQL Server | Create SQL Server Database | Create User Database | SSMS
Install SQL Server 2019 | SQL Server Management Studio | SSMS | Installation SQL | Free Download
381 views · 2 years ago
Install SQL Server 2019 | SQL Server Management Studio | SSMS | Installation SQL | Free Download
System Databases in SQL Server | Type of Databases in SQL Server | SQL Server 2019 | Master Database
342 views · 2 years ago
System Databases in SQL Server | Type of Databases in SQL Server | SQL Server 2019 | Master Database
Data from SQL to Excel | Export Table from SQL to Excel | Export Query Result from SQL to Excel
76 views · 2 years ago
Data from SQL to Excel | Export Table from SQL to Excel | Export Query Result from SQL to Excel
SQL Server Restore Database | Download AdventureWorks | Restore Database AdventureWorks through SSMS
39 views · 2 years ago
SQL Server Restore Database | Download AdventureWorks | Restore Database AdventureWorks through SSMS
Views in SQL Server| Type of Views |Indexed Views| Partitioned Views| System Views| Updatable Views
527 views · 2 years ago
Views in SQL Server| Type of Views |Indexed Views| Partitioned Views| System Views| Updatable Views
Temporary Tables in SQL Server | Difference between Local and Global Temporary Tables | Temp Tables
74 views · 2 years ago
Temporary Tables in SQL Server | Difference between Local and Global Temporary Tables | Temp Tables
Very helpful video..
What the hell is this tutorial?
Yikes. Audio and video for me are way out of sync. It works well until we get into the Leads section; the rest of the video is out of sync.
Hello, thank you for all the content! It's amazing! I have a question though. If you post a comment on the lead page, will the lead see it or just your team? Because I didn't find the visibility permissions in the Chatter configurations. Thank you!
Audio is lagging. Has anyone faced the same issue?
Can I get the code for the project?
Hello, thank you for this. You were providing code in the description?
Extremely helpful, thank you. I'll definitely be referring to this in the future for issues I have.
Nice video, thank you. Can I have the slides?
You did not explain consistency in Cosmos DB properly.
The video is not clear; it's blurry.
Question: is it possible to build a web platform for specialized equipment rental on Salesforce Lightning? End users will be directed to a webpage where they can rent, sign a contract, and pay for the rental. They can create a profile or use a temporary profile. Phase 1 is basically a web commerce website, but I want to develop it into a platform for rental, service, etc. Is Salesforce Lightning good for this purpose, or should I look at Bubble instead? Many thanks
Hi dear! Thank you very much for this video that you put here for us; it is a great video. I have a question about how I can find admin jobs. Can you please help me?
May I know, is this the complete making of a Salesforce admin? If so, how many years of experience would it equip us with? Thank you.
This is enough content for administration with 4-5 years of experience.
@@infoloomia I appreciate your efforts and kindness 🙏 The word thank you wouldn't be enough, dear.
Great source of information for learning Salesforce, although the audio is not in sync with the video actions!
Close and try to rerun
@@infoloomia Still doesn't work.
I played the audio on my phone and a lagging video on my laptop with audio muted :D
This video is outstanding. Thank you so much. Hoping to take my exam in 2 weeks or so.
Best of luck!
How was your exam?
I took the admin exam today and passed! Your content was very good, in tandem with my Trailhead, Mike Wheeler, and a few other channels. Great video for reinforcing the content required for learning. Awesome!!!!
Any tips aside from Trailhead and Focus on Force exams?
Not much... just focus on each topic. The focus should not be to clear the exam; it should be that you enjoy the content, and all concepts should be clear. Watch it twice or thrice and clear your mind on each topic.
Can you please help me, how can I pass the test?
@@thememegh5106 Duplicate concepts in the app. Study what you need to pass and don't go off into things above or lateral to the admin exam; for instance, don't go too deep into app developer or consultant, stay in the breadth of admin material. Find good videos online that reinforce the expected material. Some test questions help with solidifying the material but aren't necessarily like the exam. Finally, consume your life in SF. Every moment of the day that you can, find an article, update, or some material to listen to or read to fill in any gaps. Basically, go to sleep with videos, watch good SF videos, and take a bunch of practice tests to LEARN how SF works. Go hard with automation. Good luck! Edit: can't believe I forgot this, but TRAILHEAD. I am currently a Ranger and in a couple weeks I will start a new SF sprint of learning. I'm hoping to achieve double Ranger by year's end. Gamify and be excited to grow with Trailhead and get better scores!
I feel like one of the characters from Aqua Teen Hunger Force grew up and became a Salesforce guru.
Comprehensive! The voice is an interesting choice but narrated extremely well. Wtf thanks!
Thank you, I am new and I need this.
Welcome!
Thanks for this great full course and your clear explanation!
Glad it was helpful!
Awesome tutorial, thank you for taking time out to do this 😀
Glad it was helpful!
please provide the links in the description
Best tutorial on YouTube by far. Thanks a bunch
Glad you think so!
OMG, I cannot believe I found this video… this helps me a lot as I prepare for the cert exam. I now have a better understanding of SF admin. Thank you for this video!
Now this is giving back to the community
I have just subscribed and I will share this with friends
Is this enough to help me grasp the Admin 201?
Yes... you just have to explore everything by logging into SF.
Just passed the exam with a score of 927/1000. Thanks for a great video. Keep up the good work.
Many congratulations 🎊
Dude, what resources did you use? Can you please explain?
kindly share the resources you used
@mannykhan7752 Please share your experience here
Hey there, pleased to meet you again. I am a new SQL learner with a simple question: how does a temporary table differ from a physical or permanent table? Does a temporary table exist only for the duration of the session, or is it temporary in some other sense? Thanks a lot.
Nice ❤️
The audio lags a bit behind the video. Excellent content.
Thanks for your input, we will try our best next time
Where can I get the complete content?
Go to the description box, where I have already given the URL. You can study from there and pass the exam. Good luck :)
Good Informative Video 👌
Thanks Buddy🥳
Valuable information..
Thanks buddy 😘
It's a good video to learn Data Engineering.
SQL to Excel
Demo - Database Creation in SQL Server
List of System Databases post installation
Topics/Content and Exam Pattern in DP-203 Exam
A complete guide about Views in MS SQL Server
Scripts used in the video:

-- Drop the local temp table if it exists
DROP TABLE IF EXISTS #Temp;

-- Create the temp table
CREATE TABLE #Temp
(
    EmpID   smallint,
    EmpName varchar(20)
);

-- Insert data into the temp table
INSERT INTO #Temp
SELECT 1, 'Tim Erickson'
UNION ALL
SELECT 2, 'David Boon'
UNION ALL
SELECT 3, 'Laura Rin';

-- Fetch data from the temp table
SELECT * FROM #Temp;

-- Alter the temp table
ALTER TABLE #Temp ADD DeptID smallint;

-- Update the temp table
UPDATE #Temp SET DeptID = 1;

-- Fetch the altered and updated data from the temp table
SELECT * FROM #Temp;

Note: if we use ## instead of # for the same table, it becomes a global temp table and is accessible from other sessions. This is the major difference between #Temp and ##Temp.
If anybody has any questions about Temporary Tables in SQL Server, feel free to comment on this.