Wow! You have changed my understanding of Excel capabilities!!!
I used to use MS Access to do what you did with Power Query in seconds. I wonder if Access has much use still!
Brilliant video: super clear, concise, well produced and useful!
Thank you!
Thank you for the very kind comments. Access is indeed being slowly made obsolete by several other Microsoft technologies
OMG, I am sitting here Wide-Eyed and Overwhelmed with Joy...Thank you so much for this.....I have been racking my brain on how to reduce and control our data to a professional and organized manner!...AMAZING!!!
Great to know
SUBSCRIBED AND FOLLOWING
You took the words right out of my mouth! 😁 Following too.
Thanks @@natah1284
I've seen this a few times before, but no one has ever drilled down into the performance ROI... thank you, this is going to help a lot!
You’re welcome
I’m currently working on +11k rows of data and currently experiencing challenges. Glad I came across your YT channel 😃
Good to know, thanks for the kind comments
This is a powerful example from the Big Data era.
Amazing video! Thank you Wyn!
Glad you liked it Iván
I can't say how thankful I am for this video. I struggled for two days and everything I tried ended with "Not Responding". You saved me, thank youuuuuu🤩
Glad it helped
Amazing!! Working up a prototype model in MS Access but wanted to report it in excel with pivots and slicers. Was ready to give up and hand it over to the BI developers but tried your method. 6m rows and extremely responsive. Also used your whole number trick and file size only 22mb. Thanks heaps!
Awesome, I really appreciate you letting me know you found it useful
forgot about the decimal trick. That was awesome . So just by trimming the decimals to rounded numbers , you can reduce your file size. I am going to try that at work
Excellent
The 3 tips in Excel video took me here. Thank you so much for the short but very informative tutorial video. I was trying to compare almost 9,000 rows with a couple hundreds rows, and Excel kept saying “Not responding.” I used the transpose function which is very very helpful without knowing it’s Power Query.
You’re welcome
Short and precise explanation. This is amazing and thank you for providing data set for practice.
Thank you Byregowda, kind of you to leave a comment
Thank you so much Wyn for this video. I was able to take a 132 MB report and consolidate it down to just 9 MB! Now everyone in my company who does not have Excel 365 can view it as well as slice and dice.
Excellent to hear Phil. Thanks for letting me know this was useful.
@phil danley how did you consolidate it down to 9 MB??
The Power Pivot (Data Model) engine performs some amazing compression when columns contain non unique values
@@PEACOCKQ Wyn explained it better than I could. My answer would be "magic" LOL. I don't know how the compression works but it does.
Just amazing.. Your decimal rounding to reduce space worked brilliantly for my data model. The data model pulls from SSAS. Thank you so much.
Glad to help 😀
What a beautiful way of explaining how to handle big data. Loved this !!
Thank you Shiraj, glad you enjoyed it
Really nice intro to Data Model and why I should probably start using it. My base item set is 150k entries, and there are various dimension tables which can be attached, the numbers get big quickly and generic Excel M.O. starts choking immediately when you throw index(match()) at all of that. Thank you!
P.S. it took me far longer to understand those manager names than I would like to admit. :p
😄. Glad it helps Lee
This is fantastic. I feel like I have a new superpower. Great explanation video.
That’s great Phoebe! Thanks for letting me know.
Power Query, Data model, very efficient. It became much easier to work with such heavy data. Thank you so much!👍
Thanks for leaving a comment Luciano
Ohmygosh, that, my friend, was simply amazing!! 👏👏👏👏
I can't explain to you how many files I have that are so large that they respond slowly and then eventually crash! I've been trying to save these large files as .xlsb, and it helps, but it isn't fixing the ultimate problem. I cannot wait to try out this technique to see how it affects my monthly reports. Thank you so much for taking the time to go through this exercise. I am officially a subscriber!
That’s great, glad to help flag what Excel is capable of
You have no clue the amount of sleep I have lost trying to figure out how to control this amount of data
Very informative. I may be a little off topic, but what software do you use for making your videos? Thank you!
Camtasia and a green screen
Thank you very much Access Analytic for sharing your knowledge of handling big data using Power Query. God bless
Thank you
Very nice, Wyn! Thank you.
Cheers Houston
Very well explained and easy to understand. Thank you.
Thank you
Wyn, your content is always helpful and timely. Can you comment here about how to best use the same dataset but doing the analysis in Power BI? Since you don't have a Table, what is the connector with PBI?
Thanks Bernie. I’m not quite sure I follow. You can do it the same way in Power BI by pulling in the folder of CSV files.
This is AMAZING!!!! Still in awe.
Thanks for the brilliant content.
You’re welcome
Thanks, lesson completed. Greetings from Costa Rica.
No worries 🇨🇷
Awesome video!!!! How much RAM does your PC have to handle this? It seems very snappy and fast in the Pivot Tables. I have 32 GB of RAM and im wondering if that is good enough
Thanks 😀. I think that was done on 16GB. I use 32GB these days
That is SUPER AWESOME! 😍 Thank you so much, Wyn!
No worries, you're welcome Dimitris
Hi, thanks for this. Normally in a traditional pivot if we click total column, it will typically open another sheet containing the break-up of that total column. How this will work with power query and data model?
It can work to some extent; it depends on a few different scenarios, so it may or may not work, and it may limit how many records display
Well done. Thank you for the instruction.
Thanks
Thank you, as many say, short and precise. Perfect
Cheers Mark
Fantastic! Thank you! However I couldn't figure out where the consolidation tables came from. Is there a video about that and a link to it???❓❓
You're welcome Atomic Blue Life
Excellent tutorial, thank you!
Cheers!
This is helped me out TREMENDOUSLY! Thank you!
That’s great Joe. Thanks for letting me know
I love your video and was able to use every part of it, however, I do not understand how you incorporated the additional tables (cost centers, etc). I get the ability to build the references but how did you get them into the mix?
Hi Robert, watch this explanation of Data Models and let me know if that helps ua-cam.com/video/RV47yX70NN8/v-deo.html
You saved me a lot of time. Wonderful explanation.
You’re welcome Hussein
Thanks for this excellent video. Could you please tell me how you added calendar, location table and cost center table on Queries and Connections section at the beginning of the video? Thanks in Advance
Hi Debarshi,
Those were in an Excel file and I used Get Data from Excel Workbook.
Great job! Incredibly powerful!
Cheers
Wyn... this was amazing... I completely forgot about 3D Maps... great video!
Thanks for taking the time to leave a kind comment, I appreciate it.
One of your best videos 😊
Thank you
Pretty awesome, you are right!! Thank you, your videos are so easy to follow!
Thank you so much
This is great and very helpful! Thank you for making this. May I know what PC specification that you used? I want to buy a laptop that can be used to analyze big data.
Glad it helps Ivena. I'm using an XPS 17 9700 with 32 GB RAM and an i7 processor
Tremendous. Thanks a lot for knowledge sharing.
Thank you so much Raghavendra
This was mega! Thanks for sharing your knowledge mate! Appreciate it.
No worries, thanks for taking the time to leave a kind comment
is there a video on how you created the calendar, costcenter, and location table?
Here's a video on the Calendar table. ua-cam.com/video/LfKm3ATibpE/v-deo.html
The other 2 tables I just created manually in excel and pulled in using Power Query
You saved my life 🙏
Glad to help :)
I am grateful to you
Thanks for taking the time to leave a kind comment
How can I export the table created by Power Query to one CSV file?
Check out this video. ua-cam.com/video/op6f-3uUFYg/v-deo.html
If you’re using Power BI you can also use bravo.bi ( it’s free )
I’ve a video showing bravo.bi use here at 9:33 ua-cam.com/video/g4oZ0pOpn-4/v-deo.html
Great tutorial! Thank you
You're welcome Salvador
Great video! Do you have a video that show how to edit the data in massive data sets of up to 10 million rows? I've tried in vain to do it with a million rows using IF formulae (i.e if a record has "x" add the value "y" to field) but excel just gives up under the sheer weight of the formula...
Thanks. No video, but it really depends on how many columns you have, what the data source is, how much RAM you have, and whether you have 64-bit Excel. I just tested on 5 million rows from 10 CSV files in a folder, adding an IF column: if the value begins with "A", grab column B, else 0. It ran in 30 seconds.
32 GB RAM, 64-bit Excel, 5 columns of data.
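In case it's useful, here's a rough M sketch of that kind of conditional column. The file path and the column names ("Category", "Amount") are placeholders, not the ones from my test:
```
let
    // Load one of the CSV files (path is a placeholder)
    Source = Csv.Document(File.Contents("C:\Data\Sales01.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // IF column: if Category begins with "A", grab Amount, otherwise 0
    AddedIf = Table.AddColumn(
        Promoted,
        "Custom",
        each if Text.StartsWith(Text.From([Category]), "A") then [Amount] else 0
    )
in
    AddedIf
```
Adding the column in Power Query rather than as a worksheet formula keeps the work in the query engine, which is why it copes with millions of rows.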
Great video. Thank you 🎉
You’re welcome
Thank you for producing this excellent video - very easy to follow and very informative
You’re welcome Tim
How do you select data from a huge .csv file to only get rows for the current year (there is a field for that)? I've not discovered how I can do that with Data Model or Power Pivot. Additionally, if you select a second year, can we separate it into two different sheets?
In power query you can apply a filter to date column and choose IS Current Year.
Yes you could reference the main query two times, and filter the 2 new queries by the 2 different years and load to 2 different sheets
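A rough M sketch of that filter, assuming your main query is called Sales and the date column is called Date:
```
let
    Source = Sales,
    // Keep only rows whose Date falls in the current year
    CurrentYear = Table.SelectRows(Source, each Date.IsInCurrentYear([Date]))
    // For a specific year instead, the filter would be:
    //   Table.SelectRows(Source, each Date.Year([Date]) = 2022)
in
    CurrentYear
```
Each referencing query gets its own filter step and can then be loaded to its own sheet.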
I just came across this video to understand the capabilities of Power Query, and it has widened my horizons. Can I get a download link to the other data for the connections you created before the 10 million row one? I want to use that to understand it perfectly, if you don't mind.
I don't have the precise ones but I have extra data sets including the well known AdventureWorks one here under "Dummy Data Sets" accessanalytic.com.au/free-excel-stuff/free-excel-templates/
This was excellent! One question please. I have many CSV files like in this video, but their column structure is slightly different from each other. Some of them have a few extra columns that I don't need. Is it possible to tell the query which columns to take (I can only include the columns that I know exist in all files)? Thank you!
As long as the first file in your folder has all of the columns you need then it won’t matter that other files have extra columns you don’t need.
Make sure you use the Choose Columns or Right-Click Remove other columns option
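If you prefer to see it in the Advanced Editor, a sketch like this combines every CSV in a folder and then keeps only the columns found in the first file (the folder path is a placeholder):
```
let
    Files = Folder.Files("C:\Data\SalesFiles"),
    CsvOnly = Table.SelectRows(Files, each Text.Lower([Extension]) = ".csv"),
    // Parse each file's binary content and promote its headers
    Parsed = Table.AddColumn(CsvOnly, "Data",
        each Table.PromoteHeaders(Csv.Document([Content]), [PromoteAllScalars = true])),
    // Columns we care about = the columns of the first file
    WantedColumns = Table.ColumnNames(Parsed[Data]{0}),
    // Stack all files, then drop any extra columns the other files brought along
    Combined = Table.Combine(Parsed[Data]),
    Trimmed = Table.SelectColumns(Combined, WantedColumns, MissingField.Ignore)
in
    Trimmed
```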
@@AccessAnalytic Thank you
I loved this video. Incredible work. I have a very large CSV file with 1.2 million rows and approx 300 columns. I want to extract only a few particular columns from this data. How can it be done? Many thanks in advance.
Thanks Amit, that should be quite straightforward, use Get Data > From File > From Text/CSV connect to the file, then Ctrl Click on the columns to keep and then right-click REMOVE OTHER COLUMNS, then close and load to Data Model
Hi, thanks for your reply. Is it a good idea to select columns manually if I have around 300 columns? In that case I'd have to do a lot of scrolling. Any idea if I can extract the desired column header names from some other Excel or CSV file and then keep only those columns in my original large CSV file? Thanks.
Hi @@amitchaudhary6, yes it's possible but involves writing some M code. However, the Choose Columns button makes it really easy to tick the columns you want to keep / untick the ones you want to remove.
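A sketch of that M approach, assuming you've already loaded the wanted header names into a one-column query called ColumnList whose column is named Header (both names are just placeholders):
```
let
    Source = Csv.Document(File.Contents("C:\Data\BigFile.csv"), [Delimiter = ","]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Turn the ColumnList query's single column into a plain list of names
    WantedColumns = ColumnList[Header],
    // Keep only those columns; ignore any names that aren't in the big file
    Trimmed = Table.SelectColumns(Promoted, WantedColumns, MissingField.Ignore)
in
    Trimmed
```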
Thanks for your inputs. It is exactly what I was looking for. 👍
300 columns?!
How do I get it to use the Exact formula so that it consolidates against a unique code from two sets of databases by also recognising lower case and upper case differences. For example AbC and ABC need to be treated as two unique codes.
I don’t know sorry. The DAX engine encodes ABC and abc the same
this looks super cool, thank you very much :)
You’re welcome
Good Video. Very helpful and now ready to try it.
Thanks for leaving a comment Ed. Good luck with it!
Are the calendar, location, and cost center tables available for download?
The calendar and various datasets are available here: accessanalytic.com.au/free-excel-stuff/free-excel-templates/
Also check out pbi.guide/resources/
@@AccessAnalytic thanks very much
Fantastic, thank you for sharing your knowledge
You're welcome Ed, thanks for leaving a comment
Awesome! Thanks for your sharing.
No worries
Amazing stuff
Thank you
Is this possible in the older version of Excel?
Excel 2016 onwards
Thanks a ton! Got to see a practical Big Data example handled through Excel and it's amazing! 😊👍👌
Thanks Vijay
Thanks for your video! What if my data size is far more than that and I need to update it with new data every day? If I simply refresh the Power Query it takes an hour of loading time. Is it possible for Power Query to only add the new data to the existing data rather than refresh all of it? Millions of thanks!
Hi Warren, there's no way to do that with Excel currently. What is your data source?
So I have a table in MS Access that is close to 4 million records that I would love to export into Excel. Unfortunately Excel only allows a little over 1 million records per sheet. Would this work in that instance?
You could certainly pull the data into the Excel data model and then create Pivot tables to analyse it
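A minimal sketch of that connection in M, assuming a table called "Transactions" (both the path and table name are placeholders); when loading, choose Connection Only and tick "Add this data to the Data Model":
```
let
    // Connect to the Access database (path is a placeholder)
    Source = Access.Database(File.Contents("C:\Data\MyDatabase.accdb")),
    // Navigate to the table – "Transactions" is an assumed name
    Transactions = Source{[Schema = "", Item = "Transactions"]}[Data]
in
    Transactions
```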
Hi Teacher, do all CSV files need to contain headers, or should only the first CSV file contain headers?
All contain the same headers
Thanks for the video
You’re welcome
I only have the consolidation file... how can the other files be inserted in this Diagram View?
Check out this video ua-cam.com/video/RV47yX70NN8/v-deo.html Essentially click the Get Data button, connect to your other files and Load to... Connection Only... Data Model
Please make more videos on excel
Here are 28 others 😁
ua-cam.com/play/PLlHDyf8d156Xnoph4CbOiMrqQKiJZ8mhn.html.
If you want to export the 10 MM rows as a csv after transforming the data, is there a way to do it?
Yep Export Power Query Tables to CSV using DAX Studio. Even 5 Million records!
ua-cam.com/video/op6f-3uUFYg/v-deo.html
please, DR. can u explain how to search string in big data before?! thank u. you are a great teacher!
I don't understand. Could you provide more details please?
Awesome. I would like to ask you a question: if I create the model and everything in Excel 64-bit, will my coworkers with Excel 32-bit have any problems??? Right now with Excel 32-bit and a model I'm having a problem that consumes more than 2 GB of RAM, with all the tables totalling no more than 50k rows. Thanks
I wouldn’t rely on it working well on 32 Bit, but it could be ok if they are just slicing and dicing.
Newer versions of 32 Bit Excel can now utilise 4GB Ram. 64 bit can utilise all the spare RAM on your machine
@@AccessAnalytic thank you very much for answering so quickly. It seems I can ask the IT department to change my version to 64-bit, but I was a little worried about that and about the macros I developed. Like you wrote, people will only slice and see the data. It's important that this particular tool works on both architectures. Again, thanks.
The only real way of knowing is to test it out
@@AccessAnalytic I will do it and I’ll let you know. Thanks
This is great! Thanks 😊
You’re welcome
Need help: all my CSV data is in the same format. The row headers contain a unique ID, the column headers contain dates (7 days), and the fields contain different KPIs.
After loading all the data I need to add an Average and a COUNTIF > x at the end.
I tried the average by taking the sum of the 7 dates and dividing by 7, and got the result, but when I use Power Pivot with this average data only Count works; Sum gives an error and doesn't work on this data.
I’d recommend posting screenshots and a sample file here aka.ms/excelcommunity
Awesome, awesome, Thank You Very Much :)
You're welcome Elfrid
Great, how do I make the date table?
Here you go What is a Date Table / Calendar table in Power BI / Excel
ua-cam.com/video/LfKm3ATibpE/v-deo.html
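For reference, a minimal calendar table in M looks something like this (the date range is just an example):
```
let
    StartDate = #date(2020, 1, 1),
    EndDate   = #date(2025, 12, 31),
    // One row per day between the two dates
    DayCount  = Duration.Days(EndDate - StartDate) + 1,
    Dates     = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
    AsTable   = Table.FromList(Dates, Splitter.SplitByNothing(), {"Date"}),
    Typed     = Table.TransformColumnTypes(AsTable, {{"Date", type date}}),
    AddYear   = Table.AddColumn(Typed, "Year", each Date.Year([Date]), Int64.Type),
    AddMonth  = Table.AddColumn(AddYear, "Month Name", each Date.MonthName([Date]), type text)
in
    AddMonth
```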
Hey! I need your help. I am trying to convert a large XML file into Excel; unfortunately, whenever I try to convert it I only get incomplete data in Excel, e.g. the emails are there but the name is missing, or vice versa. Could you please help me out?
Thanks in advance
I’d recommend posting the issue with screenshots and sample data here aka.ms/excelcommunity
May I ask your computer specs (CPU, RAM) for handling Excel files above 250 MB?
Currently using Dell XPS 32GB RAM i7-10875H CPU @ 2.30GHz (that video was done with Surface Book 16 GB RAM)
Hi, when I click on 3D Maps and drag the city/county/date over, nothing shows up. Why?
I don’t know sorry. Country first then city second?
Really awesome !!
Is there an easy way to extract data in an Excel data model into a CSV? Thank you!
How about this? ua-cam.com/video/op6f-3uUFYg/v-deo.html
Can I do the same to combine up to 50 million rows?
With 64Bit Office and enough RAM on your computer there is no limit
The more columns you load and the more unique the rows are then the more computing power is needed
How can I get the data for the Calendar, Location and CostCenter tables?
Hi, you can get a Calendar and other datasets here accessanalytic.com.au/free-excel-stuff/free-excel-templates/
I don't have the Location and CostCenter tables available sorry.
Eish!?!? Excellent presentation, goodness!
Thanks ( I think ) Willy 😀
does it have to be CSV? I have a large excel sheet that I need to do this for.
It can be Excel. The refresh will take a little longer.
That was awesome.
Thanks :)
Great !!! thank you
You’re welcome
Is there a way to make the loading of data faster? I am currently handling like 15 millions of data, every week I'm adding like millions of data as well until the end of the year. I'm moving to PowerBI because of this issue, but I wonder if there is a trick on this.
Sounds like you need to move to storing your data in a SQL database. A shorter term alternative if you have Power BI is to use Dataflows
this video of mine may be useful
ua-cam.com/video/g4oZ0pOpn-4/v-deo.html
If I use Mac, can I do this from my Excel in Mac?
I don't believe Power Pivot exists on Mac. Power Query is also limited there.
Hi, thanks to how can export this data file to Power BI or any other tools?
You could publish the entire file to Power BI,
or put the Power Query code into Power BI Desktop and export using DAX Studio,
or copy the M code to a Dataflow.
There's no direct way to export Power Query code currently.
fantastic video thanks!
You're welcome Simon
What laptop / specs are you working with?
That was on my 16GB RAM, 64-bit Office machine. This one with 24 million rows was 32GB, Core i7: www.linkedin.com/posts/wynhopkins_excel-datamodel-activity-6902207760197390336-n2Lq
My problem is increasing the row limit of Microsoft 365 from 1,048,576 to 100,000,000. This video did not help me. Any suggestions?
Not possible to increase the rows on the sheet grid. Why do you need that many rows in the grid?
Awesome...
Cheers 😃
❓Why do errors happen when loading? And what can we do about them?
Lots of reasons: different column headings, errors in the data. You need to work your way through the Power Query applied steps. When consolidating multiple files it can be difficult to work out which file has the issue.
So I was using mobile numbers; when I loaded them into the Data Model and then a Pivot, the 10 digit numbers were getting shortened to 5 digits
Is it loaded as the Text data type in Power Query? Which field (rows/columns/values) are you putting it in in the Pivot?
Awesome!
Thanks 😀
Hey, after I press OK to add Connection Only, it gives an error after a while: The refresh operation failed because the source database or the table does not exist, or because you do not have access to the source
More Details:
OLE DB or ODBC error: The connection could not be refreshed. This problem could occur if the connection was copied from another workbook or the workbook was created in a newer version of Excel..
An error occurred while processing the 'Actuals' table.
The current operation was cancelled because another operation in the transaction failed.
Hi @Haasan Tariq, I think I've seen that error when the source data (Excel file) has N/A# or REF# type errors in it. Are you connecting to an Excel file on your network?
Amazing!
Thanks....
You’re welcome