Not only is this incredibly useful, but it is also wonderfully explained. Thanks, man!
My pleasure. Thank you very much.
Same here... but your reaction while deleting the URLs in the table!!
You, sir, ARE A LEGEND. Searched for ages to accomplish what this allows me and came up empty. When I landed on you, I was home free. Thank YOU!!!
Great to hear, Kurt. Happy to help.
This video tutorial is now about 16 months old, but it's going to help me in pulling in the needed data from my UberEats Deliveries. I realized I needed to automate the data to help maximize my taxes, and the Power Query as a function to bring in multiple pages is really what I need. So thank you very much. I look forward to getting it implemented, so that I can link the Excel files in MS Access for further data analysis.
Excellent! Thank you.
Searched for this all over . This has been the best example I've seen and very well explained. Thank you very much
Glad it was helpful! Thank you 😊
You've become one of my top instructors and YouTube channels with only a single video! Thanks a million!
Thank you Isparoz.
I can't thank you enough for this detailed video (step by step). It's 3 years old but still valid and valuable. God bless you!
Glad it helped! Thank you 💓
I can't thank you enough, Computergaga! My work project just became a breeze.
You're very welcome.
I am also REALLY happy as it helped me with the following use case.
I have addresses I want to scrape off the web. They are nicely in a table, but there is a hierarchy, as such: umbrella company -> daughter companies -> branch addresses. I got the URLs from a sitemap and needed to load everything at once... and this helped me to do exactly that. Hopefully it helps someone else too!
Awesome! Thank you for leaving that comment.
I never see videos with this many views that have 0 dislikes, but after trying this solution, I understand why. Great video!
Thank you very much.
You made my day, right on time, when I was looking for an immediate resolution. Thanks!
Awesome video! Seriously, days of searching, and then you explained everything I needed in 12 minutes!
To fix "Expression.Error": if you're pulling a lot of dates or websites, and maybe some of them don't exist (or don't exist yet), you'll get this error and it'll mess up everything.
Solution: at 9:20 in the video, simply right-click your FetchMovie-equivalent column header and select Remove Errors.
Excellent Dusty.
Thank you! Thank you! Thank you!
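For anyone applying that Remove Errors fix in the Advanced Editor rather than through the right-click menu, the equivalent M step looks roughly like this. This is a sketch: "FetchMovie" is the video's example column name, and the previous step name is assumed, so substitute the names from your own query.

```powerquery
// After invoking the custom function, drop any rows where the
// fetch returned an error (e.g. a page or date that doesn't exist).
// "FetchMovie" and the prior step name are the video's examples.
#"Removed Errors" = Table.RemoveRowsWithErrors(#"Invoked Custom Function", {"FetchMovie"})
```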
This is gold, man, thank you. I was able to fetch currencies from the last 7 years from the xe site, and they love me now. I was also able to get data from steamship lines and so on. Thank you so much, Manz. God bless you.
You're very welcome. Nice work Kenneth.
@Computergaga Can you show us how to do the same thing but with a PDF file? Meaning, the source would be the folder where the PDFs are located.
This was my first fx table. Thank you so much. Please follow this video up with a 5min time intelligence table video if you want. I'm going to watch your other videos.
Dude, these steps still work even now and are easy to understand. Thanks, man!
Glad it helped!
This Video is still the best when it comes to Excel Web Scraping! For me it was a breakthrough. Many thanks!
You're very welcome! Thank you for your comments Johann.
Hello.
I had difficulty in self-study because there were not many Korean books that taught me about Power Query editing or parameters.
Thanks to your video, I have integrated multiple APIs into a single query.
I clicked Like and Subscribe.
Have a nice day!!
I'm overwhelmed by the incredible level of clarity this tutorial displays! Kudos
Wow, thanks!
I couldn't be more grateful to find this channel. Dear Computergaga, you explain everything so clearly and understandably :) A real life-saver!
Thank you very much 😊 That's great!
You saved me several hours of manual operational work
Great! 👍
Excellent video. I used it to download 150 tabs of info on the WSJ website with all public companies listed in USA and it worked like a charm. Thanks a lot.
Awesome! Thanks for sharing.
Thank you so much. I was able to compile 28k records... each page had only 100 records. It was so useful and so well explained.
Brilliant! Glad it helped!
I normally don't leave comments but just hit the "like" (or not) button. But this video honestly, is very well made and helped me a lot. Thank you Computergaga.
You're very welcome. Thank you for the comment Luigi.
Thank you very much, just spent hours trying to get this to work on other sites.
No worries, Paul. Very glad to help.
Thank you! Very useful and it changes the structure of my work a lot. More efficient.
Great to hear. Thank you.
Wow, excellent information through simple teaching... even a layman can understand this module... it creates more interest to learn... Great work... from a non-computer-literate man from Coimbatore, Tamil Nadu, South India... slowly learning computers.
So nice of you. Thank you Niranjan.
YouTubers should look to this to see how to do it properly..... excellent.
Thank you, Don.
This is so easy that it seems impossible. I've tried so many solutions from the internet that were not even close to what you've demonstrated. Huge credit! Thx!
You're welcome Szymon.
I am grateful for this... I had a dataset of 40,000+ rows across 82 pages, and it worked perfectly. Thank you for sharing the knowledge. I'm gonna try this in Power BI too and see if it works. Thanks again!
How much time does it take to fetch the data?
Cool tutorial! Thank you. Bombastic ! I have learned how to make a function out of a procedure!
Excellent! No worries, buddy.
Excellent & smart stuff, saving tons of work. Grateful for sharing !!!
Very welcome! Thank you.
Love it ! Thank you so much! With the new Web Connector, this became so much better!
Oh yes! 👍Thank you. You're welcome.
WOW. It works perfectly. It is exactly my need. Such a perfect example. You save me hours and hours of a boring task. Many thanks
Great to hear. Thank you 👍
This is great, thank you!
What if the URL doesn't have a unique identifier for the different pages? I'm trying to download data from more than 50 pages but every time I go to the next page and see a different spreadsheet, it has the same URL. There is no equivalent to the year numbers in your URL.
I searched all over the internet and finally found your video. It is exactly what I need. I can't thank you enough for this well-explained and detailed video. However, when I try to create my own thing, I run into an issue when there are pages with no data: the queries stop working from that date. I hope you can help me with the issue. Many thanks in advance.
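One way to deal with pages that have no data is to wrap the web call inside the custom function in try ... otherwise, so an empty or missing page yields an empty table instead of an error that halts the remaining queries. A minimal sketch in M; the function shape and the assumption that the data is the first table on the page are illustrative, not taken from the video:

```powerquery
// fxFetchPage: returns the first table found on the page,
// or an empty table when the page is missing or holds no data.
(url as text) as table =>
    try
        Web.Page(Web.Contents(url)){0}[Data]
    otherwise
        #table({}, {})
```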
Thank you so very much... beautifully explained. Highly underrated video. Just brilliant.
Thank you 😊
Many thanks for your great help ❤️❤️❤️❤️❤️❤️❤️
You're welcome 😊
WOW!! I know this video is old.. But thank you for making it. I never would've figured this out on my own 😅
You're very welcome! Glad that it helped.
Awesome tutorial breaking down this complex process in easy to follow steps along with a good example. Thank you sir.
Worked like a charm!! No tutorial has ever worked for me in 1-go!! Thanks!
Happy to help, Gaurav. Excellent!
Thank you! Took me way too long to find this gem
You're welcome.
One of the best tutorials. Does the job in matter of minutes.
Thank you, Chirag.
This is super useful. I won't remember it, but I will come back when I need it again.
Thank you!
What a piece of knowledge...very useful for me..thank you very much
You are most welcome. Glad that it was useful.
I'm not one to write comments very often, but I have to say that this video is so useful and interesting, yet so well explained, that I had to.
Thank you 👍 I appreciate your comments, very much.
Thanks, this solved a data-input problem I was having.
Very well explained.
Thank you. Glad to be able to help.
Thank you so much! You really saved me here (And taught me how to use a great and powerful tool)!
Awesome! Good to hear.
This was a huge help! Needed to download over 5000 rows of data. You saved me so much time. Thank you!
Awesome to hear. Happy to help Brock.
Easily one of the best videos on YouTube.
Thank you very much 😊
One of the best tutorials I've ever seen! Thanks a lot!
You're welcome. Thank you.
Thank you so much from Vietnam.
You're very welcome.
You guys are awesome. This video is what I needed, and it completely blew my mind. I got quite confused regarding the URLs table and the URL column, but I handled it and applied it exactly as you did to the example that concerned me. Big THX!
Perfectly explained! Why would anyone dislike this!?
Thank you Adr 😀
I have never seen a like-to-dislike ratio like this on any video. Well done, sir.
Thank you so much Afzal 😀
Very clearly explained. Very good presentation.
Thank you 🙂
I wish to thank you so so so much for your video that helped me A LOT.
You're welcome, Antoine. Thank you.
Thanks, you helped me finish an important part of my Business Intelligence internship project.
Happy to help 👍
Life saver.... Can't thank you enough... Been using this for months....
You're welcome, Serkan 👍
Mahalo! You solved my whole problem just a few minutes into looking into how this works.
👍
Many thanks, Mr Computergaga - awesome!
Thank you so much, Ian.
I was desperately in need of this, and your video incredibly came out to me. Besides, your presentation is perfect... thank you.
You're very welcome 😊
Very helpful video! Useful technique, and explained in an easy to understand way.
Thank you, Bill.
You have a great teaching style
Thank you, Viktor.
This tutorial is so clear and easy to understand. I have a problem, though: the resulting table only shows 10 rows of data (because the website's default is to show 10 rows, but we can change it to show all). How can we fetch all the data?
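When a site pages its table like that, the "show all" setting is often just a query-string parameter you can bake into the URL you give Power Query. This is only a sketch, and the parameter name ("pageSize") is a pure guess: click "show all" on the site and copy whatever actually appears in the address bar.

```powerquery
// Hypothetical example: request a larger page size via the URL.
// The real parameter name ("pageSize", "rows", "per_page", ...)
// depends on the website - check the URL after choosing "show all".
Source = Web.Page(Web.Contents("https://example.com/data?pageSize=1000"))
```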
Thanks, Alan. You are awesome!
Thank you 😊 you're welcome
This is incredibly helpful and useful. Really helped me solve a 2 year headache. Thank you so much. Appreciate the detailed guide.
You're very welcome!
Thanks so much for your excellent tutorial!!! It really helped get the job done!!! 👍👍👍
Hitting Subscribe after watching your 1st video... ultimate... the way you explain...
👍
Hi, I can only say thank you!! You're an amazing teacher. Cheers.
Thank you! 😃
This is an excellent video. It helped me overcome a major hurdle today. Thank You!
Great to hear. Thank you, James 😊
This is a great tutorial with a very clear explanation.
Thank you very much.
Hi, I know this video is a few years old, but hopefully you occasionally check in for questions/comments. Btw, thank you for this tutorial; it has made my Excel life so much easier.
I was trying to create a new worksheet using over 300 URLs, and I'm wondering if that is just too many because it always gets stuck about a third of the way through loading the rows.
Any ideas why?
"Man should think; machines should work." You have done an excellent job. Hats off!
Thank you Narayan.
Thanks. This was very useful. And ditto for all the below positive comments.
Thank you 👍
This is an excellent tutorial - clear and easy to follow, thank you! It's saved me a ton of time.
You're welcome, Matthew. Thank you.
Thank you for this video.. Saved me so much time
Great to hear! Thank you.
Good one, bro. I got some manual work automated now... thanks very much.
Great 👍
You are a super Hero, Thank you so very much.
You're welcome! Thank you 😊
Great video and great explanation !!! Congratulations
Thank you, Xavier.
Brilliant solution, these are great.
Thank you, Jonny. Happy to help.
Helped me a lot mate. Thanks a lot.
No problem 👍
This was a great video. Thanks so much for the tutorial
You are so welcome!
Excellent !!!! This is what I was looking for. Very well explained step by step.
Thank you, Dhananjay
This tutorial is pure Gold! Thanks! Looking forward to more videos from you. :)
Thank you very much Subanta.
Thank you so much for such an incredibly informative, easy-to-follow, and helpful tutorial. That was absolutely brilliant for helping me challenge my council tax band increase. Liked and subscribed.
Glad it was helpful, Darren.
Incredible video. I have been trying to learn how to do this for ages now, and every other video I've seen isn't nearly as helpful.
Great! Happy to help, Nathan.
This is brilliant! I love how you go through the steps; it seems like you're commentating on a football match! :) Don't get me wrong, it's really well explained! Thanks for that.
Thanks! 😃
Fantabulous... great explanation, and it worked on the first go... thanks a ton.
Most welcome, Hari 👍 Thank you.
You have saved me a huge amount of time - thank you so much!
You're very welcome.
I feel like crying this is amazing
Awesome! Nice crying I hope. Thank you.
AMAZING...WOOOOOOW.... THANKS SO MUCH SIR
Sir, I need more columns in Excel, more than 1 decillion. Any tips? This may sound crazy, but I do need it.
You're welcome. Thank you.
Look into connecting to an external source using Power Query. Excel cannot provide more rows or columns on the sheet than its fixed grid of 1,048,576 rows by 16,384 columns.
That is sick! ;) I've never seen anybody add text to Power BI code like that before. It looks very advanced and powerful. I really love it. It opens eyes. ;)
Thank you, Enrike 😊
Great stuff! thanks a lot mate. made my work a lot easier :)
No problem 👍
Thank you Computergaga. This was a superb explanation and tutorial.
You are welcome, Robert.
Thank you, sir, you are a precious gem 😍
You're welcome 😊
This was a great video. I think the main use case for this functionality is to automate web scraping of changeable data such as stocks, commodities, and prices on goods and services. Otherwise, one could simply cut and paste static data.
Thank you.
This was so useful! The only problem I had was that when I began adding large amounts of data, I got the error: [Expression.Error] There weren't enough elements in the enumeration to complete the operation. What should I do?
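That enumeration error usually means a step indexed into something that came back empty, for example Source{0} when Web.Page found no tables on one of the pages. A hedged sketch of a guard, assuming the video's pattern of taking the first table on the page:

```powerquery
// Inside the custom function, check that Web.Page actually
// returned at least one table before indexing into the first one.
(url as text) as table =>
    let
        Source = Web.Page(Web.Contents(url)),
        Result = if Table.RowCount(Source) > 0
                 then Source{0}[Data]
                 else #table({}, {})
    in
        Result
```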
really useful, thanks for your help. nice explanation
You are welcome, Maurizio.
FANTASTIC vid. Thank you!!!
I'm scanning for zip codes, and the address in one of the URLs isn't valid, so I get an error. How do I tell the function to ignore it and continue?
Brilliant! Excellent teaching skills; very clear and understandable.
Would it be possible to do a video showing how to get data from a username-and-password-protected website, please?
ohh!!! simple and powerful!
👍