I HAVE NEVER SEEN ANYTHING LIKE THIS IN MY LIFE. Sure, I have seen screen scraping back in the old CRT days, but this is UNREAL and it's easy ONCE you know the language!! Excellent Video!
Thank you for this video! I was trying to scrape data from a website and couldn't figure it out until I came across this video.
This tutorial is so useful and simple, contains no bs, and is full of content. You are a champ.
Wow! You're a MASTER of scraping and Google Sheets! Just learned so much from 2 of your videos
Thanks! I’m glad to hear it!
@@dataslice Thanks to you! The only problem is scraping the image URL from Craigslist in your example; I added /@src but it doesn't work
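In case it helps anyone else: the @src usually needs to hang off an img element in the path, so something like this might work, assuming the image is actually in the page's static HTML (the cell reference is just an example):
=IMPORTXML(B2, "//img/@src")
Craigslist builds a lot of its gallery thumbnails with JavaScript after the page loads, though, so IMPORTXML may still come back empty there.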
More Google Sheets tutorials please. Thanks a bunch! 😍
this channel is gold. Amazing tutorials
Thank you, I appreciate it!
Is it possible to get the sector from the Google Finance / Yahoo Finance page for a stock? I tried but it's showing me an error..
Thanks, great help for any website :D
You da real mvp
Thanks! :-)
best tutorial ive seen, thank you
Superb
finally I found it, absolutely amazing, thank you a lot!
U solved my long-standing problem - thanks a lot!!
I'm encountering an issue at 4:09 with //p[@class='result-info'], as I get #N/A as a result.
The class name on Craigslist hasn't changed, so I can't figure out why this isn't working for me like it does for you.
Thanks for your help and your videos.
EDIT:
#2 On the TechCrunch website, I'm not able to click the "XPath" button. It's not working at all. Other websites are fine though. Do you have any idea why?
#3 On the Yelp website, the result for the first example in Sheets is CSS code, far from what you get, even though I'm doing the exact same thing.
Your video isn't that old, so I really can't figure out why things work so differently. I re-watched your video many times to see if I'm missing something, but no.... ;(
Excellent video. Great content!!
Simply amazing. Thanks for such wonderful video tutorials
wow, what an extension! Killer! Love it
This video is so good it's basically a cheatsheet.
Thanks so much for the value! For your Yelp example, how would you go about trying to keep a well-managed and orderly scrape of all the items across all page numbers over time? Including trying to remove duplicates as each item moves across the different pages?
Thanks!
Super useful. You saved my day!
Top-notch tutorial, thanks a lot :D
Thanks man you really helped me out here!
thumbs up for the video, really useful and well explained.
Earned a subscriber great info, clear and concise!
Thank you!
Thank you for your help
Absolutely amazing !!!!
Great video. I use the IMPORTXML function in Google Sheets, and sometimes it works and other times (without changing anything) it gives me #N/A in the cell. What can I do? Thank you very much
If nothing is changing, I’m not sure what the issue would be unless there’s an error getting data from the site. What site is it?
@@dataslice I did it through a script and it works. I was told that it was probably the speed of the network. Thanks a lot again
amazing..thanks
Great info, earned a sub 🙌
Thank you, sir, for these useful tricks
Thanks for watching!
Can I do the same on a password-protected site?
Amazing content
If the page gets updated, will the info on the sheet get updated as well?
Really good video :D
Hello
Thank you very much for this excellent video, it is very helpful
Just a question: if I need to scrape the image URL of the product, is there a way to do it?
Thank you
Hey, I have the same question. Have you found a solution?
This is so helpful! Is it possible to use this method to get the links on the page?
Excellent Content!
Great tutorial! When I scraped data from a website the data was only scraped until a certain point, even though more yellow containers were highlighted. What is the issue here, does the scraping stop after a certain number of lines?
Victor, I'm having the same problem. @dataslice can you comment?
Hello, thanks for the awesome tutorial. However, how do you do this with a webpage you have to log in to in order to get the table info?
nicely done...
thank you , useful
Great content!!
I'm trying to scrape an Amazon list of Item Names & Prices but it will only return a list of 10 of the items... 🤷♂️
I still get #N/A??? It worked for Craigslist but not for other sites I tried, like supermarkets?????
Same, for prices on sites like Walmart
Hey mate! Great tute. Any idea how to get the info beyond a "More" button using these methods?
Taking the Craigslist one for example: if you wanted to see the top 300 results, and they were behind a "More" button that loaded onto the current page and not on a "page=2" type thing.
Hey Brad, unfortunately if you want to do any kind of UI interaction on the page, you'll need to use a different web scraping method--something like the Chrome web scraper extension or the Selenium library in R or Python.
Tell me the extension you're using to select all the links at once
The IMPORTXML function is not working in Google Sheets. It shows #N/A when trying to import the data.
Can you suggest a solution?
Great content! How can I convert the info from text to numbers? (e.g. a price list)
Thanks! Maybe try the Format > Number tab for formatting an entire column
@@dataslice I tried, but it's impossible
@@juanmaguevara That's very odd, I'm able to format my scraped columns and am trying to think of why it wouldn't work for you. Maybe the scraped text data contains non numeric values and Sheets is unable to format it? I'm not too sure
Maybe I'm too late, and maybe it's a dumb answer, but in some cases adding 0 to the text works for me to convert it into numbers, if the text is just numeric
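For example, something like this, where the cell and class name are just placeholders and the scraped text has to be purely numeric (no currency symbols):
=ARRAYFORMULA(IMPORTXML(B2, "//span[@class='result-price']") + 0)
If there are stray characters like "$", stripping them first with SUBSTITUTE() and then wrapping the result in VALUE() might also do it.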
@@victorruiz804 thanks Victor!
Hey, thank you for the video!
Do you know how to get updated data?
For example, if I am importing a stock price
and I would like to import the updated data after 30 mins.
I can't think of a way other than manually refreshing the formula and cells, however, I do know that Excel supports getting data from stock tickers. You can write a ticker name in a cell, like $AAPL, then go to the Data tab to format it as a stock ticker, and then fetch a lot of different data points about the stock -- it might be easier than scraping it!
This is AWESOME! Do you know if this is possible to do with a site that requires a login?
Can we use the IMPORTXML function directly, without downloading any application or software, to scrape data from any website?
What did you do to show the XPath??? You didn't teach how to show this XPath in your video
Is it possible to have google sheets pull information from Search Engine results? For example, enter a business name, and it searches Google and pulls info for that company?
Will this update daily?
thanks
What if I want to scrape all of the images and their respective alt text or all of the h tags in order of their appearance on the page?
Hey, I have the same question. Have you found a solution?
Excellent video. Curious if you could help explain whether this is exclusive to text or if numerical data can be extracted as well? If so, could you help coach me on how to do that? I keep struggling to get anything but the text headers in a numerical data table that isn't an HTML table. Thank you!
How do I make it auto-update/refresh the result? Can I just reload the Google Sheets tab?
Is there a way to automatically change the URL? Let's say, like an item ID at the end of the URL, to make a database?
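One way this might work, assuming the item IDs sit in column A (the base URL here is made up for illustration):
=IMPORTXML("https://example.com/item/" & A2, "//h1")
Drag the formula down and each row fetches the page for the ID in that row, although Sheets does throttle how many IMPORT formulas it runs at once, so long lists can be slow.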
Love it, thanks for sharing! Do you have one on Python by chance? I saw the one on R but am curious if you do anything with Python.
I’m working on a python one now - thanks for watching!
@@dataslice Can't wait! These are awesome!!
Thank you for the quality tutorial. I'm looking for a way to scrape data from SSRS into Google Sheets. Is this possible? Thanks
You're a beast.
How do I extract data from multiple pages on Yelp, not just the first one?
Hello, I am an Amazon seller. Do you think I could use this technique to retrieve my sales history directly in a Google Sheet?
Can you please guide me on how to scrape the Skyscanner and Kayak best prices into Google Sheets?
I tried exactly the same workflow as you, but mine gives me an error. It's Craigslist with a home rental site.
How would this work (if at all) on YouTube, trying to scrape video data? Especially when it comes to tracking down the actual video ID and not the vanity URL? THANKS IN ADVANCE
Thanks for a great video on this subject! But this does not work for me. I get an "error" when I try to input the second field in this example!
Hi, how many data rows is the IMPORTXML function limited to?
Sir, you really know your stuff.
Clear explanation 👍
Questions:
1- Will the imported HTML be up-to-date data from the source website? If not, please tell us a way to keep the data live.
2- I want to scrape e-commerce website product data; how do I auto-scrape the next page?
3- How about importing data via a JSON file URL? Most e-commerce websites use it, e.g. Shopify.
I'll be thankful if you could create an e-commerce website data scraping video or share your tips so I'll give it a try 🙂
How do I scrape pictures from Craigslist? Is there a way to scrape desired data from the balance sheet on Yahoo Finance into Google Sheets?
Thank you so much for these tutorials! I think I'll use them in the future. Not now, because.. I need to import comments from Instagram, and... is there any way to do that? I guess Insta won't let Google Sheets take data from it because it's not "logged in", and.. yeah.. I would love to hear any answer for it, even if that's a no :")
Would love to hear how you would go about scraping dynamic pages that load their content through JavaScript / an API. I have a few different solutions available: Scrapy, Octoparse, Selenium (Python), Java, or somehow retrieving it directly from the API. Could I do it with GraphQL? I need the data to get fed into a cell in Google Sheets; I'd prefer not having to manually load it from a CSV. I'm okay at Sheets but not Python/Java.
Tried doing this to find rental units but just kept getting an error, sadly.
Can you import the images?
How do you use this function to scrape hyperlinks in the website?
How about collecting data from a website with basic auth into a spreadsheet?
Can we use importxml to extract photos to Google Sheets? If so, what is the process?
Is this data updated automatically?
How do I import tables that are filled with API data?
How come this only works for certain websites? E.g. when I try to do this on a real estate website or a supermarket website, I always get the error #N/A.
I was trying to export data from a scopus.com webpage
Thanks for this great tutorial. When I try to use it on a realtor listing, the Google Sheets result is "#N/A". What did I do wrong? Thanks
Which site are you trying to scrape? Websites where the data is loaded dynamically sometimes don't cooperate with Google Sheets / other webscrapers and you may need a different approach
Same problem... I'm trying to scrape youtube.com. I watched this video ua-cam.com/video/pwZ44kAeiOo/v-deo.html&t where he scrapes YouTube with no effort, but right now it seems it's not working anymore...
How can I get the URL link?
Can I use more than 1 URL on a single sheet?
Yep, you'd just need to make a new formula with the new URL in a different cell
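For example, with URLs in A2 and A3 (the cells and XPath here are just illustrative), each formula lives in its own cell and pulls from its own page:
=IMPORTXML(A2, "//p[@class='result-info']")
=IMPORTXML(A3, "//p[@class='result-info']")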
Thank you ❤💯
- The ultimate goal is to create a Google Sheet with a link to feed DataFeedWatch in order to create a product catalog for Facebook ads..
Is there a way to import the anchor tag instead of the URL when using //a/@href?
Are you trying to import the text between the <a>...</a> tags?
@@dataslice Yes. Here is the element:
<a href="...">Aldersgate United Methodist Church</a>
When using @href to import, it imports the hyperlink. Is there a way to import the anchor tag? Thanks
@@dataslice Figured it out, was using the wrong element. Thanks
@@peterhansen1351 How did you do it? I have been trying to import similar text too
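In case it helps, a rough sketch of the likely difference (B2 holding the page URL is just an example): =IMPORTXML(B2, "//a/@href") returns the link URLs, while =IMPORTXML(B2, "//a") returns the text between the <a>...</a> tags, so dropping the /@href part should give the anchor text instead.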
Do you know if it's possible to tell Google Sheets to scrape data from a specific location? I tried using those commands, but it was sending me data from the United States, while the page updates automatically depending on the country you're accessing it from, though the URL remains the same.
How would you do this with links
I just tried this to scrape the Google Play Store and failed. Is it possible to scrape Google Play Store reviews? Please help
How do you scrape data from a website that is behind a paywall?
Why is this failing with YouTube links?
YouTube dynamically generates content on the page with JavaScript -- in other words, the page essentially loads blank and then the content is populated after the fact. Google Sheets (and other static web scrapers) can only scrape the page if the content is there on the initial page request, but it's unfortunately not able to if the content is generated afterwards.
I'm having a hard time scraping data from skybox. hopefully this helps
Hi there, can you help me with how to collect data from 'BURSA', such as stock prices and so on? I already tried all the methods but it did not work
The =IMPORTXML(B2,B3) isn't working for me, the numbers just go grey. Any way to fix this??
Thanks for the awesome video! But how do I find the right XPath for YouTube? I tried the SelectorGadget extension, but it gives me an error:
"Imported XML content cannot be parsed", or the error
"Imported content is empty". Only the "//a" XPath works for me...
Hey man, having trouble scraping Yahoo Finance onto a spreadsheet, can you help?
Can you make a video about importing data from FB Messenger into R? I tried SelectorGadget but it didn't work. Thank you for those amazing tricks
Facebook actually lets you export and download your messenger data, I’d recommend trying that!
@@dataslice yes, but the file is in JSON or HTML format, and I don't know how to convert them into CSV
SelectorGadget doesn't have an icon to click to activate after I installed it on Chrome. Is there a Firefox equivalent?
Also, how would you recommend scraping home data from Redfin/Zillow? I would like to paste in links and automatically fill in home data row by row for different homes. For the sq ft, for example, I tried using //div[@class='info-block sqft'] but it doesn't work (shows #N/A)
Mine worked fine..
How do I get data into a Google Sheet from a website after login?
Regular DevTools has right-click on the element > Copy > Copy XPath
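That copied path can usually be pasted straight into IMPORTXML; just note that DevTools wraps attribute values in double quotes, which clash with the formula's own quotes, so swap them for single quotes. A sketch with a made-up path:
=IMPORTXML(A1, "//*[@id='content']/div[2]/p")
Copied absolute paths are also brittle, so a shorter hand-written XPath tends to survive page changes better.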