Thinking about learning SEO, but I agree with the idea here
These LLM systems will need high-quality/authority content to serve in their responses, and focusing on ranking for that will be important in the future.
Yes, exactly. As a side note, it'll be interesting to see how they judge high-quality/authority!
@ExposureNinja Yes, that's a great question.
Probably shouldn't be something it could generate without citing you.
Astonishing content man 🔥🔥❤
You're too kind!
Always value your content.
Thanks David!
"should you eat yellow snow" 😆
🍧
It’s interesting to see that good SEO leads to good GEO. But I wonder what happens when people keep relying on GEO: does the feedback loop break?
How do you mean, sorry?
What do you mean?
@ExposureNinja What I mean is that a lot of GEO is driven by indicators of good SEO: E-E-A-T, clicks, etc. If people rely on GEO, will those SEO indicators go away?
@AvantiMusicNJ The indicators will be shared; they're one and the same.
Considering just how often AI is wrong, or even totally just makes stuff up, I'm staggered at how many people are using it as a search engine replacement.
It's the convenience. Most chatbot searches offer either more depth or a concise response, which is what people are looking for. Plus, they're just looking for info, which Google often hides below **many** ads.
It's not as wrong as you think it is. Feeding it garbage information is what causes it to spit out garbage. Sure, the occasional flat-out lie with made-up sources does happen, but how is that any different from humans, organizations, or media giants doing the same?
That's an oversimplification. It's not a replacement; it's a complementary approach. Like a fork is not a replacement for a spoon.