Your metaphors are next level.
here's your big mac.
Aaaah I see what you did there...
🎈 🏠 🎈
"you are not a karen"
Ben's got 99 problems, but a girlfriend ain't one.
Bruh! He says it with a straight face
@@phantomKE I know right? He just leveled up with these jokes!
🤣🤣
jokes aside, he's not a bad looking dude at all, getting jacked wouldn't hurt tho, as long as he doesn't convert into a life coach and talk about it non-stop like that other coder YouTuber, John Sonmez.
He has a girlfriend in Canada.
I avoid client-side rendering in order to save CPU cycles for cryptocurrency mining.
hahaha
hilarious comment! but crypto mining is an inefficient revenue source on the client's computer; see The Pirate Bay (TPB) experiment.
@@TechdubberStudios It may pay less than ads, but it's many times better. I support websites that responsibly use cryptomining, and I block ads. Please, don't say that ads are better. They have never been any good to anybody's web browsing experience.
Oh, and you can use cryptomining along with Arc, another earning method that does not involve ads.
I'm done with Google's creepy trackers. Cryptocurrency mining is the future.
@@ezshroom I am genuinely 100% with you on the crypto movement. I hate ads. Always have. But there are at least two or three big corporations that come to mind that were built on the ads business model; with crypto mining, I can't find one. And browser crypto mining is not exactly a new technology. I really want it to replace ads. I really do. I hate the pop-ups, spying, and tracking that's going on. And the first corpo that comes to mind to adopt the crypto model would be Netflix, because users stay on Netflix and binge-watch for hours and hours!
@@ezshroom also, do you happen to know any website/forum/subreddit focusing on browser-based mining? I would really like to join and dig deeper into this subject.
solutions:
0) pre-rendering with parcel or webpack
1) server side rendering
your solutions are not client-side rendering, though; he mentioned that in the video.
I avoided server-side rendering a meta tag by registering a sub-domain, doing the server-side rendering there, and making my app only compatible with a set number of user-agents. Brilliant!
Fantastic job explaining this! As always, the hilarious dry humor and "next level" metaphors help drive home points and keep things entertaining. Really helped clear up a bunch of stuff and get me pointed in the right direction. Many thanks!
I love the tint on your glasses, it's serial killer-ish, where can i get a pair like those?
a package arrives at your door after the 3rd kill
@@bawad respect
They are the left-behinds after each kill. That's the way you get it.
@@bawad quick scope no scopes?
Those tints are wiped off blood from killing
Solution: NextJS, Angular Universal, Nuxt, etc.
Also check out the create-exact-app npm package (that's exact, not react). Like NextJS but with an Express-forward design: full control on the server side of what's going on.
@@jpsimons Just FYI, next.js also gives you full server side control. You can just run next as a library within an express server. In my experience, it's super ergonomic while preserving the state-of-the-art benefits of next (code splitting, automatic static optimization, incremental static generation, etc.). Having said that, I have not yet checked out create-exact-app, and am not sure how it differs from nextjs.
Why do I not like the sound of Angular Universal?
@@angshu7589 because you are not a carrot farmer. Although the color of your profile picture kinda resembles a carrot :D
@Adithya R Svelte Sapper is still in early development. I love Svelte, but Sapper is still far from production-ready
Love the joke about girlfriend and client side rendering at the beginning
I love how Ben roasts Angular devs. I thought of that carrot farmer line off and on all day and cracked up every time.
For your sake and ours, I hope you DON'T get a girlfriend too soon.
I guess u never heard of prerender.io
been using it for years
Lol, just killed the whole argument. Never heard of it before. Just goes to show that tech is exponential. Wonder if it will cause the cosmic crash eventually.
yep yep yep , you just commented before me
@@ayushkhanduri2384 same case, I just searched "prerender" before adding a comment about it, to check if it was already mentioned, and here it was.
🤯
There is a workaround: just add a conditional tag in the small server that builds your page. You can still use client-side rendering for everything except the meta tags.
Of course -- I followed and understood all the way to the end. This is because I'm an unemployed ex-Tech Lead [who has never worked at a FAANG company], and a thousandaire.
Great video! I use Laravel on the server side to serve up everything: static HTML pages, React apps, or a combo of both. It's easy to embed a React app within a .blade template file. Meanwhile, Laravel takes care of everything else, like API services, user registration and authentication, etc. Best of both worlds.
A solution to your problem could be to build a single-page application, with each endpoint for the app being pre-rendered.
It's basically Jamstack. Once a user loads one page, the others do not need to be loaded.
the girlfriend problem might be solved if you stop walking around wearing asexual flag shirts
hahaha
lmao, good catch, respect
He's just playing hard to get. Karen gets it.
But with this if he ever gets one, she will be the right one. Lol
He does check a lot of aesthetic boxes from the virgin meme... though I probably do too 😆
7:20 Would it be possible to use the same URL but check the header value on, for example, the nginx server?
Like: is this user agent a bot (Twitter, FB, etc.)? => proxy to your slim API that returns only the metadata.
And if it's a real user (Mac, Windows, Chrome, Firefox user agent, etc.) => proxy to your real page / default response / SSR page.
Maybe I'm forgetting something; I don't know if this could work.
I had to do that once. I used a Lambda function since the site was hosted on AWS; the function intercepts the CloudFront distribution request and updates the HTML if the request comes from a robot, adding the OpenGraph tags.
Sapper + svelte gives you the best of both worlds
This was the best explanation video I've seen on the matter... Kudos to you Mister...
Sorry I'm new here, and I've noticed that Ben clearly hates Angular.
Can someone give a quick background please???
Very useful, thank you for pointing to react-snap. Happy Hacking Ben 🙌🏻
I was just watching one of your videos on react native animation earlier xD
Keep up the good job 🔥
"It's like I spent a bunch of time building a house and now I want that house to fly." LMAO
this is actually the first time I understood the difference between client side and server side rendering
Had this issue with a MeteorJS website running ReactJS on the client side. What we did was create a crawler (I think there is an npm project for this) that would go to every page, render it, and save the result in our DB. When a non-human (Google) accessed the site, it would be served this rendered HTML, making it SEO friendly. Basically, our server would use the User-Agent to decide what type of content the user would get served. Hope this helps
Can you please explain what you mean by 'render it and save it on DB'? Do you mean render the DOM elements, attach them to the HTML, store it somewhere, and add that link to the DB? Or what exactly are you storing in the DB? If that's the case, wouldn't it be too much for dynamic pages like /rob/photos and /sam/photos to all be stored in the DB, or am I missing something?
If you manage the web server, you could use the web server's router to do the same exact hack you described without the need for a different subdomain, just a route that checks the user-agent of the client and returns different HTML based on it.
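For anyone wanting to try this, here's a minimal sketch of that user-agent branch in Express (the bot regex, paths, and meta values are illustrative assumptions, not from the video):

```js
// Sketch: one route, two responses, picked by User-Agent.
const express = require('express');
const path = require('path');

const app = express();
// Illustrative list of crawler user agents (extend as needed).
const BOTS = /facebookexternalhit|twitterbot|slackbot|linkedinbot/i;

app.use(express.static('build', { index: false })); // JS/CSS assets as usual

app.get('*', (req, res) => {
  if (BOTS.test(req.get('user-agent') || '')) {
    // Crawlers get a tiny HTML shell containing only the meta tags.
    res.send(`<!doctype html><html><head>
      <meta property="og:title" content="My page" />
      <meta property="og:description" content="Preview text" />
    </head><body></body></html>`);
  } else {
    // Real users get the normal client-side-rendered app shell.
    res.sendFile(path.join(__dirname, 'build', 'index.html'));
  }
});

app.listen(3000);
```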
Nice GatsbyJS colorway on that shirt 🤙
I think your points on pre-rendering are slightly off. Not mentioning tools like Gatsby, Scully, or Gridsome is a miss as they can be used to render those dynamic routes you say cannot be pre-rendered. It's worth mentioning that JAMStack options are becoming really incredible for developers and give you the same benefits as server side rendering out of the box.
Why do you want another domain for that? The server that delivers the app's index.html could also deliver the meta response instead, based on user agent. One problem you could get in both setups (separate service or combined) is that bots may check whether the content they see differs from what a regular user sees. Whether any of them actually does this, I don't know, but I would check that possibility.
2:05 I do it this way:
The server serves a response for parsers (meta, og, schema, JSON-LD, and plain HTML content), and then the JS comes along, structures it up, and takes over routing from that point, so when you navigate you don't actually "refresh"
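What this comment describes is essentially the classic render-then-hydrate pattern. A rough sketch, assuming a React 17-style app (the commenter's actual stack may well be different):

```js
// server.js -- parsers get real HTML with meta tags already in place
import express from 'express';
import React from 'react';
import { renderToString } from 'react-dom/server';
import App from './App'; // assumed app component

const app = express();
app.get('*', (req, res) => {
  const body = renderToString(<App url={req.url} />);
  res.send(`<!doctype html><html>
    <head><title>My app</title><meta property="og:title" content="My app" /></head>
    <body><div id="root">${body}</div><script src="/client.js"></script></body>
  </html>`);
});

// client.js -- the JS "comes along", attaches handlers, and takes over routing
import { hydrate } from 'react-dom'; // React 17-style API
hydrate(<App url={window.location.pathname} />, document.getElementById('root'));
```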
I think he meant that we should use Next / Nuxt from the beginning. I used to face these problems and since then I use Nuxt for every project and never worry about these problems again
I had a similar issue. Good thing you found a better solution.
my solution for this is:
1. set up a lambda as the index
2. on the index lambda, run a headless tool (like headless Chrome) to render the page if the request is from a bot; otherwise just serve the raw JS
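A sketch of what step 2 could look like with Puppeteer (the lambda wiring is omitted; renderForBot is a made-up helper name):

```js
const puppeteer = require('puppeteer');

// Render a client-side page in headless Chrome and return the final HTML.
async function renderForBot(url) {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  // Wait for the network to go idle so the client-side render has finished.
  await page.goto(url, { waitUntil: 'networkidle0' });
  const html = await page.content(); // rendered DOM, meta tags included
  await browser.close();
  return html;
}
```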
The preview still won't work when users copy paste the link directly from the browser url bar
Gatsby also solves the React single-page problem, since we can generate all the individual HTML, CSS, and JS pages.
Why wouldn't you just serve this from the server on a different port? As in, send the result as a response when someone requests that route?
Hey I saw Wes Bos in one of his videos, he used cloud functions to generate the preview and puppeteer i guess to take a screenshot of the url
This video is 2 years old, and I don't know if it existed when you made it, but today there is a tool called Puppeteer that uses a headless Chrome browser to actually pre-render your HTML, which you can feed to bots while still feeding the client-side rendered version to humans
I had this problem once, but my focus was on crawlers. I ended up using some PHP to "render" the important bits, like title, descriptions, and links. Then the JavaScript would remove those elements and do the single-page-app business. It was back in carrot farmer code days, but I'm sure happy coders can accomplish this just as well.
Nice post. However, I am curious why these FB bots can't read a client-rendered app. I am sure these bots ultimately render the app on a web engine/browser, because they need to generate a preview of the rendered content. The static HTML will also most likely load some resources like CSS and JS over the network (unless the bot expects inlined CSS/JS), so it's not going to be an instant preview either. So maybe the limitation is not technical but functional, in that the bot is probably not willing to wait too long for the page to load (which is the case with most client-rendered apps)?
I just put my meta tags with variables like %PAGE_NAME% and %PAGE_IMAGE% and replace them later while serving the page with Express. It doesn't work with client-side routing, but it works for link previews.
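A minimal sketch of that placeholder swap in Express (lookupPageMeta is hypothetical; the placeholder names come from the comment above):

```js
const fs = require('fs');
const path = require('path');
const express = require('express');

const app = express();
// Read the built index.html once; it contains %PAGE_NAME% and %PAGE_IMAGE%.
const template = fs.readFileSync(path.join(__dirname, 'build', 'index.html'), 'utf8');

app.use(express.static('build', { index: false }));

app.get('*', (req, res) => {
  const meta = lookupPageMeta(req.path); // hypothetical: returns { name, image }
  res.send(
    template
      .replace(/%PAGE_NAME%/g, meta.name)
      .replace(/%PAGE_IMAGE%/g, meta.image)
  );
});
```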
The problem with client-side rendering is that most of the time it's used for something that doesn't need it. Most websites are mostly static.
I think you could easily do this in .NET Core. In the Startup class, in the routing configuration, you could give each route the correct meta tags. You could make this an extension and, bing bang bosh, neat tidy job done
Netlify has a free experimental feature called prerendering. For me, it works with Facebook: it parses the right meta tags automatically, with pictures too. My content comes from a backend via GraphQL and Apollo, meta is set with react-helmet, the page is handled by react-router, and it's a create-react-app project. Hope this helps. You can also do prerendering very, very easily with the react-snap package, but you need to rebuild when data changes. (PS. Thanks for your work, I really like your videos)
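For reference, setting per-page meta with react-helmet looks roughly like this; a prerenderer (Netlify's feature or react-snap) then bakes the tags into static HTML. The PostPage component and its props are illustrative:

```js
import React from 'react';
import { Helmet } from 'react-helmet';

function PostPage({ post }) {
  return (
    <>
      <Helmet>
        {/* These land in <head>; prerendering captures them for crawlers. */}
        <title>{post.title}</title>
        <meta property="og:title" content={post.title} />
        <meta property="og:image" content={post.imageUrl} />
      </Helmet>
      <article>{post.body}</article>
    </>
  );
}

export default PostPage;
```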
Nice solution. The only downside, I guess, would be that Google wants you to show the same content to their bot as to the user. But it probably doesn't matter if you don't want to index those pages.
Facebook's user agent is also used by the Facebook in-app browser.
Your solution at the end is valid. Use reverse proxy to detect the request, and forward them appropriately.
However, it's best to use SSR from the beginning if that's your intention.
Now you can make static pages for your dynamic, frequently updated pages with Next.js. How it works is that it looks at the requested page: if it was built at build time, it sends that back; if not, it builds it on the fly (at run time) and adds it to the built pages for the next request. Pretty amazing and game changing!
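That behavior comes from the fallback flag in getStaticPaths (plus revalidate for background re-builds). A sketch of a Next.js page using it; fetchPost is a hypothetical data fetch:

```js
// pages/posts/[id].js
export async function getStaticPaths() {
  return {
    paths: [{ params: { id: 'popular-post' } }], // generated at build time
    fallback: true, // any other id is built on its first request
  };
}

export async function getStaticProps({ params }) {
  const post = await fetchPost(params.id); // hypothetical fetch
  return { props: { post }, revalidate: 60 }; // optional: rebuild at most once/min
}

export default function Post({ post }) {
  if (!post) return <p>Loading…</p>; // shown while the fallback version builds
  return <h1>{post.title}</h1>;
}
```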
Great video. Trying to wrap my head around server side rendering and this video definitely helped
I've been watching your videos and yes, the quality of the content is always awesome. New subscriber!
My reaction to this video is LOL. As someone already mentioned, you could've done decent server-side routing
I just used a Node.js Express server to host the compiled create-react-app; this way you can modify the page and add meta tags if needed before serving it. Sort of a mix of server- and client-side rendering, as he said.
Rails is the best framework for this. Right now I build all my Rails views with React, but with Rails server-rendered meta tags.
This helped me a lot! I am working on a project and my backend was almost finished. I was using create-react-app with a router but switched over to Next.js! Thanks a lot
You make my day better
has there been a followup to this? video/repo?
I use EJS, which allows variables to be passed in before sending the HTML to the client, so you can change the values in the meta tags.
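A small sketch of the EJS approach (the route and getPost are made up):

```js
const express = require('express');
const app = express();

app.set('view engine', 'ejs'); // templates live in ./views

app.get('/post/:id', async (req, res) => {
  const post = await getPost(req.params.id); // hypothetical lookup
  // views/post.ejs would contain e.g.
  //   <meta property="og:title" content="<%= title %>">
  res.render('post', { title: post.title, image: post.imageUrl });
});
```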
In your example of the subdomain, I think that's not needed / you don't need to redirect. If you are using a server-side program, it can decide to return the React app or the preview on the normal domain, using the same user agent logic or an IP address list of known Facebook servers.
You ever tried NextJS? It does SSR for the initial request (b/c it could be a bot), but CSR when you click links.
This channel is slowly becoming one of my favorites on YouTube! 😄
It works! I do exactly that with my react web SPAs. I use firebase and cloud functions to detect user agents and serve SSR version on the fly to robots and CSR version to users. This is also important to SEO indexing, cause some robots won't run any JS and expect html-only responses. Really enjoy your videos.
What about some prerender.io ?
@@leisiyox I thought about using it, but never tried it. Don't know how well it works. It would also cost more than my current firebase cloud function solution.
@@cauebahia under what conditions do you recommend using Firebase?
I thought about using it, but I seek guidance
@@leisiyox I like that they integrate lots of services in a single solution. When you create a firebase project, you instantly have access to file storage, hosting, database, authentication, and some other stuff that makes it really easy. I also like that Firestore has real-time listeners for your data. Really good for a client side rendered app. Also really like their documentation and the fact that you can easily access other Google Cloud services and API. There are many videos online about it. Check it out.
Seriously, a balloon helicopter? That was the best analogy you could come up with?
There are benefits to client-side rendering?
Btw, rather than checking the user agent, just include a client-side redirect: location="whatever.com"
Because the bots don't have JavaScript, they won't be redirected, but users will
If the only thing that needs to change is the meta tags (not the rendered bits), you can also modify the HTML before returning it to the client, inserting the relevant meta tags.
It will probably lead to performance problems, but you could also put an if condition on the request's referrer to determine whether to perform such modifications.
I wonder how you'll feel about Blazor. ASP.NET Core released Blazor to do everything in C# and replace JavaScript entirely (I just started learning about it, but it sounds like that's what they're doing; could be wrong)
Subbed for the consistent Angular claps 💀
The problem I see with the magic link is that when a user just copies the URL instead of pressing a share button to get the special share.* link, there will be no meta tags; only the special share.* link has them. And I think most users will just copy the URL they are on instead of looking for a share button
I had the same issue last week and also was thinking about moving to nextjs, but having a separate domain and server makes a lot more sense.
Well, Next.js or any other SSR solution doesn't mean you're going to use one server for both the backend and the frontend.
Kinda curious what the increased load would be if you did this on the normal domain already -> check the user agent... It would eliminate CDN usage though... so maybe not foolproof.
I think I might try react-snap. That sounds good: pre-rendering on every build. Because often the layout of a page is the same even if the content changes. What do I mean by that? Every Reddit post will have the logo, the sidebar, the footer, and a div in the middle that contains the contents of the post. So you can prerender all that with an empty div and then hydrate it. Even with user-generated content (as long as it is simple and consistent) you could prerender. Thanks for the video
o-o Anything wrong with just using Twig/Blade and using the compiled Vue version when you want stuff to update in real time?
You can use the header trick as you mentioned, and then simply use something like puppeteer to load the page on the server itself and then send the rendered HTML page to the client. If the header says it's from a normal user, then don't go to puppeteer, just return your usual index.html file.
I got this idea from here: ua-cam.com/video/lhZOFUY1weo/v-deo.html
You are hilarious and informative, my dude haha, relatable. And damn dude, the length of your link
I'm not a fan of server side unless it's to generate an HTML-based email. It just seems so cluttered when doing work. Even the emails I generate on the server side are dreadful to work with, but bearable since there aren't many.
I did this in a similar way, except the share-link page didn't check any user agent headers. It just featured a JavaScript redirect to the normal URL.
Bots don't run JavaScript, so they don't redirect and get the meta tags they need. Users do run JS, so they redirect to the main SPA. And, for good measure, a fallback clickable link shows on the share page in case the redirect is blocked.
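A sketch of that share page as an Express route; the redirect runs only in real browsers, and everything here (URLs, getPost) is illustrative:

```js
app.get('/share/:id', async (req, res) => {
  const post = await getPost(req.params.id); // hypothetical lookup
  const target = `https://example.com/post/${req.params.id}`;
  // NB: a real version should HTML-escape post.title before interpolating.
  res.send(`<!doctype html><html><head>
    <meta property="og:title" content="${post.title}" />
    <meta property="og:image" content="${post.imageUrl}" />
    <script>window.location.replace(${JSON.stringify(target)});</script>
  </head><body>
    <a href="${target}">Continue to the page</a> <!-- fallback if JS is blocked -->
  </body></html>`);
});
```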
Doesn't Netlify have an option to pre-render single-page applications for bots? (Originally for better SEO.)
I'm not sure about this because I have never used the feature, but shouldn't it also be able to solve this problem?
Hi Ben, I don't know what you are talking about, but I'm addicted to listening to you. Maybe it will start making sense someday; I am still learning React and some other frontend libraries.
Hello Ben, have you thought about the differences between react-snap and NextJS/Gatsby from an SEO perspective? I mean, is there a reason to use NextJS instead of just react-snap to get better results in search engines? Do NextJS/Gatsby do something extra to perform better in SEO? Regards
Another problem I struggle with frequently with client-side rendering is content indexing by search engines. There is a chance that Google will index JS-generated content, but it depends. Other search engines won't even give it a try.
Cool stuff about magic links. Even if you hadn't talked about that, you mentioned 🥔. Automatic upvote.
I would use the same client-side bundle, but add a little bit of logic on the static assets server to inject the meta tags into the HTML shell that embeds the client-side bundle. That way, you won't need to implement HTTP redirects, and it's probably better once you start working with deep links for a mobile app.
Been struggling for two days to understand what SSR, SSG, and CSR are, and I literally just got everything from a 10 min video about a Big Mac and ice cream
Would the Twitter/FB crawler follow a redirect? Or would it just grab the metadata and come back? In the latter case, the user agent check wouldn't be required. Also, your users are going to grab the URL from the browser bar and probably not use the share link you made specially for them
Another option would be to have a CDN, and when you get content updates, just render that single page to the CDN. Then you aren't having to re-render all the time. It would get more complex than that, of course, as you would need to re-render everything if you make a style change to your website. And it would be public, unless you pay for something to make it private and have the CDN only return a partial page for bots.
Why not just use React Helmet with the user agent trick and Rendertron?
Hey Ben,
I'm starting a new react web app for something like delivery.com and doordash.com where SEO is a major thing.
Like rich structured snippets of stars etc in google results.
Is SSR the only option?
If your content could be pre-generated (for example, blog), one of the options would be to use Gatsby.js
yeah
In that case, Gatsby or NextJS, or any other you recommend?
I don't know if Gatsby will do, but their web.dev score hits 💯 so I was thinking that, but I'm in a dilemma
Next.js
@@bawad do you know an easy way to migrate from create-react-app to next.js? I've found problems with Redux, SVG images, and the react-router active route (NavLink). Great video btw ;)))
My company has a massive CRA project that could benefit from a conversion to NextJS, but we rely so much on browser globals outside of effects
Well, a simpler but similar way would be to always server-side generate the index.html but only change the metadata, while letting the index.js generate the actual content.
This wouldn't hurt human users and is pretty light on the server (only the entry-point HTML meta tags need to be generated depending on the link; the rest of the rendering and navigation is done client-side)
I liked the idea; however, I think it is still good to have SSR for all users, or maybe for SEO as well.
Would you share your ideas on uFrontends (micro frontends), too? Are you preparing some sort of tutorial on that?
I was experimenting with client-side rendering in 2012; it would generate the page based on JSON
Hey, can you talk about Heroku, Hasura, and the like? All this cloud tooling seems like the way to go to become a one-pony full-stack show, but I would love your opinion on it, thanks!
English is not my first language, so I only understood this the second time I watched it. Thanks for the vid! I haven't built a site with link previews yet, so this is very good to know!
Just make a simple API where the client can send URLs and it will return a preview. Just make sure to cache the pages and add rate limits so they can't spam your API.
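Something like this, perhaps; a naive in-memory sketch where fetchPreview (fetch the URL and parse its og: tags) is left hypothetical:

```js
const express = require('express');
const app = express();

const cache = new Map(); // url -> preview object
const hits = new Map();  // ip -> request count in the current window
setInterval(() => hits.clear(), 60_000); // reset the rate-limit window each minute

app.get('/preview', async (req, res) => {
  const { url } = req.query;

  const count = (hits.get(req.ip) || 0) + 1;
  hits.set(req.ip, count);
  if (count > 30) return res.status(429).send('Too many requests');

  if (!cache.has(url)) {
    cache.set(url, await fetchPreview(url)); // hypothetical: fetch + parse og tags
  }
  res.json(cache.get(url));
});
```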
Huh, that's an interesting problem. You could, of course, only SSR the meta tags. It would be a nice little project to work on an ssr-meta-tag-proxy. Wonder how that would work
I think one issue is that most link-preview code nowadays is kind of stuck in a pre-single-page-application world. Apart from making a curl request to the website and getting the HTML, the preview feature should also render it and stop assuming everybody is using server-side rendering. This does sound like it gets more into the "create a Google scraper" type of situation, but if, in turn, we're talking about a client-side rendered website trying to create a preview for another client-side rendered website, all the rendering is happening on the client, so the server doesn't take much load at all.
Going through a similar issue myself. My static site is hosted on S3/CloudFront, orchestrated by Terraform. My plan is to use CloudFront Origin Response triggers to trigger a Lambda function that adds the correct Open Graph tags to the response. I think this is the lightest-weight option.
Did you do it? I'm exploring some of these ideas :)
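In case it helps anyone on the same path, a rough sketch of an origin-response Lambda@Edge handler; the bot regex and tags are placeholders, and it assumes CloudFront is configured to forward the User-Agent header:

```js
exports.handler = async (event) => {
  const { request, response } = event.Records[0].cf;
  const ua = (request.headers['user-agent'] || [{ value: '' }])[0].value;

  if (/facebookexternalhit|twitterbot/i.test(ua)) {
    // Lambda@Edge can replace (but not read) the origin's body here.
    response.body = `<!doctype html><html><head>
      <meta property="og:title" content="My page" />
      <meta property="og:image" content="https://example.com/og.png" />
    </head><body></body></html>`;
    response.bodyEncoding = 'text';
    response.headers['content-type'] = [{ key: 'Content-Type', value: 'text/html' }];
  }
  return response;
};
```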
Yo Ben, just saw this video so my comment may be irrelevant by now, but for a simpler solution, could you just wrap your existing create-react-app codebase in an Express app, then serve the CRA to actual users and generate HTML for Facebook bots etc., all from one codebase? I imagine your Express app would be super small, so it wouldn't add much weight to your existing codebase. You could then build your API into that Express app too and treat it as a monorepo.
I do see that this would completely negate the benefits of serving static sites via GH Pages or Netlify, but if you've got an active API anyway, is it a large extra load for that API to serve your static assets as well? If you're using Lambda functions, then fine, you win.
Interested to hear your thoughts. Cheers :)
I could do it that way, but I only needed it one specific thing, so I wanted to try keeping the benefits of using a CDN
Ben Awad yeah fair. Maybe worth considering if you want to make it more static/ssr’ed in future but by then you might consider the full next.js recode that you discussed in the vid.
For some weird cases like mine, where only /particularRoute needs to work like SSR, what I actually tried was hosting a Gatsby project under a particular route of a CRA project, and it worked. I just needed to handle a few re-routing cases
Woah! My self esteem skyrocketed because I managed to keep up with you until the end :D Aside from that, your content is top notch, keep it coming man.