This is why you need caching
- Published Dec 3, 2024
- Be sure to check out upstash.com/?u... if you want to set up globally enabled caching in your applications.
📘 T3 Stack Tutorial: 1017897100294....
🤖 SaaS I'm Building: www.icongenera...
💬 Discord: / discord
🔔 Newsletter: newsletter.web...
📁 GitHub: github.com/web...
📺 Twitch: / webdevcody
🤖 Website: webdevcody.com
🐦 Twitter: / webdevcody
I implemented this in my app, Cody! When I logged the times using the cache vs. without, it was at least 80 percent faster. Devs like you and Josh Tried Coding are a blessing, teaching us advanced topics to implement in our projects.
Congrats on the sponsorship!
Legit!
You do custom thumbnails now! Super interesting topic, thanks for sharing. Got a good experience with upstash as well, very handy for serverless
I usually only do thumbnails for my sponsored videos, but I may try doing them for all videos if they start improving click-through rate
Keep in mind, this always depends on what kind of data you deliver from the database.
Example 1: if you deliver a list of blog posts (where basically one new post appears per day), this is the perfect scenario! But you still have to do update hooks (for cases where a post was removed, so it doesn't ghost for another 24h inside the Redis cache).
Example 2: if you deliver messages from a chat room (which can change more than once per second, depending on how many users are in the room), caching this is useless.
Also, most modern databases already have a caching layer in place. So, IMO, for most use cases Redis is obsolete and just another thing to take care of, because it adds to project complexity.
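The update-hook idea from Example 1 above can be sketched like this. This is a minimal sketch, not anyone's actual implementation: a `Map` stands in for Redis, and the names `getPosts` and `invalidatePosts` are made up for illustration.

```javascript
// A Map stands in for Redis here; in production you'd use a Redis client.
const cache = new Map();
const POSTS_KEY = "posts";
const TTL_MS = 24 * 60 * 60 * 1000; // posts change roughly once a day

function getPosts(fetchFromDb) {
  const entry = cache.get(POSTS_KEY);
  if (entry && Date.now() - entry.storedAt < TTL_MS) {
    return entry.value; // cache hit: skip the database entirely
  }
  const posts = fetchFromDb();
  cache.set(POSTS_KEY, { value: posts, storedAt: Date.now() });
  return posts;
}

// Update hook: call this whenever a post is created or deleted,
// so a removed post doesn't "ghost" in the cache for another 24h.
function invalidatePosts() {
  cache.delete(POSTS_KEY);
}
```

The key point is that the write path (create/delete post) is responsible for dropping the cache entry, rather than waiting for the TTL to expire on its own.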
If you keep this up, you're gonna become the most important tech/eng youtuber out there. No doubt. Keep it up!
thanks I appreciate it
I love that you get straight to the point
Insanely helpful man, I'm definitely going to add Redis for my SaaS
Redis on a remote server that I need to reach out to every time my API is hit bugs me. Why?
Thank you for this great tutorial! I would like to ask how would you handle invalidation or returning the data if there's like pagination involved? Do you set the cache key as the composite of "domainId" + pagination options like limit and page? (eg: post_page1_limit25)
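One common answer to the pagination question above is exactly the composite-key idea the commenter suggests. A minimal sketch, with a made-up key shape (`posts:<domainId>:page=<n>:limit=<n>`) and a prefix-based invalidation helper over a `Map` standing in for the cache:

```javascript
// Build a composite cache key from the domain id and pagination options.
// The exact key shape is just a convention; pick one and stay consistent.
function postsCacheKey(domainId, { page = 1, limit = 25 } = {}) {
  return `posts:${domainId}:page=${page}:limit=${limit}`;
}

// To invalidate, delete every key sharing the "posts:<domainId>:" prefix,
// so all cached pages for that domain are dropped at once.
function invalidateDomain(cache, domainId) {
  const prefix = `posts:${domainId}:`;
  for (const key of cache.keys()) {
    if (key.startsWith(prefix)) cache.delete(key);
  }
}
```

With real Redis you wouldn't iterate all keys like this; you'd typically use SCAN with a match pattern, or track the page keys for each domain in a set so they can be deleted together.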
heya Cody! this is a good thumbnail for the video! keep up the good work!
thanks, I can make decent thumbnails, and I have a couple more coming soon to test out how they perform
I need to take a look at this. Thanks
What I don't understand is: if a cached value exists for the user and you return that instead, at what point will they get the latest data, if they're only ever getting the outdated cached data back?
You can set a timer to expire it. If you need the latest data for users every request then don’t use a cache
@WebDevCody ah, that sounds interesting. Thanks, that clears it up.
Hi, I need guidance on what tools and technologies are expected from a frontend (React.js) developer with 3 years of experience.
I used to have Django REST with Redis caching and our Redis was making a copy of its db to a disk. And suddenly we ran out of storage and the API didn't work anymore lol
What's the benefit of using Redis instead of hacking a custom in-memory caching solution? I am working with around 24MB max updated every hour or so.
Redis allows you to have a shared caching service, meaning you can scale to multiple web services behind a load balancer. In-memory would limit you to one machine.
@WebDevCody Meaning: if I have a single server serving the backend, in-memory is okay, but if I use edge/multiple instances of the same server, then Redis is preferred. Gotcha, thanks for the reply! ❤️
love your content... thank you
Hey Cody, are you going to continue the Mantine course platform series?
Absolutely, I just needed to get a couple videos out first
@WebDevCody Great! Will be waiting eagerly :)
I have a question: why do we need tRPC or an endpoint instead of just using the Prisma client directly in our server component? Or do we still need to have an endpoint?
You could add your database calls directly inside your react server components if you want. I’d at least add one layer of abstraction
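One way to read the "one layer of abstraction" suggestion above: keep the queries in a small data-access module instead of calling the database client inline in components, so caching or a different client can later be added in one place. A minimal sketch; the `makePostsRepo` name and the `db` method signatures are hypothetical:

```javascript
// data-access/posts.js — a thin layer between components and the db client.
// `db` is injected, so the query logic stays testable and swappable.
function makePostsRepo(db) {
  return {
    async getPublishedPosts() {
      return db.findPosts({ published: true });
    },
    async getPostById(id) {
      const post = await db.findPostById(id);
      if (!post) throw new Error(`post ${id} not found`);
      return post;
    },
  };
}
```

A server component would then call `repo.getPublishedPosts()` rather than the ORM directly, which is also the natural seam for inserting a Redis lookup later.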
Good example, thanks
Hi Cody, do you have any idea how to set the expires value on JavaScript fetch calls?
It's OK, I figured it out, but it looks like an extra call is needed, as below, which sets it to 60 secs:
    const response = await fetch(cacheEndpoint + "/expire/" + cacheType + "/60", {
      headers: {
        Authorization: "Bearer " + cacheBearer,
      },
    });
Would making a local cache manager based on a local object or Map be a good idea? Or would using Redis be the better approach for performance?
that will work, but it won't work well with serverless, since every lambda would have its own cache. If you have a single VM that hosts your application, then yeah, that might work, but Redis has a lot of built-in features that will make everything much easier as you grow
@WebDevCody thanks!
@shadowplay1211 In-app memory caches can get weird too. Usually when you query from a Redis cache and you make a change to the object returned, it doesn't affect the store, but with in-app memory stores (e.g. an object), depending on the implementation, it can actually update the object in the store. Just a heads up.
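The heads-up in the reply above can be demonstrated directly: a plain object/Map cache hands back a live reference, so mutating the result mutates the store unless you copy on read. (A Redis round-trip serializes the value, which is why it doesn't have this problem.) A minimal sketch with made-up helper names:

```javascript
const store = new Map();
store.set("user:1", { name: "Ada", role: "admin" });

// Naive read: returns the same object that lives in the store,
// so any mutation by the caller leaks back into the cache.
function getRef(key) {
  return store.get(key);
}

// Safer read: return a deep copy so callers can't mutate the cache.
// structuredClone is built into Node 17+ and modern browsers.
function getCopy(key) {
  return structuredClone(store.get(key));
}
```

The copy-on-read version trades a bit of CPU for safety; which one you want depends on whether callers ever modify the values they read.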
thank you, mate!
Will this be this much faster in production too?
yeah usually caching is something you'd add to a production system to try and improve load time of data that doesn't change often
This is a great video
Thanks React Query.
Honestly, this seems a bit of overkill. If you use Redis just to cache for 10 seconds, you already have this built into the Next framework; those are called `Segment Config Options`.
With those you can cache your API request without any 3rd-party providers and without spending more money on them :) (P.S. it will be faster, too.)
It's not like I'm against Redis, I just think the example wasn't good enough, since the edge case you presented already has built-in solutions.
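For reference, the built-in Next.js caching the commenter means looks roughly like this in the App Router. This is a sketch assuming a hypothetical comments route; `fetchCommentsFromDb` is a made-up helper:

```javascript
// app/api/comments/route.js — Next.js App Router segment config.
// Next caches this route's response and revalidates it at most
// every 10 seconds, with no Redis or other 3rd-party service involved.
export const revalidate = 10;

export async function GET() {
  const comments = await fetchCommentsFromDb(); // hypothetical db helper
  return Response.json(comments);
}
```

The trade-off discussed in the replies below is control: this expires purely on a timer, whereas a Redis key can also be deleted on demand.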
I use Redis to cache stuff that regular API router caching can't do. Like user specific data or secret stuff
using Redis gives you more control, I'd say. For example, let's say you cache these comments for 5 minutes for users, BUT you want to invalidate the cache when a new user posts a comment. Correct me if I'm wrong, but if you just let Next/Vercel cache your API request, you have to wait the full 5 minutes and cannot invalidate.
@WebDevCody I think you're right, great point! 🙏
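The extra control described in that exchange is invalidate-on-write: the code path that creates a comment also drops the cache entry. A minimal sketch with a `Map` standing in for Redis (with `@upstash/redis`, the `cache.delete` call would be a `DEL` on the key instead); the function and key names are made up:

```javascript
const COMMENTS_KEY = "comments";

async function getComments(cache, db) {
  const cached = cache.get(COMMENTS_KEY);
  if (cached) return cached; // cache hit
  const comments = await db.loadComments();
  cache.set(COMMENTS_KEY, comments);
  return comments;
}

// Posting a comment both writes to the db and drops the cache entry,
// so the very next read sees the new comment immediately, instead of
// waiting out a fixed framework cache window.
async function postComment(cache, db, comment) {
  await db.insertComment(comment);
  cache.delete(COMMENTS_KEY);
}
```

A TTL can still be kept on the key as a safety net; invalidate-on-write just means readers never have to wait for it.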
Very nice
Or you could cache the data on disk like a pleb and force everyone working for you to agree even though you just said that the table is going to have 100 rows at most and it's an app used by like 200 people a day max.
That works for a service deployed to a VM, but not on edge or serverless
@@WebDevCody Yeah, that too
I just use redis as my primary db.
This seems less like a reason to use a cache and more of a reason to find out why your query is taking so damn long.
Yes, this was a simple example, but at some point you can't improve the query anymore. I've seen big queries with many joins take seconds, and then you're left with few options for optimizing the query.
Liked the video. Still, one thing triggered me: the else wasn't needed. Since your if always returns, the else is redundant.
it'll be ok, an else statement isn't that bad
Caching is not the answer to poor database schema.
Good job babe!!!!
I don't understand why you wrote `false && cache` inside an if statement. You can directly write `if (cache)` instead, if there is a cached result.
Did you watch the video?
POV: me when i think im too smart