When You Give Programmers a HUGE Database
- Published 10 Jun 2024
- The first 500 people to use my link will receive a one month free trial of Skillshare: skl.sh/conaticus12231
Discord: / discord
Github: github.com/conaticus
Twitter: / conaticus
Join this channel to get access to perks:
/ @conaticus
0:00 Project Overview
0:19 Moving CSV to Postgres
1:23 Optimising Distance Calculations
2:16 Ranking Courses
4:11 Sending Mass Data to Frontend
4:38 C# API
In this video Rami and I work on an MVP: a university finder for college students that gives them more precise results tailored to their preferences. There is a lot of data to manage and handle efficiently, and I have documented our process of optimising it. - Science & Technology
Hi, is this project in production now? I'd like to actually search courses.
Inaccurate, no thigh-high socks.
Not sure if this was considered, but the postgres COPY command would have likely made the first part of the video much easier
or asking ChatGPT to make a seeding script
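The COPY suggestion above can be sketched like this. The table and column names are hypothetical, and the actual psycopg2 call is shown only in a comment since it needs a live Postgres server:

```python
# Sketch of bulk-loading a CSV via Postgres COPY (hypothetical table/columns).
# With a live connection you would run something like:
#   with open("universities.csv") as f:
#       cur.copy_expert(sql, f)   # psycopg2
def copy_statement(table: str, columns: list[str]) -> str:
    """Build a COPY ... FROM STDIN statement for a headered CSV."""
    cols = ", ".join(columns)
    return f"COPY {table} ({cols}) FROM STDIN WITH (FORMAT csv, HEADER)"

sql = copy_statement("courses", ["name", "university", "lat", "lon"])
print(sql)
```

COPY streams the whole file server-side in one round trip, which is why it tends to beat row-by-row INSERTs for seeding.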
i love your progression: from never having tried C#, to starting to learn it in a previous video, to rating it the best backend language, and now writing a whole API with it
Instead of Dijkstra you could use A*, it's easier to pronounce.
But you really don't need to do pathfinding, as the nooks and crannies of roads will even out, and some people would take the train, so it wouldn't be a good comparison anyway.
and then you end up downloading the Network Rail open data feeds to find the duration between the nearest train stations to
It's just deek-struh, for pronunciation's sake
@@bigerrncodes dyk-stra*
@@bigerrncodes djeek-strah
@@bigerrncodes the way non-Dutch people pronounce his name is so hilarious
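The straight-line ("roads even out") comparison suggested above can be computed with the haversine formula, which avoids pathfinding entirely. A sketch, not the video's actual code:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# London to Manchester is roughly 260 km as the crow flies
print(round(haversine_km(51.5074, -0.1278, 53.4808, -2.2426)))
```

It's a handful of trig calls per pair, so ranking thousands of universities by distance is cheap without any routing data.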
Just use the PostGIS extension for PostgreSQL to filter/sort by geolocation
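A sketch of what that PostGIS query could look like, built as a string here since running it needs a real PostGIS database; the table and the `location geography` column are hypothetical:

```python
# Hypothetical radius search using PostGIS ST_DWithin / ST_Distance.
# Assumes the table has a `location` column of type `geography`.
def within_radius_sql(table: str, lat: float, lon: float, radius_m: int) -> str:
    point = f"ST_MakePoint({lon}, {lat})::geography"
    return (
        f"SELECT * FROM {table} "
        f"WHERE ST_DWithin(location, {point}, {radius_m}) "
        f"ORDER BY ST_Distance(location, {point})"
    )

# Universities within 50 km of central London, nearest first
print(within_radius_sql("universities", 51.5074, -0.1278, 50_000))
```

With a GiST index on `location`, `ST_DWithin` can use the index instead of scanning every row, which is the main win over doing the distance maths in application code.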
Is this what qualifies as a huge database these days?
Only for the soy devs
Waiting for more! Keep up the good work, I'm curious what comes out of it.
2:10 You should pronounce it as "Daykstra", you silly Brit.
Could have loaded the data into SQLite through the CLI, cleaned it there, then moved it to Postgres. Could have even left it on SQLite since it's mostly read operations taking place on the data
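The SQLite route above can be sketched entirely with the Python standard library; the CSV contents and column names here are made up, with a blank row standing in for the dirty data to clean out:

```python
import csv, io, sqlite3

# Hypothetical CSV with a blank row that needs cleaning before loading.
raw = "name,city\nOxford,Oxford\n,\nManchester,Manchester\n"

# Drop rows with no name (the "cleaning" step)
rows = [r for r in csv.DictReader(io.StringIO(raw)) if r["name"]]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE universities (name TEXT, city TEXT)")
db.executemany("INSERT INTO universities VALUES (:name, :city)", rows)

count = db.execute("SELECT COUNT(*) FROM universities").fetchone()[0]
print(count)  # 2 — the blank row was filtered out
```

For a mostly-read workload like this, a single SQLite file is indeed often enough; Postgres earns its keep once you need concurrent writers or extensions like PostGIS.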
For a quick and dirty way to calculate the distance you could use the open-source OpenStreetMap/OSRM project
2:25 What are we caching? The data that was returned from the user's query? What would the key be?
Would we just use the request params as the cache key and cache the response to those params? Then perform the distance filtering? Haven't used Redis or any cache layer before, so just trying to figure out what is going on here.
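That is one common pattern: derive a deterministic key from the request params and cache the response body under it. A sketch (the `search:` prefix and param names are made up; the video doesn't show its actual keys):

```python
import hashlib, json

def cache_key(params: dict) -> str:
    """Deterministic cache key: same params in any order -> same key."""
    canonical = json.dumps(params, sort_keys=True)  # order-independent
    return "search:" + hashlib.sha256(canonical.encode()).hexdigest()[:16]

a = cache_key({"subject": "cs", "max_distance": 50})
b = cache_key({"max_distance": 50, "subject": "cs"})
print(a == b)  # True — param order doesn't change the key
```

In Redis you'd then do `SET key response EX ttl` and serve cache hits without touching Postgres; any param that changes the result set must be part of the key, or two different queries will collide.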
I think there is a new API for storing files on the user's hard drive now, so you could probably send more than 20MB with ease :D
If you're going to store the database in Redis, why not use persistent Redis and get rid of Postgres?
you should've asked me how to pronounce "dijkstra's algorithm" 😂 it's a dutch name. great video btw❤
😂thanks, i hope dijkstra doesn't see this video
@@conaticus i think it's impossible for him to see this.... 😂
@@ph03n1x_dev What happened to him? Is he having problems with his YouTube account?
I also once calculated geocoordinates. Brute-forced mapping the house numbers to lat and long 😂. LoL
For sending data to the frontend, why don't you consider using pagination, meaning x entries per page, with the page offset being x * page number, something like that. You don't NEED to get all results at once (that would be horrible), but you also don't need to let that limit you in showing more results.
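The offset arithmetic described above is just this; a minimal in-memory sketch, where the page size is an arbitrary choice:

```python
def page_slice(results: list, page: int, per_page: int = 20) -> list:
    """Return page `page` (0-indexed) of `results`, `per_page` items each."""
    offset = page * per_page
    return results[offset:offset + per_page]

data = list(range(45))
print(len(page_slice(data, 0)))  # 20
print(len(page_slice(data, 2)))  # 5 (last, partial page)
```

Server-side the same idea is `LIMIT per_page OFFSET page * per_page` in SQL, so each request ships only one page instead of the whole result set.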
1:14 can someone please tell if this is bad code or not? It seems alright to me
If you are using your own path-finding algorithms for mapping, you might as well write a separate front end for it and *boom*, mapping app. That is to say, you should almost certainly not take this approach, and instead leverage an existing service's API such as Google Maps'. If you really do want to go down the route of hosting and running your own mapping algorithms, you're going to need far more data and compute than what you're currently working with, not to mention hosting it at scale will wind up costing you a TON more than just leveraging an API (which will be far more accurate in any event).
What is the music from 1:27?
using Django for the backend would have saved a lot IMO
HUGE database eh Con?👀
Wouldn't it be easier to use Elasticsearch?
2:11 "daixtra" or smth like that
Why don't you live stream it?
... I love your live stream
Please tell me you're using Data Transfer objects because I didn't see them anywhere.
We're using result sets which contain the data we need; that way we don't have to use AutoMapper or map one object to another
conaticus upload (real)
Which countries does this include?
Just the UK for the moment! That's all the dataset provides for now sadly
@@conaticus Alright
Interesting
You should check out Google S2 or a similar lib for querying entities within a range on a map. It's crazy efficient because you store coordinates as a bigint instead of real latitude and longitude. There is a shitload of underlying math, but the lib pretty much abstracts you from it.
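The core idea of "coordinates as one bigint" can be illustrated with a toy Morton (interleaved-bit) code. This is not S2's actual cell-ID scheme, just a minimal sketch of quantising lat/lon and packing them into a single integer:

```python
def interleave(lat: float, lon: float, bits: int = 16) -> int:
    """Toy Morton code: quantise lat/lon to `bits` each and interleave
    their bits into one integer, so nearby points get similar codes."""
    y = int((lat + 90) / 180 * ((1 << bits) - 1))   # 0 .. 2^bits - 1
    x = int((lon + 180) / 360 * ((1 << bits) - 1))
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
    return code

print(interleave(51.5074, -0.1278))  # central London as a single int
```

Because nearby points share high-order bits, a plain B-tree index on the integer column can answer "roughly nearby" range scans; libraries like S2 add the real spherical geometry on top.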
Should I build this but for Canada 😂😂
Nice
rebuild everything in rust 😅
nice
1min out of 5min video being an ad? RAGGGHHHHH
W
Idk what to comment
Why are his lips so red
No gay