Graphics engineer here. The biggest graphics sin in the gaming industry is that it keeps choosing DirectX over Vulkan. WE ALREADY HAVE A CROSS-PLATFORM SOLUTION THAT WILL RUN ON EVERYTHING. WHY THE FCUK ARE WE NOT USING IT?? On top of that, games are built to maximize monetization, which just sours the gameplay. The gaming industry suffers from being money-hungry like everything else in life.
Agreed. I'm honestly surprised the larger players in the gaming industry haven't forced SDL + Vulkan or GL on the console manufacturers so they only have to support one set of APIs!
Unity dev here. The Vulkan backend hardly works, and when it does I get slower frame timings than with DirectX. I know it's Unity that's at fault, not the Vulkan devs. I really want to switch to Vulkan, I really do. But we are in the hands of the engine devs, sadly.
My main problem with non-remote is spending hours commuting plus hours idling just to hit the 8-hour mark. Meanwhile, there's no commute in remote, and you get to finish when you finish your tasks, not when the time runs out.
I agree 100% about the commute, but even when there isn't 8 hours of work tasks to do, I usually try to spend that time on professional growth. Whether that's watching news to keep up with changes in the tech landscape, running a dummy project or leetcode exercises, RTFM-ing or reading changelogs on some relevant tech, or preferably looking for and potentially fixing tech debt in our stack, I can usually find 8+ hours' worth of stuff to do. But it's nice to intersperse meaningful family breaks every 2-3 hours (at least with how my brain works). I'm in no way pro-office and strongly support WFH. I just think the "2 hours of work" line is a very weak argument, and I couldn't imagine spending that little time on growth-oriented tasks.
This is a very independent mentality, which might be okay, but I think hints at the problem. That time after "you" finish "your" tasks is available for collaboration, mentorship, etc
Commute is the worst: 2 hours back and forth total. I basically count it as working hours that I'm not getting paid for, so I've just begun using the commute to think and sketch out the problems I have, and I log that as working hours. It lets me justify arriving 30 minutes late and leaving 30 minutes early some days, not that it's particularly strict where I work. I haven't worked from home yet (it's basically my fourth week at this job, and it's easier to figure things out when I can ask people in person), but my god would it be so much nicer to just cut out those 2 hours for absolutely anything else, like more sleep.
Computers keep getting faster but software stays the same speed. Eventually someone's going to write a required piece of code for the modern software stack that runs inside of a Minecraft instance in redstone, and it will be running on a bank server.
Not really. What happened is that the slow, complex software became faster and replaced all of the simple, fast software. So what you have are complex applications running at normal speed. Take graphics as an example. In the early 90s, something like Blender would've run on timesharing graphics servers costing upwards of 2,000,000 USD. Rendering would've taken several days per second of film and cost hundreds of dollars in energy. Today you can render a scene in Blender in literal minutes; higher-end computers can do it in a matter of seconds. Long, multi-day renders produce scenes more complex than entire 3D animated movies from the early 2000s. A 2,000,000 USD setup could render a movie like Soul or Coco at the same speed it took a 1990s setup to render Toy Story. A modern home computer can render the entirety of Toy Story in real time, at 60 frames per second. In 1980, it took minutes to render a single frame of the Mandelbrot set. Now you can render the Mandelbrot set in real time.
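The Mandelbrot point is easy to see for yourself: the escape-time algorithm that took 1980s hardware minutes per frame is a handful of lines. A minimal sketch in Python (resolution, viewport, and iteration cap are arbitrary choices here, not anything from the video):

```python
def mandelbrot_iters(c: complex, max_iter: int = 100) -> int:
    """Return the number of iterations before z = z*z + c escapes |z| > 2,
    or max_iter if it never does (i.e. c is likely in the set)."""
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2:
            return n
        z = z * z + c
    return max_iter

def render_ascii(width: int = 60, height: int = 24, max_iter: int = 50) -> str:
    """Render the set as ASCII art over the region [-2, 1] x [-1.2i, 1.2i]."""
    rows = []
    for y in range(height):
        im = -1.2 + 2.4 * y / (height - 1)
        row = ""
        for x in range(width):
            re = -2.0 + 3.0 * x / (width - 1)
            row += "#" if mandelbrot_iters(complex(re, im), max_iter) == max_iter else "."
        rows.append(row)
    return "\n".join(rows)

if __name__ == "__main__":
    print(render_ascii())
```

Even this naive pure-Python version renders in a fraction of a second on modern hardware; real-time zooming implementations just do the same arithmetic on the GPU.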
It's so weird hearing people hate on WFH; I get so much more done at home, it's not even close. Add the commute on top, and it really is such a big QoL improvement over working exclusively at the office that I'm very glad our company has been mostly remote ever since the pandemic.
It depends on what you do. In game dev you usually have a ton of work and need very tight communication to make things well. WFH really makes this harder. If you're at an average job it doesn't really impact performance.
@@La0bouchere Ehh, I don’t see how the same doesn’t apply to other large software projects. It’s very hard to develop some legacy systems in tandem and the same goes for new development too.
The netflix comparison is specious. Paying for a service is not the same as paying for a physical thing that the company then won't let you use except in the ways they explicitly approve. If anything, it comes down to a similar argument as the right-to-repair movement. If you want to provide some sort of cloud GPU service with a proprietary, locked-down interface, go for it, but if I pay for a physical object that I now own, I should be able to use it when I want, how I want, and anything less is unethical.
One company mixing hardware and software is unethical. It's either you sell the hardware and the software is free/open or you sell the software separately, interfacing with the hardware the way any 3rd party developer would have access. The apple model is how things got so bad. Tech has probably been held back at least a century because of it, if we assume we missed out on exponential progress.
I'll say this about in-person meetings: I don't think the ideas are necessarily any better. Having immediate feedback on them builds a stronger emotional connection, and therefore they will feel much more significant than ideas you have in your home office. The added facial response and non-muted audio response further increase that as well.
@@jordixboy If that's your attitude then your opinion also kind of doesn't matter. You will always pick whatever means the least amount of work for the most amount of money. Essentially, it means you are indifferent to the effects on your job as long as it doesn't affect you, which is the reason why your opinion doesn't matter.
@@CottidaeSEA My opinion matters equally as much as yours. I never said I will do the least amount of work. Everyone has a different opinion; all matter, or none matter.
@@jordixboy Did your reading comprehension stop at first grade? There is a reason why your opinion does not matter and it is indifference. Unless you prove you are not indifferent, your opinion will always be one that is beneficial to yourself regardless of the cost to your company. That is not a valuable opinion in this. If you have an actual valid opinion, fair, but I'd still invalidate it on the basis of prior statements.
@@CottidaeSEA Is your reading comprehension low? Everybody's opinion is different, equally valuable or not. Your opinion is that; cool, I respect it. My opinion is that your opinion sucks and is not valuable at all.
My main problem with non-remote is that unless you are a contractor, no matter how efficient you are, if you finish your work in 50% of the time, your boss will just find something more for you to do to fill it in.
Yeah, fuck, the "how many of you have more microservices than customers" hits a little too close to home. It's absolutely ridiculous how slowly and bottom-up everything has been implemented in our project, and it still sucks. Like, we've developed abstractions on abstractions and we're still not even done with the core feature set...
The Pizza Hut website has a loading bar. It takes a couple of seconds to render a page for a pizza, when the combined dataset of all possible pizzas and locations likely fits in a modern CPU cache already. What the heck is it doing for two seconds?
@@MatthewSwabey I've heard that a lot of websites actually have spinners and whatnot just to show the user that the program is "working". Apparently some users get confused if things finish too quickly. If we're being honest, though, that website client full of complex math (for some important reason, I'm sure) is querying a bloated service that's integrated with five more legacy services that are slow as time lmao.
@@MatthewSwabey My local cinema's mobile app takes multiple seconds (2-5!) to figure out what cinemas exist. Bro, there are like four; you build them once a decade at most. I think you can ship an app update if there's a new one.
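The "fits in a CPU cache" intuition above is worth a back-of-envelope check. A quick sketch in Python (the store count, menu size, and record size are invented round numbers purely for scale, not real Pizza Hut figures):

```python
# Rough upper bound on a full menu/location dataset, using made-up
# round numbers chosen to be generous rather than accurate.
LOCATIONS = 20_000          # hypothetical store count
MENU_ITEMS = 500            # hypothetical distinct items per store
BYTES_PER_RECORD = 256      # name, price, description, a few flags

total_bytes = LOCATIONS * MENU_ITEMS * BYTES_PER_RECORD
total_mib = total_bytes / (1024 ** 2)
print(f"all stores: {total_mib:.0f} MiB")       # ~2441 MiB: RAM-sized, not cache-sized

# But one page render only needs one store's menu:
per_store_kib = MENU_ITEMS * BYTES_PER_RECORD / 1024
print(f"one store: {per_store_kib:.0f} KiB")    # 125 KiB: fits in L2 cache easily
```

So even if the full nationwide dataset overflows cache under generous assumptions, the working set for rendering one location's menu is a few hundred kilobytes, which makes a multi-second loading bar hard to excuse either way.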
Hot take: people care about max productivity too much. Yes in-person meetings work better but who gives a fluck. Remote has the potential to make everybody's out-of-work life better globally and that's a more important goal. We're not there yet, because cities are still 100% structured around those mfing offices but we'll get there someday.
In person for extroverts, maybe. I can't retain or process any info during an in-person meeting; I have to write everything down and then read it later. So at least for me, meetings are 100% a waste of time.
It's not productivity, imho; it's creativity that remote absolutely destroys. Even as a shy and introverted person, I find in-person game dev so much more creative, and I'd maybe even say less productive. In person, it's so much easier to bounce around ideas and gauge people's reactions and emotions during design, and at later stages when intermingling gameplay code and assets. I can only speak from smaller studio sizes, but my knowledge of the big places is similar, just less pronounced: things like gameplay code have people talking and gauging each other in quick succession in person, which brings freshness to it. As for standard SWE, yeah, in person means fuck all.
This is a sane take. I've worked remote for 11 years now, and in my observation, the ones who think remote is worse are the people who worked in an office previously. The transition is hard for them, and they want a deeper level of communication which, in my experience, doesn't achieve anything work-related. It is just more pleasant. Sure, you can go over to someone's desk and start talking, which is easier than scheduling a call in a [MESSENGER] and objectively faster, thus more productive, but there is a solution to that too: Discord-like voice channels for the duration of the work day. Hop in and start talking. Also, I've been able to work with companies all over the globe. Some of those companies were able to reduce their costs by employing me instead of local developers, if those were available at all. So, I won't be harming myself and making my life more difficult by "maximizing productivity" unless it gives me a competitive advantage somehow. I prefer to work and deliver, instead of being a good, loyal company bot who lives in the office.
Right? Some people are way too numbers-driven without putting a fair share of focus on workers. Just reasons, logic, data… Like, to some people that’s all that matters at the end of the day. Not all the external variables that influence those numbers
"Official" meetings, remote or in person, suck. Most of them end in "we need to do more research" or a similar non-conclusion. The most productive meetings were always ad hoc: someone came over to chat about a problem, someone overheard and chimed in, someone working on a wholly different issue recognized that both have a common solution, etc. 15 minutes of banter around the water cooler may be worth 3 hours of PowerPoint presentations...
I somewhat agree about the remote thing. The reality is that I love remote because it enables me to have a better work life balance, for example, I can do my laundry, walk my dog, cook myself lunch, etc. However, as a junior, sometimes it's hard to know if I'm doing things correctly. Simply because I don't get those same social cues. Luckily, I'm on a great team that has that culture of open door communication. But even then, it's annoying to have to slack someone, set up a meeting, remind them that you have a meeting with them - especially when you just have a simple question. I love remote, but def be prepared to go the extra mile when it comes to communicating to your team.
If it's a simple question, you can just ask it instead of setting up a meeting, though. There really is no difference between showing up at someone's desk vs showing up in their DMs: you ask them a question either way, and either way they answer right away or say "give me a couple of minutes to wrap this up first". Except that asking a team instead of a single person gives FAR more flexibility, since the first person who has time answers, and it even works across time zones.
@@TurtleKwitty This. 98% of my questions are asked in group channels and then if I feel like one person might have more insight I message them directly, or if we've talked about it before. If I get no answers then I get my team lead involved because stuff shouldn't go unanswered. dms and setting up a meeting are not the way to go for a 'simple' question. If it requires a meeting, then it wasn't so simple which means that you're probably doing pretty well? Having it in DMs and in a meeting also means that the next person to have to figure it out won't have the info available, since it's not in slack or in docs (because how often do we really update docs afterwards)
I don't get this. Do people just walk up to other people's desks when they're doing something? Even before pandemic times, we would check if someone was available on communicator/Skype/Teams/whatever before going to bother them. WFH is the same process, except I don't have to walk anywhere and it's just a quick Teams call that lasts less than 5 minutes.
The point he's missing here in the whole "consumer's choice" debate is that it's not about the "government deciding what phone you can buy" or whatever. It's about how Apple, Microsoft, Google, Amazon etc. are all trying to lock down their ecosystems by removing more and more points of interoperability, to the point where as a consumer I am basically forced to buy (for example) a Mac if I want my iPhone to interact with my computer. The worst thing is that this forces out any potential competitors and is really bad for consumer choice in the long run. Imagine if Apple were forced to open up their ecosystem so you could have the same experience connecting your iPhone to a Windows or even Linux system that you would have with a Mac.
I agree. His argument on that point was very weak. The purchasing power of the consumer is just not that influential. Companies do not take a purchase as some kind of vote or intention. The profit incentive of a company can and often does diverge from its own customer base all the time. Also, we don't live in a dictatorship; it's our representatives who regulate markets. And the government regulates all kinds of markets significantly for the benefit of the consumer, mainly safety. I see no reason why tech should be considered somehow more special than construction, rail, the food industry, etc. Consumers can just never be educated enough to know whether a given product is safe or even better than a different product. In such cases, government action is the consumers' only recourse. It's either that or boycott, which is the hardest of all these actions to pull off. There's just not another option in our society. And what choice am I losing in this scenario? I am an iPhone and Mac user. So the government forces Apple to open their platform. What have I lost? Is Apple just gonna call it quits and stop integrating their software and hardware across their ecosystem? I think that's kind of preposterous. It's not like Apple is about to go out of business. Clearly Apple will continue to make their products, and I will just *also* have other options that I did not have before. This is definitely a case where more competition is good. And then the conflation of government regulation with a government TAKEOVER of Apple is just beyond bonkers. Like, the government regulates virtually every industry in the country in some way. Do we really want drug companies to just release anything they want, even if it kills people? Is Apple suffering now that the government is forcing them to abandon their proprietary Lightning port for the standardized USB-C?
The government is very good at regulation actually and has specifically forced standardization into many industries before and it’s often for the best. We’ve literally never had an unregulated economy in American history.
Indeed! But then it just shows that no one is an expert in everything, and we must always be vigilant to remain open-minded and not let bias blind us into jumping to the "feel good" point of the discourse, or else you're just adopting someone else's rhetoric.
@@A_Dylan_of_No_Renown While I personally see forcing USB-C on Apple phones as better for the world... we also have to look at who makes money from the USB-C standard and who doesn't. Though I imagine the whole thing's moot, because Apple's reason for going with Lightning in the first place was an attempt to monopolize that space, like all the other areas of the tech stack it dominates. (Now to go look at how open the USB standards are and whether there are any patents still active.) Ah, the rabbit hole winds ever deeper.
@@larkohiya Lightning exists because Apple wanted a connector smaller than the 30-pin one that was reversible and more durable than micro-B. USB-C came two years after Lightning and was primarily developed by Apple and Intel for Thunderbolt 3, then proposed to the USB-IF so that they could support USB on the connector and use it to standardize on a single port for laptops. Hence why for a while MacBooks had only Type-C ports, and Intel motherboards with certain chipsets had Type-C as standard before it took off.
As much as it's the developer's job to make good software, good software is only good if users accept it. How many open source packages have all the detailed technical documentation you can think of, and never, or only vaguely, a good user's manual? How many packages have I gotten to run after whatever absurd procedure was required, only to be unable to do anything in the running software because you don't know what to do? "What is the process?" and "What is the problem this software is trying to solve?" are the two questions a developer has to keep in mind. Not just implementing stuff because there's a library for it.
@@funicon3689 But you have to have paying customers before you can start cutting corners and making the big bucks, as economies of scale always increase and then decrease.
Programmers only get so much input into whether the game is good or not. Plenty of terrible games out there with great code (and vice versa, come to that). The designers have overwhelmingly more influence on the quality of the game than the programmers do.
Well, that's more the job of the game designer. The job of a game programmer is the job of any programmer, which is to write good, predictable software. The programmer could do everything right, and the game would still suck if the designer is an idiot.
Funny thing. I am literally overseas right now meeting our remote team. There is something about the in person engagement and the mindset it brings. Ideas seem to bloom and riff in ways that are harder remote. The feeling of epiphany when things just bounce in a room when just talking about an issue with a loose agenda is something that happens rarely remote.
@@akr4s1a Because Unity only supports volumetric lighting in its HDRP renderer, which isn't designed for games. It's designed for animation, tech demos, and product promos (i.e., most cars in car commercials are actually 3D animation), and as a result has awful performance. If you made the same game in Godot or Unreal (engines that built proper volumetric lighting systems for themselves), it would probably run a lot better. Which just reinforces the video's main point that video game developers are drowning in needless complications. The fact that Zeekers needed to use a specific renderer that's not meant for games and do a bunch of hacks to make it run decently attests to this.
@@kaijuultimax9407 The thing is that the fun complication of volumetric shading in Unity ended up driving a creative aspect of the game's visual design. Having no complications is bad for art; art needs to be boxed in for creativity to happen.
I graduated in 2022 so I've only ever worked from home, and all but 1 of the peers in my team are on a different continent. Luckily most of my work isn't collaborative, but when we are developing a feature or fixing bugs together, I feel like things could move a lot quicker in person than they do over Slack and Zoom.
You'd think, but there's a lot of nonsense involved in setting up and starting/finishing IRL meetings that gets glossed over when people talk about it. Contra Prime, I've never had good/great ideas at meetings; it's always been elsewhere.
Blaming WFH is a cop out, most games are ruined at executive level by people involved in the game design who have no idea about gaming and whose primary focus is business-driven. We see it all the time, Creative Assembly recently has had a major shitshow and we found out from leaks executives have been prioritising shitting out DLCs and sequels for Total War over addressing tech debt and interfering with game design by "brand managers" overriding actual game designers to maximise short term profits. CoD wouldn't be amazing suddenly if Infinity Ward devs were working 100% in office 12h days because fundamentally C Suite has made the decision to rush MW3 and make it a copy paste of MW2 with Warzone maps as campaign maps to save on development costs. You think passionate game devs made those decisions? Grow up.
"Best form of communication is in-person." True, but it comes with tradeoffs:
- getting distracted by the boomer who doesn't keep their phone on silent
- a 2-hour daily commute
- not being able to effectively buffer/batch communication (25 minutes of work, 5 minutes of answering messages, repeat) because someone comes to my desk every 5-10 minutes and interrupts me
"Can't effectively buffer/batch communication" When I worked in person, I would basically block all communication for the first four hours of the day by scheduling a meeting alone with myself, so I could actually get things done. I also had a gentleman's agreement with my coworkers that we don't interrupt each other for those first hours so work can get done.
Depends on where you work, I guess. I'm working at a game dev startup with some friends, and I've noticed we're a lot better at communicating in person than over Discord/Slack. It's hard to stay focused when you work, relax, eat, and (for some) sleep in the same room all day. It's really easy for some to avoid work, get distracted, or a myriad of other shit compared to being in an "office" with the boys and just hammering out code. Not to mention how much of a godsend it can be to have someone else take a look at bugs. The number of times one of us has caught a bug in minutes that the code's author would have spent hours looking for has saved so much time that would otherwise have been wasted tracking it down alone.
@@michaeltokarev1946 You can share your screen and even control using many tools, eg Microsoft Teams (use the "New" one, "Work" edition, probably, and reboot when it keeps crashing).
I work remotely, Teams and Slack create more interruptions than I ever needed. Can't drop them currently because of the support requirement of my role. While in the office, people saw the headphones, and left me alone.
Face to face produces more synchronicity in micro-movements, which emotionally uplifts people and gets them into flow more easily. It's something that happens at concerts, when you get emotional in a (home) cinema close to someone, and when you look at some exciting stuff in the same room and discuss it. I read it in the book Interaction Ritual Chains and probably messed up the summary quite a bit.
I agree that in-person communication is the most clear, but it comes at a very high cost. Commuting, geographic restrictions on hiring, and in-office distractions all make employees way less productive at the office. On the other hand, there are also at-home distractions and chores that reduce productivity at home.
Efficiency is sort of irrelevant; just set pay based on end-product goals/tasks. You can circumvent worker rights by paying the minimum with most of the actual pay locked behind goals. This also means work time depends on how fast the employee is (which can obviously be positive or negative for you).
The dichotomy isn't between consumers and governments though, it's corporations vs government currently, and even that isn't really true since what we actually want is companies to be *somewhat* interoperable and "the government" doesn't want this job.
The last two days I spent configuring Unity's camera controller, Cinemachine. I had to give up. No matter how you configure it, there is always something breaking the experience. The whole time I worked on it, I felt like Jonathan Blow was breathing down my neck and judging me for it. In the end I coded my own camera controller that does exactly what I want in 30 minutes. The more I code, the more I understand what Jonathan is talking about.
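For what it's worth, the core of a hand-rolled follow camera really is small. A sketch of the usual frame-rate-independent exponential smoothing (written in Python for illustration; in Unity this would be a few lines of C# in LateUpdate, and the `smoothing` constant is an arbitrary tuning value):

```python
import math

def follow(cam_pos, target_pos, smoothing, dt):
    """Move cam_pos toward target_pos with exponential smoothing.

    `smoothing` is a decay rate per second. Using exp(-smoothing * dt)
    for the blend factor makes the motion independent of frame rate,
    unlike the naive `lerp(cam, target, 0.1)` per frame.
    """
    t = 1.0 - math.exp(-smoothing * dt)
    return tuple(c + (g - c) * t for c, g in zip(cam_pos, target_pos))

# Simulate 300 frames at 60 fps: the camera converges smoothly
# onto the target without overshooting.
pos = (0.0, 0.0, -10.0)
target = (5.0, 2.0, -10.0)
for _ in range(300):
    pos = follow(pos, target, smoothing=4.0, dt=1 / 60)
```

Everything a stock controller adds on top (dead zones, look-ahead, collision avoidance) is optional; when you only need one behavior, writing it yourself is often faster than configuring a general-purpose system.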
I've worked remote for 20+ years. Most people can't do it well in my experience. It's easier in some ways, but perhaps harder in more ways. I agree effective communication and collaboration are MUCH easier when you're in a room with someone. I love going into the office when I can. I'd like to try a combo of in-person and remote work, but haven't had that luxury yet.
The open-up comparison with Netflix is flawed. The problem with mobile is that there isn't really much of a choice; the moat required to create alternative options is indefinitely wider than that for creating another recommendation engine. Both Google and Apple provide services that are essentially system-relevant. They provide world infrastructure. That's why they need to be held to a different standard. I am not a fan of regulating companies, but for markets to work their magic we need more than two options.
Apropos simple vs hard problems - I can highly recommend any and every developer take a good look at the Cynefin sense-making framework. Understanding the difference between clear, complicated and complex problems, and which problem solving strategy lends itself to which problem space, has proven invaluable to me.
Things like the walled garden of Apple are a major problem for tech. It is textbook anti-competitive behaviour and they should get in trouble with anti-trust.
I worked two internships in school so far; one was remote and the other was in person. I will say that it was a lot easier to work iteratively with my team as an intern when we were in person. The disclaimer is that what an intern needs from their team differs from what a normal dev needs. I found it annoying and tedious to get any help from my remote team, because getting them on a call was a bit of a struggle when the issue was some domain-specific thing there isn't documentation for, which would have taken like 5 minutes in person.
4:10 that is such an important truth, after learning this after years into my career I was able to get promotions and get any position I wanted, before that I was obsessing about the small technical details management couldn't care less about.
If you're looking for just an interface to the GPU, wgpu is probably the best option. It's a 2-layer library where the backend is DX12, Metal, Vulkan, or WebGPU (in WASM), and the frontend is always WebGPU. wgpu is also what Firefox uses for its WebGPU implementation, but its WebGPU backend works on Dawn (Chrome's WebGPU implementation) as well. Basically, you have 1 API that works everywhere. You can use it from almost any language (C ABI), it compiles its shader language to the backend's shader language, and it's written in Rust. The only alternatives with that much portability are game engine abstraction layers, but many game engines are using wgpu as their own abstraction layer now.
Main problem was the shader language was changing every ten days while they were specifying it so their compiler is still *really* rough. I think they *still* won't actually tell you what expression/type was wrong in errors, just give a random internal id?
@@jaysistar2711 I might be using the error reporting wrong, or it may well have been fixed (it was a while ago), but it wasn't a case of "often" - every error message that referenced something in the source code used an indirection that Error::display couldn't resolve. "The expression [6] of type [16] is not assignable to type [43]" was my experience for the whole time I used it!
Game devs? I'm drowning as a frontend developer, working with a framework... to a framework... to a framework. And I wish I were exaggerating. I miss ye olde days of writing pure JS.
@@anj000 No one is forcing you to use the latest unstable unproven framework. Just stick with something and get good at it, even if that's PHP. Modern web development is a made up problem to make an easy problem (making a web site) seem incredibly complex.
@@tapwater424 Also, no one is forcing me to eat and have a place to live. You are right. But please note that I have a job, and having this job is predicated on working with this abstract framework.
28:30 This is a false dichotomy. It's not about government control telling companies what they have to use; it's about whether technology can impose artificial scarcity. The consumer doesn't have any control over what Apple or the market does. It's like saying that market forces can counter planned obsolescence. Imagine a wrench that can work on any nut/bolt, but has an RFID reader that only allows it to work with proprietary hardware. You would think the market would select against that, but it can't if every actor behaves that way. It sounds dumb with hardware, but software makes that kind of stupidity possible.
The one part of this equation that I'd rather the government control is internet services. The alternative is pitiful in how often it causes consumer grief. AT&T literally let a fifth of their national user data get breached and didn't tell anyone until that data was available for anyone to steal.
4:00 I saw this in another clip and a lot of the comments pointed out that people in the office do watch each others hours and pettily focus on effort put rather than results. I'd like to add on that while you can't control others and 'nobody cares' is technically wrong, you still shouldn't care. Results are more useful to you, your employer, the world and whoever else might matter than your efforts and pain are. If people insist on you working and hanging around until you are tired regardless of how effective this is, placate them if you must, tell them to go f themselves if you can, but don't let it stop you from getting results in the easiest way possible.
I had so many fights over the 9-5 schedule in my last job, I kept not respecting the clock because I cared about results not being on time for the morning coffee. I am like a wizard, A wizard is never late, nor is he early, he arrives precisely when he means to. And like Gandalf I could tank bosses and get all the XP too.
IMO the walled garden thing *IS* a false dichotomy. The options aren't just - 1. companies create their own walled gardens where people can't even make dev tools because the instruction set is hidden OR 2. government forces an ISA on everyone We do have the 3rd option that is literally being used right now for almost all CPU architectures out there. An ISA that is well documented that everyone can develop for! We need that for GPUs. We need GPU vendors to be publishing massive 100 page documents for their products just like Intel does for their CPUs. If they could even include super-in-depth stuff like the number of cycles taken for each operation then that's even better.
Every time I compare game development with web development, I realize how little value web development seems to have. The gaming industry is massive, with an absurd amount of money flowing through it, including what I've spent personally. On the other hand, besides Netflix, I can hardly think of any websites I've paid directly for. It seems like much of the web is just one big advertising platform, but how valuable is advertising really? Data collection feels like nothing more than a tool to serve more ads. What I'm getting at is whether the poor quality of web development tools reflects the lack of real money in this industry (unless you're basically an advertising company pretending to be a tech firm like Google).
The value of the web is to be the collection pool of data for our future dystopian, surveillance-ridden, AI-policed and controlled society. If people had to pay for this themselves, they might put more thought into it and alter their behaviour. It's free specifically because the outcomes will nefariously benefit forces which you don't want empowered. And I don't think there are a few master puppeteers pulling strings behind the scenes; this system has a behaviour that supersedes the sum of its parts. Buckminster Fuller's synergetics somewhat touches on these themes. We're in the early stages of machines agriculturalising humans ✌️
The global e-commerce market that the web supports is massive, we are talking trillions of USD each year. It dwarfs the global video game market. Saying web dev tools are poor due to a lack of "real money" in the web is a very strange viewpoint.
@tom_marsden You just said yourself that the web *supports* it; it is not the web itself. You're describing a sales market between people and businesses, not the websites themselves. Not to mention, basically one website owns (or rather supports) that entire market.
Given that Terry Davis's goal was to make a modern and more robust implementation of a Commodore 64, which according to a lot of bedroom-coding Gen X'ers was one of the most fun machines to develop software (especially games) for, I'd say he accomplished a ton, including the fun and simple development environment.
There is value in face to face interactions. Whiteboard meetings allow people to brainstorm and set a direction as a group. However, those days are dead because no one grabs a room anymore, it is a Zoom or Teams call. So if it takes me 45 minutes to drive 6 miles, to park 300 ft from the building, to sit in a cube and be on a Zoom or Teams call, then why do we have to be in the office? Management killed the creative vibe 4 years before the pandemic. Being remote for 2 years just solidified that just as much work can be done remotely, because the way we work never changed. Oh, and let's mention the office distractions from cube congregations, poor climate control, bad hygiene and bad lunch smells. I don't have that worry at home. For folks who hate their family or don't have one and need daily human contact, or believe that the only way to get promoted is to be in your manager's face all day; let them go into the office. Employers should offer choice for knowledge workers.
I've already seen work from home affect me: it took me 9 months of straight looking to land a job, and since I don't want to work from home while I'm still figuring this whole thing out, I can't comment on that perspective. But my colleagues working from home affect me in the sense that if I have a quick question, I'm shit out of luck, because the people I work with aren't particularly prone to using a communications platform in any meaningful capacity. If I worked with a bunch of mid-20-year-olds instead of mid-30-year-olds this wouldn't be an issue to this degree, and as my generation matures we'll bring this more rapid remote work in.

As for finishing stuff vs effort: yeah, I've had two tasks that took me 2 weeks to complete (among other tasks), and I've actually felt god-awful for delivering such simple stuff so slowly, but... Power Apps is something else, it's absolutely geriatric Microsoft bullshit.

And building up debt... tech debt, project debt, design debt, jesus christ. Let's just say the project I'm a part of now has been royally fucked by an inexperienced team, and not just inexperienced in the environment, but generally inexperienced. The codebase is all sorts of fucked, the structure is all over the place even with a design document, hell, the design itself is all over the place. If you want to do anything, you'll be sifting through 1000-3000 line files with barely any documentation and no source control, there's no ERD to visualize the database, no real documentation of any kind. And I'm sat here like "sure, I could spend the next few weeks fixing ALL of this, but I'd be delivering nothing in the meanwhile, working hard but delivering nothing." Just crazy how, as much as it'd make all of our lives easier, I'd essentially be doing work with no real end result for the end user past "well, we might be able to develop slightly quicker, possibly."
10:15, I would say it's a lack of chaos; conversations tend to be a form of chaotic order. WFH devs tend to have an organised environment to some extent, which doesn't distract the brain enough to break it out of whatever box it's in. In other words, the right amount of chaos can cause a dev to think more outside the box than they normally would.
People saying WFH is at fault are just coping after releasing a bad game. I worked from home for 2 years and absolutely kicked ass on the AAA game I worked on. People just need to not suck.
From a consumer standpoint, there's a couple of things that are invisible to us. Everyone says games have grown in complexity, but from our perspective games are simpler than ever: shallow, overly gated, on-rails "bottled" experiences. Plus it's really hard to agree when every new title is made in the same engine with as little effort put into detailing as possible (look up Batman: Arkham Asylum vs the new Suicide Squad game). Lastly, every new title also uses the latest two marketing solutions: DLSS and RTX. Games look like ass but "b-but the rays of light are cast in real time and the resolution is upscaled to 4k". When gamers see gamedevs crying for about 10 years now about how hard their work is and how infinitely complex videogames are getting, and wanting to raise prices... while the games are actually worse, yeah, people will laugh at them.
Go play games from 10 years ago; on average the games will be worse, you just remember the experience. Typically we remember the good experiences, but the game itself is likely to be worse.
The other thing about WFH is that we have developed our "office culture" for decades, with many failures along the way. But people have given WFH a real chance for a couple of years and they are "completely sure that it doesn't work". It will work, if we want it to work. People will adapt. Calling someone on Teams without a meeting scheduled will be the equivalent of knocking on someone's door, for example. It is for me ridiculous to say that WFH does not work after such a short trial. It is like saying that computers are actually useless because they are super expensive and occupy a room, and Joe down the corridor can do math operations faster than all those spinning disks. For a very long time, computers were seen as a niche to be used in certain applications like military decoding, but nobody will use a computer to do X, because we do X now in this other way, and as you can see, it is so much better and has worked forever, and I cannot imagine myself using a computer to do X.
I'm working in gamedev with Cocos Creator 3, which is open source. One thing in that engine bugged me really hard, so I went ahead and forked it, changed the implementation of some element, extended its features and used it in my game. Felt really empowering ngl. I would cry if I had to do that with something as complicated as React.
"Only thing companies should optimize for is making money" is same as saying "Paperclip AI should only optimize for making paperclips". Those who know, know.
I think the IRL vs remote argument simply boils down to the assumption that people must be self-managed when working remotely. If you're in person, you and your mind are in work mode. Or at the very least, seem-to-be-working mode. But even with a total refusal to work, you're doing it from the restraints of the office. So you're passing time in a restricted area. Versus passing time in the comforts of your own home. There's an obvious case to be made for people who just don't manage their own time very well being impacted more from remote work, but even in the case of perfect time management, the environments alone have to make some sort of impact on your willingness and desire levels to do different activities. What you can not currently do, you more easily quell the desire for. But if you're now home and that desire arises and you can get away with satisfying it, well now you're off to the races.
Jonathan Blow suffers from, what I like to call, "programmer isolation syndrome". He's basically put his whole life into his own projects, and doesn't really work with or collaborate with other people. Thekla is more or less his own thing, and the few employees he has are both unnamed and temporary. So he kind of just exists in his own bubble. Inside of that bubble is some good wisdom on programming and advice on undertaking difficult projects like games. The thing that he doesn't have inside of his bubble is compromise. Or really any good wisdom on engineering as a whole. His positions on software security are laughably outdated, and remind me of concerns that people had about ARPAnet during the Cold War. He doesn't understand that so much of engineering as a whole is compromising, either having to compromise with other people, other ideas, or even objectively bad ideas. What makes a good engineer is that ability to compromise effectively. A master engineer can take a worthless piece of junk and make it do something useful. Blow isn't really an engineer. I see him more or less as an 'artisan'. If he doesn't like something, he doesn't try to fix it, or work around it, or integrate it into something better. Instead, he starts over from scratch with something that works. While this might be good for 'artisanal' projects, like indie games or custom programming languages, it doesn't work for other kinds of projects. I think people shouldn't take Blow's words at face value. As a software programmer, you are first and foremost an engineer, not an artisan. Becoming an artisan comes after you've gotten the engineering part down.
The way you're talking, you're saying an engineer compromises on things. The thing about making compromises is that they harm the integrity and quality of the engineering you're doing. You're spot on about him being artisanal, as an artisan desires artisan-level quality in their work. Compromised software development is why you get your accounts compromised. Additionally, compromises harm the integrity of the vision, which is especially important if the vision is sensitive to small changes. > He's basically put his whole life into his own projects, and doesn't really work with or collaborate with other people. Thekla is more or less his own thing, and the few employees he has are both unnamed and temporary. Untrue. He collaborates a lot with people. If you've heard of Casey, who also goes by Molly Rocket on here, he's worked with him a lot. Additionally, you're falling into the trap of attributing everything to one man. This is like saying Konami is all Hideo Kojima, or Nintendo is all Shigeru Miyamoto, or even that Microsoft is all Bill Gates.
5:00 Yeah, had this fun twice at my first "real job" company. Took over a couple of code bases which had been "continuously developed" by "senior developers" for months without getting anywhere close to done. Ended up throwing away lots of overcomplicated code (YAGNI, overengineered for no purpose other than a certain DRYing obsession) and finished it all in a few weeks, where others had wasted man-months before.
Roblox is a pretty intuitive environment. Lua was my very first exposure to programming and now that I’m actually a developer I’d love to actually learn the language. When I compare developing Roblox games to the learning curve of Unity, Unreal, Godot etc it feels a lot more straightforward for beginners. I’m sure it has more limitations tho.
at the 4:55 mark: Maybe it's weird, but I actually kind of enjoy the fact that people really don't care how hard you work on something. That goes both ways, ya know? If you can deliver something people enjoy with a low effort to reward ratio, it can be a win-win. AND while it kind of sucks, it rewards one's ability to assess what people at large might enjoy, instead of a personal project that might not be as well received. trade-offs.
2:25 I feel personally attacked after setting up a docker compose with multiple services for monitoring a single app that sends a couple of SMSes every day for a colleague (don't worry, I'm gonna split the app into multiple micro-services 😏)
I used 8 containers just to create a multi-channel messaging "app". I had backend, frontend-serverside-render, message-bus, certificate-gen, reverse-proxy, utility-executor, static-page-server, and container-signaling-glue-logic. And those were the bare minimum; they weren't micro-services at all, just separations for technical reasons (e.g. the base "distribution" of the container being different, or the containers being different processes written in different languages).
I just built a dynamic IP updater using docker, lambda, and route53 (all custom script) so that I can access home assistant from the outside Internet. I definitely have more services than users and I regret nothing.
I think the good ideas coming from in-person meetings come from the fact that while you're telling another person what you have planned, you can far more easily work out just how good your current ideas are, and your brain may also be working to improve on those ideas, since the act of explaining an idea is like reading its recipe. With enough experience cooking various things, you can generally understand where a recipe is going and what it might look like just by looking at its ingredients and instructions. I think software/game dev is generally the same: if you explain to someone the ideas the game/program will run with, your brain will often deconstruct them into what the outcome may be, and either you or someone else can suggest an alternative that supersedes the original idea, since it's just better, and why not use the better idea after all. I dunno, I'm not in the programming phase of being a developer yet, trying, but I like to think about these things a lot.
What you just wrote is about the benefit of describing your ideas to other people, not about why in person is beneficial. It also implies 1) you have good people-reading skills and 2) the people you work with give visible responses. Even with those assumptions, it's still mostly about the quality of the people you work with, not in person vs remote.
I find ironic that Blow starts with the same old "things were simpler back then" argument. Yes, things were simpler back then because a good portion of the languages you had to use were complicated to the point only a small number of people could do those complex things. It's still complication but in a different place. It's sad when people forget to look at things from the past in the context of that time period and always default to the usual "the grass was greener" BS. And it still becomes complicated because programming is not the collaborative effort some people make you believe, but rather this group of people trying to impress themselves with their own intellect by implementing impractical solutions that don't work.
So, you're trying to say that in the past developers were competent, but now it is a beautiful egalitarian field where complexity is removed from the developers education and manifests now in the overengineered crap those egalitarian uneducated developers create? Fully agree.
@vitalyl1327 More or less. A good example was when people had their panties in a bunch when the VHDL files from AMD leaked, when the vast majority of developers don't even understand logic at the programming level, and somehow they're gonna magically understand logic at the component level.
It's more complicated because the scope of projects and requirements have risen in the past decades at a high velocity. We've built abstractions upon abstractions in order to keep up with this scope creep, so now it's much harder to reason about a system because a lot of its complexity is abstracted away from you. It's a simple scale issue, do anything easy and it's still gonna be simple to solve. Do anything hard and the complexity will make you feel dizzy and ready to reach for those abstractions that have their own set of trade offs, but they allow you to move at a much faster rate.
On the topic of productivity in a WFH setting: I'm happy to say I've worked both in 3D graphics and in web/cloud development. I haven't worked on games per se, but I was on the engine team of a big in-car navigation provider. My productivity on the engine team would skyrocket at the office. When the job required very technical or math-inclined knowledge, even writing with a pen on paper would yield better results for me in the office, because I had people to bounce ideas off or brainstorm with. The knowledge sharing in an office environment was insane, and I really needed it. In web/cloud development I didn't have this need for knowledge sharing (though I do sometimes miss being at the office for the coffees and cigarette breaks). If I code for 3 hours a day I can say it's been a long day, even though I complete everything on time. At the office I would probably be even faster, because I wouldn't stand up every half hour to look out the window or make a trip to the kitchen, and when I was done with my work I'd go outside the scope of the sprint and finish other things as well. Anyway, I feel like game programming is closer to the first scenario; the insides of the industry and the random knowledge sharing that happens on-site really boost you. I also feel like in game development the company culture is very important. Imagine working on GTA and programming the car bouncing when the player hooks up with a prostitute. Doing that at home is cool, but doing it in the office with your slightly perverted coworker standing next to you laughing his ass off is amazing.
Game devs nowadays don't give a fuck how Vulkan or OpenGL works; they don't even care how the CPU or the OS works, and I'm not convinced that's a bad thing. Most people are just fine learning how to work within their game engine, which is usually one of the "off-the-shelf" ones like Unity, Unreal or Godot.
Correct. Idgaf how Vulkan works and if a single test fails when we flip the V switch then it goes back in the drawer. You used to put 10,000 hours in to achieve mastery in game dev. Now you'll need twice that to master tweaking foliage parameters.
Honestly that's how all fields work: as the tech advances, the people who really know the nitty gritty become few and far between. As an amateur game dev, the software I currently use is... Unity, Visual Studio (C#), VSCode (JSON), Photoshop, Maya (modeling, UVs, skinning, rigging), Substance Painter, ZBrush, git, and within Unity there are skills to consider like shader and VFX creation, which could be entire programs of their own. And I'm probably missing something. I really don't need more on my plate; give me an engine that works out of the box and I'll be happy. The amount you need to know for a basic low-fidelity 3D project is insane. I really pity anyone venturing into this who didn't learn this stuff as the industry progressed; this is why asset stores are so popular.
Well, duhh. They are focused on actually making the games. A company like Dice had a whole team exclusively for developing the Frostbite engine. The game devs themselves focused on making the actual game. Frostbite is an exceedingly complex engine designed to be the workhorse for dozens of different games, so having its own team makes sense.
I go to a game programming college and I'm in my second year of three. Most people genuinely like to stay on just using engines and try to make games with them. It's fun, and there's always the hope of your game popping off. But there are still people who love learning all the low-level details. I'll be honest: in my courses about OpenGL we barely learned anything. We're basically given a framework and learn how to play around in it without really writing much OpenGL code. I know next year I have a Vulkan course, and I'll see how that goes. But most of my OpenGL and valuable C++ learning came from self-learning rather than school; I really had to go out of my way to learn some of this stuff, and was shocked that there are so many concepts we never even discussed in class. I think that's the main reason a lot of people never get into it.

Personally though I love working on everything lmao. Lately I've been getting into OpenGL a lot more as I've gotten very comfortable with Unity and Unreal. I'll try to "master" OpenGL before I get to my Vulkan course so I'm ready for it. People who want engine programming jobs will have to know this stuff by heart; most people who stick to engines will probably be gameplay programmers and similar roles, which are completely fine to do.
I understand the idea of hard work not being the goal in itself, and I agree, but I also think that results are also not a metric for success all the time. Sometimes you need people who will try very hard and fail, because someone working less hard would eventually fail too, but after many more months. Or maybe it is an indication that there needs to be a change in strategy. Negative results are also results. I think that people too often only consider positive results. Sometimes you work hard, you did the best that was possible at the time, and the feature that you created sucks. Whether it is a success or a failure depends on the context: Maybe you did a bad job completing the task, or maybe the task was wrong from the beginning. Or maybe you demonstrated the ability to work hard, and you were just given a task that was not your strong suit, but maybe this other one suits you better. Without proper context, it is hard to evaluate in absolute terms.
"How many people have more microservices than customers? ... Throw that in a monolith" Careful, you're starting to sound like DHH and the people who like Ruby on Rails for that exact reason lol.
I definitely feel the same with remote work. It's very convenient but there's always something missing that makes me engage less or zone out during the conversations compared to in person meetings.
I hate this mythology of companies always choosing what's best for the customer so much... how can you spell this out while talking about a situation where companies literally hurt progress because they're busy hoarding money?
I'm a software engineer, but not in the gaming industry. To me WFH affects work a lot because communication is very different, imho. It's not necessarily about meetings but about the fact that it makes people communicate only when there's a stronger purpose. We feel bad about calling coworkers just to share a random idea, feeling, or discovery; in an office, we just do it. It creates a much more utilitarian, less creative, less innovative ambiance. That's my experience, and I started WFH 2 years before COVID. I'm happy to now be back 2 days a week in an office to work with my colleagues IRL, cracking jokes, trolling and sharing random thoughts about what we do without feeling I'm disturbing/distracting them from their task. I can naturally see when it's a good time to talk.
My coworkers and I used to sit in Discord all day and chat, listen to music, and work. It was great but I doubt people would be willing to do that in most companies.
I wish I could've been there to watch this live, because around the 24:00 mark I could've interjected a bit of information about that game and the company behind it. Also, it's from 2010 and I never thought I'd see it again, let alone on Steam and in a primetime video.
I don't have a lot of pro game dev experience because that's not my main stuff, but the code quality and standards are way worse in the gaming industry than in normal backend or even web dev. What they don't realize is that it actually makes the devs way slower than developers with better standards and code quality.
Dev time sure, but not speed. A lot of game dev code is hacky and unreadable as shit because it's fast, e.g. fast inverse square root. Games ultimately care about code execution way more than readability. Try to code a game in "clean code" Python and tell me how that goes for you LOL.
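For anyone who hasn't seen it, the fast inverse square root trick mentioned above looks roughly like this. It's a sketch of the well-known Quake III version (written here with `memcpy` instead of the original's pointer cast to stay legal C); the magic constant plus a single Newton step make it approximate, not exact:

```c
#include <stdint.h>
#include <string.h>

/* Quake III-style fast inverse square root: a bit-level initial guess
   plus one Newton-Raphson iteration. Fast and approximate (~0.2% error),
   the classic example of performance winning over readability. */
static float q_rsqrt(float number)
{
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);     /* reinterpret the float's bits as an int */
    i = 0x5f3759df - (i >> 1);    /* the infamous magic constant */
    memcpy(&y, &i, sizeof y);     /* back to float */
    y = y * (1.5f - x2 * y * y);  /* one Newton-Raphson refinement step */
    return y;
}
```

Worth noting that on modern hardware a plain `1.0f / sqrtf(x)` (or the dedicated SSE reciprocal-sqrt instruction) is usually as fast or faster, so this is more a historical artifact of that "fast over readable" culture than current practice.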
@JathraDH you missed the point (or are you trolling?). I meant game dev time. Faster devs means more features or a higher-quality game (if they care about their job, that is).
@captainnoyaux Gamers do not want a game that comes out in 1 year and is full of bugs and runs like shit. They would much rather wait as long as it takes for the game to be done and released finished. You really sound like you aren't a gamer at all. Who is trolling here?
There are 2 big things that make it much harder to write "clean code" in game development. 1) You usually don't have a very clear idea of what the game should be when you start making it, much more so than with other software. You go through a ton of different iterations and see what does and doesn't work well; features get added and removed all the time, and the thing you're trying to build is a moving target, which makes it hard to plan out in advance. 2) On average, games care a lot more about performance, which means that when there's a tradeoff between maintainability and performance they're a lot more likely to pick performance (depending on what they're working on, of course). It's also pretty difficult to test any part of the game in a vacuum, so to speak: unit testing is very difficult because nearly everything in the game is interconnected, and a lot of the time you can't really test one part without setting everything else up too.
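One concrete example of that maintainability-vs-performance tradeoff is memory layout (the type names here are made up for illustration): the "clean" one-object-per-entity layout reads nicely, while the struct-of-arrays layout many engines prefer keeps hot loops cache-dense because the update only streams the fields it touches. A minimal sketch:

```c
#include <stddef.h>

/* "Clean" array-of-structs: each entity is one tidy object, but a
   position-only update still drags velocity and health through cache. */
typedef struct { float x, y, vx, vy; int health; } EntityAoS;

/* Struct-of-arrays: parallel arrays per field, so the hot loop below
   reads only positions and velocities, nothing else. */
typedef struct {
    float *x, *y, *vx, *vy;
    size_t count;
} EntitiesSoA;

static void update_positions(EntitiesSoA *e, float dt)
{
    for (size_t i = 0; i < e->count; ++i) {
        e->x[i] += e->vx[i] * dt;
        e->y[i] += e->vy[i] * dt;
    }
}
```

The SoA version is harder to read and refactor (five parallel arrays instead of one struct), which is exactly the kind of "unclean" choice games make on purpose.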
I second that all of my best ideas have come from grabbing coffee with a fellow engineer. You grab a napkin or a white board and start drawing lines and boxes... And then someone erases one line and says "what if..."
I like your videos, but what kind of sociopath overlays himself on top of the author of the video they're reacting to? 😂 It's like he's peeking over your shoulder!
@ThePrimeTimeagen Haha, it's all good, it just looked so funny! Maybe future streaming programs will be gpt integrated, send a frame or two across and get an overlay color and position back in json. 🙂
My experience of working remotely comes down to 2 things: 1) A feeling of isolation: this can happen more in dev because engineers are often "left to get on with things". 2) Inefficient communication: when communicating in person there is so much subconscious information (body language etc.) that is either missing or very difficult to pick up on when you communicate remotely.
12:15 You underestimate how bad some people are at communicating in person (those ain't your Netflix programmers). Text messages give them a little more time to think, and you'll actually get better results that way.
Copyright and patents only exist because of the government; they do not exist in a free market. The law is pretty explicit in saying they are for monopolization of a product, and if you ask me, copyright and patent law does not help incentivize invention.
Consumer choice isn't a thing. Same argument as "Americans are fat because they choose to be." If consumer choice were a thing we would have to suppose there are no monopolies, which isn't how capitalism works: if only a few companies monopolize a sector, they can dictate price and consumer choice, that is to say, you have the illusion of choice. And on the dichotomy: the government should work for the people but it works for capital, so "capitalist government or personal choice" isn't a dichotomy, it's the same thing, both dictated by the monopolies that control the government. Also I would much prefer a democratically elected government be given a monopoly rather than a bunch of despotic capital owners with interests diametrically opposed to those of 99% of the population.
That "more microservices than customers" point is quite interesting. There's this game that blew up, Helldivers 2. The first one had a peak player count of 10k; the second got up to 400k in a week. They didn't design their backend to be THAT scalable, and now they're fighting to keep it working. I can't wait for some kind of blog post from Arrowhead about it. They are hardcoding a cap on the total player count, because they can't just "buy more servers". Also, the game is so good that even if people have to wait 10-15 minutes to get in, they still do, and as soon as they increase the cap by another 100k, it gets filled in an hour. Defeated by your own success :D
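A hard player cap like the one described boils down to a tiny admission gate in front of the session service. The names and shape here are purely illustrative, not Arrowhead's actual code:

```c
#include <stdbool.h>

/* Illustrative hard cap on concurrent players: admit until full,
   then send newcomers to the login queue instead. */
typedef struct {
    long online;  /* current concurrent players */
    long cap;     /* hardcoded ceiling, raised as backend capacity grows */
} AdmissionGate;

static bool try_admit(AdmissionGate *g)
{
    if (g->online >= g->cap)
        return false;  /* full: caller parks the player in the wait queue */
    g->online++;
    return true;
}

static void on_disconnect(AdmissionGate *g)
{
    if (g->online > 0)
        g->online--;
}
```

In a real backend that counter would live in shared state and need atomic updates across servers, but the shape of the fix, a cap plus a queue, is the same.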
First off, obvious bot account with bot upvotes; so annoying that people can push whatever BS narrative they want these days and gain visibility by using bots. But since you are now the top comment, I'll address your point. JS does not scale with modern app expectations. It's only viable if your "app" is just HTML with a little JS sprinkled in, and not very dynamic; otherwise it turns into a mess of spaghetti, or you end up creating your own framework. And yes, we can fantasize about going back to that, but it's not going to happen. Honestly the only place I ever see that is in the r/webdev circle-jerk anyway, or with newbies who are intimidated by frameworks and don't understand their purpose. Start thinking like a programmer and you won't even have any issues. Yes, npm is crap, JS is crap, and the ecosystem is overwhelming, but it's really not that bad considering how much crap programmers have had to deal with for 40 years.
9:00 Games haven't gotten worse; AAA games have gotten worse. And that is because they are a purely transactional product, made from the safest checklist to guarantee economic success. You cannot make transactional pieces of art, except when the art is just a money laundering scheme. AAA games have no spark because they are not made by artists for gamers and fans; they are organized by suits who have very good studies on the maximum price microtransactions can reach before people start to revolt, according to market polls. The Witcher III and Baldur's Gate III are two examples of games made by smaller companies, with smaller budgets, but with the intent of being good games first, letting the art attract people who will pay money for that art.
But it's the same with big blockbusters. Especially now with Marvel movies, which are basically exactly the same movie with a different skin on top: different locations and characters, but exactly the same movie otherwise. When the goal is not to make a good movie but "this movie cost 300 million to produce and another 300 to promote, so it either makes 600 million or the studio goes bankrupt", then people play it safe. And safe is boring. Safe is the same thing done before, because it worked before. Safe kills the spark of imagination and innovation that is fundamental to a good piece of art.
I think Netflix and the other streaming platforms are speedrunning this. They have all had very bad shows, but every now and then they produce a "modern classic," and that only happens because there was no fear of trying new things. However, as shows have gotten more and more expensive, they are suddenly going down the movie route, playing it safer and safer.
Gaming companies went from "those nerds who play video games as adults, ahah" to "actually they make more money than films?? WE NEED MARKET ESTIMATIONS NOW!!". It is simply a change of priorities from the top down. It has nothing to do with working from home.
All of my greatest ideas, and by far my greatest work, have come while home working alone, or after everyone left the office at night. I don't even know what Prime is talking about here. I can't even fathom it. I've never had a meeting where I walked away with anything I could use, unless I was coding with a single person at a computer, bouncing ideas off each other at speed. Any group meeting? Nothing.
4:33 Your point is way more valid than you probably think; it's like maximally valid. One example of hard work that nobody properly appreciates: women who take care of the house and the kids. They work all day. They work hard. Is their work excruciatingly vital to society? YES. Are they paid? ZERO. NONE. Are they the source of all life? YEP. Why are we not paying them for it? COMPLETE AND UTTER NEGLIGENCE AND BLINDNESS.
Taking care of your own house and family is not a "job" that deserves to be paid, unless cleaning my own room and taking care of myself is a job too. That comparison is a consequence of hating simple responsibilities and housekeeping. I mean, I need to take care of myself in order to be useful to society, so "Is my work excruciatingly vital to society?" YES.
I love remote; it allowed me to have a life, which I didn't have at all before. But there is something about sharing space with other people on the same mission that helps. I still feel 400 times more productive at home, though.
I'm in enterprise web app development. I've been working fully remote for about 9 years now. It's not for everyone, but it is for me. Some people want all in office, I think most will be good with some manner of hybrid. I've worked at remote first companies before, and you really do have to attract the right people and build the right culture to make the most of the advantages it gives you. Additionally, you have to be okay with letting go of the wrong people as and when needed - and to not dilly dally about it. One chatter mentioned that in-office is necessary because some employees need to be "watched" to be productive. I'd simply counter that by saying if you have an employee like that, in any situation, then you really should work on having them _not_ be your employee. Particularly at the level of a skilled craft that SWE is considered to be.
About remote communication: it's good to have an established baseline in person before going full or mostly remote, and when remote, to put in the extra effort to be explicit.
@@skellington2000 Hardware and software don't get faster and scale better by getting simpler. Ivory towers aren't reality. And yes, the problems people solve today are genuinely harder. There was no need to serve webpages to millions of clients in the 90s. Realtime raytracing was impossible in the past, no thanks to simplicity. Transformer architectures are more complex than the simple linear regression models of the past. Etc.
They are not. Most of the problems code monkeys solve are trivial, but code monkeys cannot think, so they make them orders of magnitude harder than they should have been. Code monkeys cannot build things from scratch; they combine dozens of third-party dependencies and then get lost tackling the emergent complexity of the Frankenstein monster they created out of a pile of necrotic pieces.
@@thomassynths lol, you're funny. Raytracing is algorithmically *simpler* than the traditional rendering. Modern unified shader GPU architectures are simpler than the old specialised pipelines GPUs. You clearly have no idea of what you're talking about.
@@vitalyl1327 I do know a bit of what I’m talking about. I wrote CAD software for a decade and wrote GPU mesh emulation for Apple’s Metal shading language.
Prime, it would be beneficial to learn a bit about VAX and DEC. OpenVMS was groundbreaking for distributed systems. A datacenter was destroyed on 9/11 and the cluster stayed up.
Graphics engineer here. The biggest graphics sin in the gaming industry is that they keep choosing DirectX over Vulkan. WE ALREADY HAVE A CROSS-PLATFORM SOLUTION THAT WILL RUN ON EVERYTHING. WHY THE FCUK ARE WE NOT USING IT?? But on top of that, the games are built to maximize monetization, which just sours the gameplay. The gaming industry suffers from being money-hungry like everything else in life.
Agreed, I'm honestly surprised the larger players in the gaming industry haven't forced SDL + Vulkan or GL on the console manufacturers so they only have to support one set of APIs!
Well, you simply can't build everything on a platform that favors AMD; it would be bad for Nvidia shareholders.
DirectX docs are far better than Vulkan's, unfortunately.
Unity dev here. The Vulkan backend hardly works, and when it does I get slower frame timings than with DirectX. I know it's Unity that's at fault, not the Vulkan devs. I really want to switch to Vulkan, I really do. But we are in the hands of the engine devs, sadly.
Vulkan doesn't work on my GTS 450
my main problem with non-remote is spending hours commuting + hours not doing stuff to get to the 8 hour mark. Meanwhile there's no commute in remote and you get to finish when you finish your tasks, not when the time runs out
I agree 100% with the commute but, even when there isn’t 8 hours of work tasks to do I usually try to spend that time in growth development. Whether that’s watching news to keep up with changes in tech landscape, running a dummy project / leetcode exercises, or RTFM or reading changelogs on some relevant tech, or preferably looking for and potentially fixing tech debt in our stack; I can usually find 8+ hours worth of stuff to do.
But it’s nice to intersperse meaningful family breaks every 2-3 hours (at least with how my brain works).
I’m in no way pro-office and strongly support wfh. I just think the 2 hours of work is a very weak argument and couldn’t imagine spending that little time on growth oriented tasks.
This is a very independent mentality, which might be okay, but I think hints at the problem. That time after "you" finish "your" tasks is available for collaboration, mentorship, etc
The commute is the worst: 2 hours back and forth total. I basically count it as working hours that I'm not getting paid for, so I've just begun using the commute to think and sketch out the problems I have, and I log it as working hours. That lets me justify arriving 30 minutes late and leaving 30 minutes early some days, not that it's particularly strict where I work.
Haven't worked from home yet, basically my fourth week at this job and it's easier to figure things out when I can ask people in person, but my god would it be so much nicer to just cut out that 2 hours for absolutely anything else, like more sleep.
@@chadyways8750 "easier to figure things out when I can ask people" so do that, WFH doesn't magically mean you can't ask people things XD
That would be very nice, but it's not true. If the boss knows you're fast, you're just going to get more work.
computers keep getting faster but software stays the same speed
eventually someone's going to write a required piece of code for the modern software stack that runs inside of a minecraft instance in redstone and it will be running on a bank server
Read that first line in Matthew McConaughey's voice.
alright alright aaaalright
>:(
With this treasure I summon-
Have you not seen From Scratch's building a 32-bit computer in Terraria? ua-cam.com/video/zXPiqk0-zDY/v-deo.html
Not really.
What happened is that the slow, complex software became faster, and replaced all of the simple fast software. So what you have are complex applications running at normal speed.
Take graphics as an example. In the early 90s, something like Blender would've been run on timesharing graphics servers costing upwards of 2,000,000 USD. Rendering would've taken several days per second of film and cost hundreds of dollars in energy usage.
Today you can render a scene in Blender in literal minutes. Higher end computers can do it in a matter of seconds. Long, multi-day renders will render scenes that are more complex than entire 3d animated movies from the early 2000s.
A 2,000,000 USD setup could render a movie like Soul or Coco at the same speed it took for a 1990s setup to render Toy Story. A modern home computer can render the entirety of Toy Story in real time, and at 60 frames per second.
In 1980, it took minutes to render a single frame of the Mandelbrot set. Now you can render it in real time.
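The escape-time algorithm behind a Mandelbrot render really is tiny, which is part of why it runs in real time now: a handful of multiplies per pixel. A minimal ASCII sketch in Python (the resolution, viewport, and iteration cap here are arbitrary picks, not anything from the thread):

```python
def mandelbrot(width=40, height=20, max_iter=50):
    """ASCII Mandelbrot via the classic escape-time algorithm."""
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point c in the complex plane
            # covering roughly [-2.5, 1.0] x [-1.25, 1.25].
            c = complex(-2.5 + 3.5 * x / width, -1.25 + 2.5 * y / height)
            z = 0j
            for _ in range(max_iter):
                z = z * z + c
                if abs(z) > 2:       # escaped: c is outside the set
                    row.append(" ")
                    break
            else:
                row.append("#")      # stayed bounded: c is (likely) inside
        rows.append("".join(row))
    return "\n".join(rows)

print(mandelbrot())
```

Even this naive, unvectorized version finishes instantly on any modern machine; the 1980-vs-now gap is pure hardware speedup on the same arithmetic.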
It's so weird hearing people hate on WFH, I get so much more done at home it's not even close. That + commutes and it really is such a big qol improvement to working exclusively at the office that I'm very glad our company has been mostly remote ever since the pandemic.
Some of us do much better with reading and asynchronous communication. Others just can't keep track that way.
WFH also means trans people don't have to worry about co-workers possibly being weirdos about them existing in the same space.
I’m way less productive at home but if I had a long commute it would cancel that out
It depends on what you do. In game dev you usually have a ton of work and need very tight communication to make things well. WFH really makes this harder.
If you're at an average job it doesn't really impact performance.
@@La0bouchere Ehh, I don’t see how the same doesn’t apply to other large software projects. It’s very hard to develop some legacy systems in tandem and the same goes for new development too.
The netflix comparison is specious. Paying for a service is not the same as paying for a physical thing that the company then won't let you use except in the ways they explicitly approve. If anything, it comes down to a similar argument as the right-to-repair movement. If you want to provide some sort of cloud GPU service with a proprietary, locked-down interface, go for it, but if I pay for a physical object that I now own, I should be able to use it when I want, how I want, and anything less is unethical.
One company mixing hardware and software is unethical. It's either you sell the hardware and the software is free/open or you sell the software separately, interfacing with the hardware the way any 3rd party developer would have access. The apple model is how things got so bad. Tech has probably been held back at least a century because of it, if we assume we missed out on exponential progress.
I’ll say this about in person meetings. I don’t think the ideas are necessarily any better. Having immediate feedback on them makes a stronger emotional connection to them and therefore they will feel much more significant than ideas that are had in your home office. The added facial response and non-muted audio response will further increase that as well.
Only if you care about your job; I just care about the paycheck lol
@@jordixboyIf that's your attitude then your opinion also kind of doesn't matter. You will always pick whatever means the least amount of work for the most amount of money.
Essentially, it means you are indifferent to the effects on your job as long as it doesn't affect you, which is the reason why your opinion doesn't matter.
@@CottidaeSEA My opinion matters equally as much as yours. I never said I would do the least amount of work. Everyone has a different opinion; all matter, or none matter.
@@jordixboy Did your reading comprehension stop at first grade? There is a reason why your opinion does not matter and it is indifference. Unless you prove you are not indifferent, your opinion will always be one that is beneficial to yourself regardless of the cost to your company. That is not a valuable opinion in this. If you have an actual valid opinion, fair, but I'd still invalidate it on the basis of prior statements.
@@CottidaeSEA Is your reading comprehension low? Everybody's opinion is different, equally valuable or not. Your opinion is that; cool, I respect it. My opinion is that your opinion sucks and is not valuable at all.
My main problem with non-remote is that unless you are a contractor, no matter how efficient you are, if you finish your work in 50% of the time, your boss will just find something more for you to do to fill it.
Yeah fuck, the "how many of you have more microservices than customers" line hits a little too close to home. It's absolutely ridiculous how slowly and bottom-up everything in our project has been implemented, and it still sucks. We've developed abstractions on abstractions and we're still not even done with the core feature set...
The Pizza Hut website has a loading bar. It takes a couple of seconds to render a page for a pizza, when the combined dataset of all possible pizzas and locations likely fits in a modern CPU cache already. What the heck is it doing for two seconds?
@@MatthewSwabey I’ve heard that a lot of websites actually have spinners and what not to show the user that the program is ”working”. Apparently some users get confused if things just work quickly.
If we’re being honest though, that website client that’s full of complex math (for some important reason I’m sure) is querying a bloated service that’s integrated with five more legacy services that are slow as time lmao.
@@MatthewSwabey Mining crypto with your CPU.
@@MatthewSwabey my local cinema's mobile app takes multiple seconds (2-5!) to figure out what cinemas exist. Bro, there's like four; you build them once a decade at most. I think you can ship an app update if there's a new one.
Hot take: people care about max productivity too much.
Yes in-person meetings work better but who gives a fluck.
Remote has the potential to make everybody's out-of-work life better globally and that's a more important goal.
We're not there yet, because cities are still 100% structured around those mfing offices but we'll get there someday.
In person works for extroverts, maybe. I can't retain or process any info during an in-person meeting; I have to write everything down and read it later. So at least for me, meetings are 100% a waste of time.
It's not productivity imho, it's creativity that remote just absolutely destroys. Even as a shy and introverted person in person game dev is so much more creative and I'd maybe even say less productive. In person is so much easier to bounce around ideas and gauge people's reactions/emotions to things during design and at later stages when intermingling gameplay code and assets.
I can only speak from smaller studio sizes but my knowledge of the big places is still similar just not as much, but again things like gameplay code has people talking and gauging each other in quick succession in person that brings freshness to it. As for standard SWE, yeah in person means fuck all.
This is a sane take.
I've worked remote for 11 years now, and in my observation, the ones who think remote is worse are the people who worked in an office previously. The transition is hard for them, and they want a deeper level of communication which, in my experience, does not achieve anything work-related. It is just more pleasant.
Sure, you can go over to someone's desk and start talking, which is easier than scheduling a call in a [MESSENGER] and objectively faster, thus more productive. But there is a solution to that too: Discord-like voice channels kept open for the duration of the work day. Hop in and start talking.
Also, I’ve been able to work with companies all over the globe. Some of those companies were able to reduce their costs by employing me instead of local developers, if those were available at all.
So, I won’t be harming myself and making my life more difficult by “maximizing productivity” unless it gives me a competitive advantage somehow. I prefer to work and deliver, instead of being good loyal company bot who lives in the office.
Right? Some people are way too numbers-driven without putting a fair share of focus on workers. Just reasons, logic, data… Like, to some people that’s all that matters at the end of the day. Not all the external variables that influence those numbers
"Official" meetings, remote or in person, suck. Most of them end in "we need to do more research" or a similar non-conclusion. The most productive meetings were always ad hoc: someone came over to chat about a problem, someone overheard and chimed in, someone working on a wholly different issue recognized that both had a common solution, etc.
15 minutes banter around water cooler may be worth 3 hours of power point presentations...
I somewhat agree about the remote thing. The reality is that I love remote because it enables me to have a better work life balance, for example, I can do my laundry, walk my dog, cook myself lunch, etc. However, as a junior, sometimes it's hard to know if I'm doing things correctly. Simply because I don't get those same social cues. Luckily, I'm on a great team that has that culture of open door communication. But even then, it's annoying to have to slack someone, set up a meeting, remind them that you have a meeting with them - especially when you just have a simple question.
I love remote, but def be prepared to go the extra mile when it comes to communicating to your team.
If it's a simple question you can just ask the question instead of setting up a meeting, though. There really is no difference between showing up at someone's desk and showing up in their PMs: you ask your question, and they either answer right away or say "give me a couple minutes to wrap this thing up first." Except that asking a team channel instead of a single person gives FAR more flexibility: whoever has time first answers, and it even works across time zones.
@@TurtleKwitty This. 98% of my questions are asked in group channels and then if I feel like one person might have more insight I message them directly, or if we've talked about it before.
If I get no answers then I get my team lead involved because stuff shouldn't go unanswered.
DMs and scheduled meetings are not the way to go for a 'simple' question.
If it requires a meeting, then it wasn't so simple which means that you're probably doing pretty well?
Having it in DMs or in a meeting also means the next person who has to figure it out won't have the info available, since it's not in a shared channel or in the docs (because how often do we really update docs afterwards?).
I don't get this. Do people just walk up to other people's desks while they're doing something? Even before pandemic times, we would check if someone was available on communicator/Skype/Teams/whatever before going to bother them. WFH is the same process, except I don't have to walk anywhere and it's just a quick Teams call that lasts less than 5 minutes.
@@wacky.racoon They do. A LOT. And they get mad if you're not instantly available, which is why they hate that they can't forcefully interrupt when you're remote.
@@TurtleKwitty That is so toxic
The point he's missing here with the whole "consumer choice" debate is that it's not about "the government deciding what phone you can buy" or whatever. It's about how Apple, Microsoft, Google, Amazon, etc. are all trying to lock down their ecosystems by removing more and more points of interoperability, to the point where as a consumer I am basically forced to buy (for example) a Mac if I want my iPhone to interact with my computer. The worst part is that this forces out any potential competitors and is really bad for consumer choice in the long run.
Imagine if Apple was forced to open up their ecosystem so you could have the same experience with connecting your iphone to a Windows or even Linux system that you would have with a mac.
I agree. His argument on that point was very weak. The purchasing power of the consumer is just not that influential. Companies do not read a purchase as some kind of vote or statement of intent. The profit incentive of a company can and often does diverge from its own customer base all the time.
Also, we don’t live in a dictatorship. It’s our representatives who regulate markets. And the government regulates all kinds of markets significantly for the benefit of the consumer, mainly safety. I see no reason why tech should be considered somehow more special than construction, rail, food industry, etc.
Consumers can just never be educated enough to know whether a given product is safe or even better than a different product. In such cases, a government action is the consumers’ only recourse. It’s either that or boycott, which is the hardest thing to pull off of all these actions. There’s just not another option in our society.
And what choice am I losing in this scenario? I am an iPhone and Mac user. So the government forces Apple to open their platform. What have I lost? Is Apple just gonna call it quits and stop integrating their software and hardware across their ecosystem? I think that’s kind of preposterous. It’s not like Apple is about to go out of business. Clearly Apple will continue to make their products, and I will just *also* have other options that I did not have before.
This is definitely a case where more competition is good.
And then the conflation of government regulation with a government TAKEOVER of Apple is just beyond bonkers. Like, the government regulates virtually every industry in the country in some way. Do we really want drug companies to just release anything they want, even if it kills people? Is Apple suffering now that the government is forcing them to abandon their proprietary Lightning port for the standardized USB-C? The government is very good at regulation actually and has specifically forced standardization into many industries before and it’s often for the best. We’ve literally never had an unregulated economy in American history.
Indeed! But it just shows that no one is an expert in everything, and we must always stay vigilant, remain open-minded, and not let bias blind us into jumping to the "feel good" point of the discourse, or else you're just adopting someone else's rhetoric.
@@A_Dylan_of_No_Renown While I personally see forcing USB-C on Apple phones as better for the world... we also have to look at who makes money from the USB-C standard and who doesn't. Though I imagine the whole thing's moot, because Apple's reason for going Lightning in the first place was an attempt to monopolize that space, like all the other areas of the tech stack it dominates. (Now to go look at how open the USB standards are and whether there are any patents still active.) Ah, the rabbit hole winds ever deeper.
@@larkohiya Lightning exists because Apple wanted a smaller connector to 30 pin that was reversible and more durable than micro B. USB C came two years after lightning and was primarily developed by Apple and Intel for Thunderbolt 3 and proposed to USB-IF so that they could support USB on the connector and use it to standardize on a single port for laptops. Hence why for a while Macbooks had only type C ports and Intel motherboards with certain chipsets had Type C as standard before it took off.
"The job as a game programmer is to make a good game."
I honestly never considered that before. It really makes you think.
As much as it's also their job to make good software, good software is only good if it's accepted by users.
How many open-source packages have all the detailed technical documentation you can think of, yet never, or only vaguely, a good user's manual?
How many packages have I gotten to run, after whatever absurd procedure was required, only to be unable to do anything with the running software because I didn't know what to do?
"What is the process?" and "What is the problem this software is trying to solve?" are the 2 questions a developer has to keep in mind, not just implementing stuff because there is a library for it.
i thought it was to make as much money for the company as possible
@@funicon3689 But you have to have paying customers before you can start cutting corners and making the big bucks, since economies of scale always rise before they fall.
Programmers only get so much input into whether the game is good or not. Plenty of terrible games out there with great code (and vice versa, come to that). The designers have overwhelmingly more influence on the quality of the game than the programmers do.
Well, that's more the job of the game designer.
The job of a game programmer is the job of any programmer, which is to write good, predictable software.
The programmer could do everything right, and the game would still suck if the designer is an idiot.
"You're going to need a bigger monolith."
Funny thing. I am literally overseas right now meeting our remote team. There is something about the in person engagement and the mindset it brings. Ideas seem to bloom and riff in ways that are harder remote. The feeling of epiphany when things just bounce in a room when just talking about an issue with a loose agenda is something that happens rarely remote.
Zeekerss (the Lethal Company dev) made it to the charts with a simple game! I have A LOT of respect for him.
edit: typo
Lethal Company is so graphically intense the game is rendered near 480p and upscaled to whatever your resolution is
a lot*
@@JorgetePanete
@@akr4s1a Because Unity only supports volumetric lighting in its HDRP renderer, which isn't designed for games; it's designed for animation, tech demos, and product promos (most cars in car commercials are actually 3D animation), and as a result it has awful performance. If you made the same game in Godot or Unreal (engines that built proper volumetric lighting systems for themselves), it would probably run a lot better. Which just reinforces the video's main point that video game developers are drowning in needless complications. The fact that Zeekerss needed to use a specific renderer that's not meant for games and do a bunch of hacks to make it run decently attests to this.
@@kaijuultimax9407 The thing is that the fun complication of the volumetric shading on unity ended up driving a creative aspect of the visual design of the game.
Having no complications is bad for art; art needs to be boxed in for creativity to happen.
I graduated in 2022 so I've only ever worked from home, and all but 1 of the peers in my team are on a different continent. Luckily most of my work isn't collaborative, but when we are developing a feature or fixing bugs together, I feel like things could move a lot quicker in person than they do over Slack and Zoom.
That's why you get in a call, then it's the same as in person
You'd think but there's a lot of nonsense involved in setting up and starting/finishing IRL meetings that gets glossed over when people are talking about it
Contra Prime: I've never had good or great ideas in meetings; it's always been elsewhere.
Blaming WFH is a cop-out; most games are ruined at the executive level by people involved in game design who have no idea about gaming and whose primary focus is business-driven. We see it all the time: Creative Assembly recently had a major shitshow, and we found out from leaks that executives had been prioritizing shitting out DLCs and sequels for Total War over addressing tech debt, and interfering with game design via "brand managers" overriding actual game designers to maximize short-term profits. CoD wouldn't suddenly be amazing if Infinity Ward devs were working 100% in-office 12-hour days, because fundamentally the C-suite made the decision to rush MW3 and make it a copy-paste of MW2 with Warzone maps as campaign maps to save on development costs. You think passionate game devs made those decisions? Grow up.
Bobby Kotick
"Best form of communication is in-person"
True, but comes with tradeoffs:
- gets distracted by boomer who doesn't keep their phone on silent
- 2hr daily commute
- Can't effectively buffer/batch communication (25 minutes of work, 5 minutes of answering messages, repeat) because someone comes to my desk every 5-10 minutes and interrupts me
" Can't effectively buffer/batch communication "
When I used to work in person I would basically block all communication for the first four hours of the day by scheduling a meeting alone with myself, so I could actually get things done.
I also had a gentleman's agreement with my coworkers that we don't interrupt each other for the first few hours so work can get done.
Depends on where you work, I guess. I'm working at a game dev startup with some friends and I've noticed we're a lot better at communicating in person than over discord/slack. Hard to keep focused too when you work, relax, eat, and (for some) sleep in the same room all day. Really easy for some to avoid work, get distracted, or a myriad of other shit compared to being in an 'office' with the boys and just hammering out code. Not to mention how much of a godsend it can be to have someone else take a look at bugs. The amount of times one of us has caught a bug in minutes that the code author would have spent hours looking for has eliminated so much time that would have been wasted trying to track down a bug on your own.
@@michaeltokarev1946 You can share your screen and even control using many tools, eg Microsoft Teams (use the "New" one, "Work" edition, probably, and reboot when it keeps crashing).
I work remotely, Teams and Slack create more interruptions than I ever needed. Can't drop them currently because of the support requirement of my role. While in the office, people saw the headphones, and left me alone.
Or ...
- gets distracted by narcissistic millennial who can't stop taking selfies in meetings. 😛
Physics Grad student here, I get significantly more done on the days when we all go in and I can talk to people in person. It isn't even close
Face-to-face produces more synchronicity in micro-movements, which emotionally uplifts people and gets them into flow more easily. It's something that happens at concerts, when you get emotional in a (home) cinema close to someone, and when you look at some exciting stuff in the same room and discuss it. I read it in the book Interaction Ritual Chains and probably messed up the summary quite a bit.
I agree that in-person communication is the most clear, but it comes at a very high cost. Commuting, geographic restrictions on hiring, and in-office distractions all make employees way less productive at the office.
On the other hand, there are also at-home distractions and chores that reduce productivity at home.
My productivity dropped greatly the moment I had to change office rooms. It sucks.
Efficiency is sort of irrelevant; just set pay based on end-product goals/tasks.
You can circumvent worker rights by paying the minimum with most of the actual pay locked behind goals. This also means work time depends on how fast the employee is (which can obviously be positive or negative for you).
@maxmustermann-zx9yq have you considered the possibility that your employees would jump ship for stable pay
The dichotomy isn't between consumers and governments though, it's corporations vs government currently, and even that isn't really true since what we actually want is companies to be *somewhat* interoperable and "the government" doesn't want this job.
I do pure remote and it's absolutely incredible how much I get done, I cannot understand how people get anything done in the office.
The last two days I spent configuring Unity's camera controller, Cinemachine. I had to give up. No matter how you configure it, there is always something breaking the experience. The whole time I worked on it, I felt like Jonathan Blow was breathing down my neck and judging me. In the end I coded my own camera controller that does exactly what I want in 30 minutes. The more I code, the more I understand what Jonathan is talking about.
Brother I'm autistic, the amount of body language I get in person === the amount of body language I get remote
Real shit
I've worked remote for 20+ years. Most people can't do it well in my experience. It's easier in some ways, but perhaps harder in more ways. I agree effective communication and collaboration are MUCH easier when you're in a room with someone. I love going into the office when I can. I'd like to try a combo of in-person and remote work, but haven't had that luxury yet.
The comparison with Netflix opening up is flawed. The problem with mobile is that there isn't really much of a choice. The moat around creating alternative options is just vastly wider than the one around creating another recommendation engine.
Both Google and Apple are providing services that are essentially system relevant. They provide world infrastructure. That's why they need to be held to a different standard.
I am not a fan of regulating companies, but for markets to work their magic we need more than two options.
Apropos simple vs hard problems - I can highly recommend any and every developer take a good look at the Cynefin sense-making framework. Understanding the difference between clear, complicated and complex problems, and which problem solving strategy lends itself to which problem space, has proven invaluable to me.
Things like the walled garden of Apple are a major problem for tech. It is textbook anti-competitive behaviour and they should get in trouble with anti-trust.
I've worked two internships in school so far; one was remote and the other was in person. I will say that it was a lot easier to work iteratively with my team as an intern when we were in person. The disclaimer is that what an intern needs from their team differs from what a normal dev needs.
I found it annoying and tedious to get any help from my remote team, because getting them on a call was a bit of a struggle when the issue was some domain-specific thing without documentation that would take like 5 minutes to resolve.
6:10 It doesn't matter how long you've been repeating the same take. If some guy out there with a bigger number says the same thing, it's his take now.
4:10 That is such an important truth. After learning this years into my career, I was able to get promotions and any position I wanted; before that I was obsessing over the small technical details management couldn't care less about.
Sounds like the beginning of a -10x developer story arc
If you're looking for just an interface to the GPU, wgpu is probably the best option. It's a 2-layer library where the backend is DX12, Metal, Vulkan, or WebGPU (in WASM), and the frontend is always WebGPU. wgpu is also what Firefox uses for its WebGPU implementation, but its WebGPU backend works on Dawn (Chrome's WebGPU implementation) as well. Basically, you have 1 API that works everywhere. You can use it from almost any language (C ABI), it compiles its shader language to the backend's shader language, and it's written in Rust. The alternatives that still have that much portability are game engine abstraction layers, but many game engines are using wgpu as their own abstraction layer now.
Main problem was the shader language was changing every ten days while they were specifying it so their compiler is still *really* rough. I think they *still* won't actually tell you what expression/type was wrong in errors, just give a random internal id?
@@SimonBuchanNz I haven't run into this problem. If it happens often, please file an issue.
@@jaysistar2711 I might be using the error reporting wrong, or it may well have been fixed (it was a while ago), but it wasn't a case of "often" - every error message that referenced something in the source code used an indirection that Error::display couldn't resolve. "The expression [6] of type [16] is not assignable to type [43]" was my experience for the whole time I used it!
Game devs? I'm drowning as a frontend developer, working with a framework... to a framework... to a framework. And I wish I were exaggerating.
I miss ye old days writing pure JS.
Not to mention all other additional complexity coming from Webpack or TS.
@@anj000 No one is forcing you to use the latest unstable, unproven framework. Just stick with something and get good at it, even if that's PHP. Modern web development is a made-up problem that makes an easy problem (making a web site) seem incredibly complex.
@@tapwater424 You nailed it!
@@tapwater424 No one, except for employers. They can be a pretty stubborn bunch.
@@tapwater424 also no one is forcing me to eat and have a place to live. You are right.
But please note that I have a job, and having this job is predicated on working with this abstract framework.
28:30 This is a false dichotomy. It's not about government control telling companies what they have to use; it's about whether technology can impose artificial scarcity. The consumer doesn't have any control over what Apple or the market does. It's like saying that market forces can counter planned obsolescence.
Imagine a wrench that can work on any nut/bolt, but it has an RFID reader that only allows it to work with proprietary hardware. You would think that the market would select against that, but it can't if every actor behaves that way. It sounds dumb with hardware, but software makes that kind of stupidity possible.
I 100% work way worse from home. If I lived alone and had a dedicated office room I might not, but that's not in my cards.
6:04 I'm sorry, Prime, but the take is his until 70 years after he passes away.
The one part of this equation that I'd rather the government control is internet services. The alternative is pitiful in how often it causes consumer grief. AT&T literally let a fifth of their national user data get breached and didn't tell anyone until that data was made free for anyone to steal.
4:00 I saw this in another clip and a lot of the comments pointed out that people in the office do watch each others hours and pettily focus on effort put rather than results. I'd like to add on that while you can't control others and 'nobody cares' is technically wrong, you still shouldn't care. Results are more useful to you, your employer, the world and whoever else might matter than your efforts and pain are. If people insist on you working and hanging around until you are tired regardless of how effective this is, placate them if you must, tell them to go f themselves if you can, but don't let it stop you from getting results in the easiest way possible.
I had so many fights over the 9-5 schedule in my last job, I kept not respecting the clock because I cared about results not being on time for the morning coffee.
I am like a wizard. A wizard is never late, nor is he early; he arrives precisely when he means to. And like Gandalf, I could tank bosses and get all the XP too.
IMO the walled garden thing *IS* a false dichotomy. The options aren't just -
1. companies create their own walled gardens where people can't even make dev tools because the instruction set is hidden
OR
2. government forces an ISA on everyone
We do have a 3rd option that is literally being used right now for almost all CPU architectures out there: an ISA that is well documented and that everyone can develop for! We need that for GPUs. We need GPU vendors publishing massive 100-page documents for their products, just like Intel does for their CPUs. If they could even include super in-depth stuff like the number of cycles taken for each operation, that would be even better.
Every time I compare game development with web development, I realize how little value web development seems to have. The gaming industry is massive, with an absurd amount of money flowing through it, including what I've spent personally. On the other hand, besides Netflix, I can hardly think of any websites I've paid directly for. It seems like much of the web is just one big advertising platform, but how valuable is advertising really? Data collection feels like nothing more than a tool to serve more ads.
What I'm getting at is whether the poor quality of web development tools reflects the lack of real money in this industry (unless you're basically an advertising company pretending to be a tech firm like Google).
The value of the web is to be the collection pool of data for our future dystopian, surveillance-ridden, AI-policed and -controlled society. If people had to pay for this themselves, they might put more thought into it and alter their behaviour. It's free specifically because the outcomes will nefariously benefit forces you don't want empowered. And I don't think there are a few master puppeteers pulling strings behind the scenes; this system has a behaviour that supersedes the sum of its parts… Buckminster Fuller's synergetics somewhat touches on these themes. We're in the early stages of machines agriculturalising humans ✌️
The global e-commerce market that the web supports is massive, we are talking trillions of USD each year. It dwarfs the global video game market. Saying web dev tools are poor due to a lack of "real money" in the web is a very strange viewpoint.
@@tom_marsden You said it yourself: the web SUPPORTS it; it is not the web ITSELF. You're describing a sales market between people and businesses, not the websites themselves. Not to mention, basically one website owns, or rather supports, that entire market.
"Nobody cares how hard you worked." THANK YOU. Unironically really useful to have said out loud.
0:35 TempleOS. I am being serious.
Holy C
Given that Terry Davis's goal was to make a modern and more robust implementation of a Commodore 64, which, according to a lot of bedroom-coding Gen X'ers, was one of the most fun machines to develop software (especially games) for, I'd say he accomplished a ton, including the fun and simple development environment.
🙄
he won
There is value in face to face interactions. Whiteboard meetings allow people to brainstorm and set a direction as a group. However, those days are dead because no one grabs a room anymore, it is a Zoom or Teams call. So if it takes me 45 minutes to drive 6 miles, to park 300 ft from the building, to sit in a cube and be on a Zoom or Teams call, then why do we have to be in the office? Management killed the creative vibe 4 years before the pandemic. Being remote for 2 years just solidified that just as much work can be done remotely, because the way we work never changed. Oh, and let's mention the office distractions from cube congregations, poor climate control, bad hygiene and bad lunch smells. I don't have that worry at home. For folks who hate their family or don't have one and need daily human contact, or believe that the only way to get promoted is to be in your manager's face all day; let them go into the office. Employers should offer choice for knowledge workers.
There is a cross-platform shading language that is production ready: WGSL.
There is also BGFX
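For anyone who hasn't seen WGSL, here's a minimal sketch of what it looks like, using the common "derive a triangle from the vertex index" trick (just an illustrative snippet, not tied to any particular engine; entry-point names are made up):

```wgsl
// Emits a triangle from nothing but the built-in vertex index:
// draw 3 vertices with no vertex buffers or bind groups bound.
@vertex
fn vs_main(@builtin(vertex_index) i: u32) -> @builtin(position) vec4<f32> {
    let x = f32(i32(i) - 1);           // -1, 0, 1
    let y = f32(i32(i & 1u) * 2 - 1);  // -1, 1, -1
    return vec4<f32>(x, y, 0.0, 1.0);
}

@fragment
fn fs_main() -> @location(0) vec4<f32> {
    return vec4<f32>(1.0, 0.5, 0.2, 1.0); // flat orange
}
```

The same source compiles down to SPIR-V, HLSL, or MSL depending on the backend, which is what makes it attractive as a single cross-platform shader language.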
I've already seen work from home affect me: it took me 9 months of solid searching to land a job, partly because I do not want to work from home while I'm still figuring this whole thing out. I can't comment on that perspective, but my colleagues working from home affect me in the sense that if I have a quick question, I'm shit out of luck, because the people I work with aren't particularly prone to using a communications platform in any meaningful capacity. If I worked with a bunch of mid-20-year-olds instead of mid-30-year-olds, this wouldn't be an issue to this degree, and as my generation matures, we'll bring in this more rapid remote-work style.
As for finishing stuff vs effort: yeah, I've had two tasks that took me 2 weeks to complete (among other tasks), and I've actually felt god-awful for delivering such simple stuff so slowly, but... Power Apps is something else; it's absolutely geriatric Microsoft bullshit.
And building up debt... tech debt, project debt, design debt, jesus christ. Let's just say the project I am a part of now has been royally fucked because of an inexperienced team (inexperienced not in the environment, but just generally). The codebase is all sorts of fucked, the structure is all over the place even with a design document; hell, the design is all over the place. If you want to do anything, you'll be sifting through 1000-3000 line files with barely any documentation and without source control; there's no ERD to visualize the database and no real documentation of any kind... and I'm sat here like "sure, I could spend the next few weeks fixing ALL of this, but I'd be delivering nothing in the meanwhile, and thus I'd be working hard but delivering nothing." It's just crazy how, as much as it would create a result that makes all of our lives easier, I'd essentially be doing work with no real end result for the end user past "well, we might be able to develop slightly quicker, possibly."
skill issue
This sounds like a horrible place to work even in-person.
@@disguysn na, i just took what i could which was a new team with no experience, market's fucked
11:30 - what stress sounds like
10:15 I would say it's a lack of chaos; conversations tend to be a form of chaotic order. WFH devs tend to have an organised environment to some extent, which does not distract the brain enough to break out of whatever box it's in. In other words, the right amount of chaos can cause a dev to think more outside the box than they normally would.
People saying WFH is at fault are just coping after releasing a bad game. I worked from home for 2 years and absolutely kicked ass on the AAA game I worked on. People just need to not suck.
skeel eesyoo
@@sycration ?
@@greensock4089 he said skill issue lol
“there’s something i wanted to circle back to and i was hoping he’d jump on it”
THE LINKEDIN IS COMING FROM INSIDE THE HOUSE
From a consumer standpoint, there's a couple of things that are invisible to us. Everyone says games have grown in complexity, but from our perspective, games are simpler than ever: shallow, overly gated, on-rails "bottled" experiences. Plus, it's really hard to agree when every new title is made in the same engine with as little effort put into detailing as possible (look up Batman: Arkham Asylum vs the new Suicide Squad game). Lastly, every new title also uses the latest 2 marketing solutions: DLSS and RTX. Games look like ass, but "b-but the rays of light are cast in real time and the resolution is upscaled to 4K".
When gamers see gamedevs, for about 10 years now, consistently crying that their work is hard and video games are getting infinitely complex, and wanting to raise prices, while the games are actually worse, yeah, people will laugh at them.
Go play games from 10 years ago; on average they will be worse than you remember. We typically remember the good experiences, but the game itself is likely to be worse.
The other thing about WFH is that we have developed our "office culture" for decades, with many failures along the way.
But people have given WFH a real chance for a couple of years and they are "completely sure that it doesn't work".
It will work, if we want it to work. People will adapt. Calling someone on Teams without a meeting scheduled will be the equivalent of knocking on someone's door, for example.
To me it is ridiculous to say that WFH does not work after such a short trial. It is like saying that computers are actually useless because they are super expensive and occupy a room, and Joe down the corridor can do math operations faster than all those spinning disks. For a very long time, computers were seen as a niche to be used in certain applications like military decoding: "nobody will use a computer to do X, because we do X now in this other way, and as you can see, it is so much better and has worked forever, and I cannot imagine myself using a computer to do X."
I work in gamedev with Cocos Creator 3, which is open source. One thing in that engine bugged me really hard, so I went ahead and forked it, changed the implementation of some element, extended its features, and used it in my game. Felt really empowering, ngl. I would cry if I had to do that with something as complicated as React.
Liked because of the Vim weekly meeting. I'm going to start doing that too 📆
"The only thing companies should optimize for is making money" is the same as saying "the paperclip AI should only optimize for making paperclips". Those who know, know.
I think the IRL vs remote argument simply boils down to the assumption that people must be self-managed when working remotely. If you're in person, you and your mind are in work mode. Or at the very least, seem-to-be-working mode. But even with a total refusal to work, you're doing it from the restraints of the office. So you're passing time in a restricted area. Versus passing time in the comforts of your own home. There's an obvious case to be made for people who just don't manage their own time very well being impacted more from remote work, but even in the case of perfect time management, the environments alone have to make some sort of impact on your willingness and desire levels to do different activities.
What you can not currently do, you more easily quell the desire for. But if you're now home and that desire arises and you can get away with satisfying it, well now you're off to the races.
Jonathan Blow suffers from, what I like to call, "programmer isolation syndrome".
He's basically put his whole life into his own projects, and doesn't really work with or collaborate with other people. Thekla is more or less his own thing, and the few employees he has are both unnamed and temporary.
So he kind of just exists in his own bubble. Inside of that bubble is some good wisdom on programming and advice on undertaking difficult projects like games.
The thing that he doesn't have inside of his bubble is compromise. Or really any good wisdom on engineering as a whole. His positions on software security are laughably outdated, and remind me of concerns that people had about ARPAnet during the Cold War.
He doesn't understand that so much of engineering as a whole is compromising, either having to compromise with other people, other ideas, or even objectively bad ideas. What makes a good engineer is that ability to compromise effectively. A master engineer can take a worthless piece of junk and make it do something useful.
Blow isn't really an engineer. I see him more or less as an 'artisan'. If he doesn't like something, he doesn't try to fix it, or work around it, or integrate it into something better. Instead, he starts over from scratch with something that works.
While this might be good for 'artisanal' projects, like indie games or custom programming languages, it doesn't work for other kinds of projects.
I think people shouldn't take Blow's words at face value. As a software programmer, you are first and foremost an engineer, not an artisan. Becoming an artisan comes after you've gotten the engineering part down.
He worked as a consultant for like ten years if I remember correctly. Also, he has worked with many employees on his last game and his current game.
The way you're talking, you're saying an engineer compromises on things. The thing about making compromises is that it harms the integrity and quality of the engineering you're doing. You're spot on about him being artisanal; an artisan desires artisan-level quality for their work.
Compromised software development is why you get your accounts compromised. Additionally, compromises harm the integrity of the vision. This is especially important if the vision is sensitive to small changes.
> He's basically put his whole life into his own projects, and doesn't really work with or collaborate with other people. Thekla is more or less his own thing, and the few employees he has are both unnamed and temporary.
Untrue. He collaborates a lot with people. If you've heard of Casey, who also goes by Molly Rocket on here, he's worked with him a lot. Additionally, you're falling into the trap of attributing everything to one man. This is like saying Konami is all Hideo Kojima, or Nintendo is all Shigeru Miyamoto or even that Microsoft is all Bill Gates.
5:00 Yeah, I had this fun twice at my first "real job" company. I took over a couple of code bases that had been "continuously developed" by "senior developers" for months without getting even close to the end. I ended up throwing away lots of overcomplicated code (YAGNI; overengineered for no purpose other than a certain DRYing obsession) and finished it all in a few weeks, where others had wasted man-months before.
Roblox is a pretty intuitive environment. Lua was my very first exposure to programming and now that I’m actually a developer I’d love to actually learn the language.
When I compare developing Roblox games to the learning curve of Unity, Unreal, Godot, etc., it feels a lot more straightforward for beginners. I'm sure it has more limitations, tho.
at the 4:55 mark:
Maybe it's weird, but I actually kind of enjoy the fact that people really don't care how hard you work on something. That goes both ways, ya know? If you can deliver something people enjoy with a low effort to reward ratio, it can be a win-win. AND while it kind of sucks, it rewards one's ability to assess what people at large might enjoy, instead of a personal project that might not be as well received.
trade-offs.
2:25 I feel personally attacked after setting up a docker compose with multiple services for monitoring a single app that sends a couple of SMSes every day for a colleague (don't worry, I'm gonna split the app into multiple microservices 😏)
Primeagen will never take my left-pad microservice from me!
I used 8 containers just to create a multi-channel messaging "app". I had backend, frontend-serverside-render, message-bus, certificate-gen, reverse-proxy, utility-executor, static-page-server, and container-signaling-glue-logic. And those were the bare minimum; they weren't microservices at all, just separations for technical reasons (e.g. the base "distribution" of the container being different, or the containers being different processes written in different languages).
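That kind of "separation for technical reasons, not microservices" setup usually lives in one compose file. A hypothetical sketch (all service and image names made up for illustration, not the commenter's actual stack):

```yaml
# Hypothetical docker-compose.yml: each service is its own process and
# base image for technical reasons, but it's still one deployable app.
services:
  backend:
    build: ./backend            # e.g. a compiled API server
    depends_on: [message-bus]
  frontend-ssr:
    build: ./frontend           # Node-based server-side renderer
    depends_on: [backend]
  message-bus:
    image: rabbitmq:3           # off-the-shelf broker, different base image
  reverse-proxy:
    image: nginx:alpine         # terminates TLS, routes to the others
    ports: ["443:443"]
    depends_on: [backend, frontend-ssr]
```

The point is that separate containers say nothing about service granularity; they can just reflect different runtimes sharing one box.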
I just built a dynamic IP updater using Docker, Lambda, and Route 53 (all custom scripts) so that I can access Home Assistant from the outside internet.
I definitely have more services than users and I regret nothing.
All you people in this thread are why technology is stagnating. All doable on 1 or 2 $5 virtual servers…
@@lucasjames8281 my ongoing costs are $1 a month so I'm not sure what you're on about.
I think the good ideas coming from in-person meetings come from the fact that while you're telling another person what you have planned, you can far more easily work out just how good your current ideas are, and your brain may also be working to improve on those ideas, since the act of explaining an idea is like reading its recipe. With enough experience cooking various things, you can generally understand where a recipe is going and what the result might look like just from its ingredients and instructions.
I think software/game dev is generally the same idea: if you explain to someone the ideas the game/program will run on, your brain will often deconstruct them into what the outcome may be, and either you or someone else can suggest an alternative that supersedes the original idea, since it's just better, and why not use the better idea after all.
I dunno, I'm not in the programming phase of being a developer yet, trying, but I like to think about these things a lot.
what does not being in the programming phase mean lol
What you just wrote is about how describing your ideas to other people is beneficial, not about why in-person is beneficial. It also implies 1) you have good people-reading skills and 2) the people you are working with give visible responses. Even with those assumptions, it is still mostly about the quality of the people you work with, not in-person vs remote.
I find it ironic that Blow starts with the same old "things were simpler back then" argument. Yes, things were simpler back then, because a good portion of the languages you had to use were complicated to the point that only a small number of people could do those complex things. It's still complication, just in a different place. It's sad when people forget to look at things from the past in the context of that time period and always default to the usual "the grass was greener" BS. And it still becomes complicated, because programming is not the collaborative effort some people make you believe; rather, it's a group of people trying to impress themselves with their own intellect by implementing impractical solutions that don't work.
So, you're trying to say that in the past developers were competent, but now it is a beautiful egalitarian field where complexity is removed from the developers education and manifests now in the overengineered crap those egalitarian uneducated developers create? Fully agree.
@@vitalyl1327 More or less. A good example was when people had their panties in a bunch over the VHDL files leaked from AMD, when the vast majority of developers don't even understand logic at the programming level, and somehow they're going to magically understand logic at the component level.
It's more complicated because the scope of projects and requirements have risen in the past decades at a high velocity. We've built abstractions upon abstractions in order to keep up with this scope creep, so now it's much harder to reason about a system because a lot of its complexity is abstracted away from you. It's a simple scale issue, do anything easy and it's still gonna be simple to solve. Do anything hard and the complexity will make you feel dizzy and ready to reach for those abstractions that have their own set of trade offs, but they allow you to move at a much faster rate.
Common Jonathan Blow L.
I take it you don’t like Mr. Blow
On the topic of productivity in a WFH setting: I'm happy to say that I've worked in both 3D graphics and web/cloud development. I haven't worked on games per se, but I was on the engine team of a big in-car navigation provider.
My productivity on the engine team would skyrocket at the office. When the job required very technical or math-inclined knowledge, even writing with a pen on paper would yield better results for me in the office, because I would have people to bounce ideas off and brainstorm with. The knowledge sharing was insane in an office environment, and I really needed it.
When working in web/cloud development, I didn't have this need for knowledge sharing (but I do sometimes miss being at the office for the coffees and cigarette breaks). If I code for 3 hours a day I can say that it's been a long day even though I complete everything on time. If I had been working at the office I would probably be even faster because I wouldn't stand up every half hour to look out the window or make a trip to the kitchen, and when I'd be done with my work I'd go outside the scope of the sprint and finish other things as well.
Anyway, I feel like game programming is closer to the first scenario, the insides of the industry and the random knowledge sharing that happens on-site really boosts you. I also feel like that in game development, the company culture is also very important. Imagine working on GTA and programming the car bouncing when the player hooks up with a prostitute. Doing that at home is cool, but doing it in the office with your slightly perverted coworker standing next to you laughing his ass off is amazing.
Game devs nowadays don't give a fuck how Vulkan or OpenGL works; they don't even care how the CPU or the OS works, and I'm not convinced that is a bad thing. Most people are just fine learning how to work within their game engine, which is usually one of the off-the-shelf ones like Unity, Unreal or Godot.
yeah game devs are focused on making games. shocker.
Correct. Idgaf how Vulkan works, and if a single test fails when we flip the V switch, it goes back in the drawer. You used to put in 10,000 hours to achieve mastery in game dev. Now you'll need twice that to master tweaking foliage parameters.
Honestly, that's how all fields work: as the tech advances, the people who really know the nitty-gritty become few and far between. As an amateur game dev, the software I currently use is... Unity, Visual Studio (C#), VS Code (JSON), Photoshop, Maya (modeling, UVs, skinning, rigging), Substance Painter, ZBrush, and git, and within Unity there are skills to consider like shader and VFX creation, which could be entire programs of their own. And I'm probably missing something. I really don't need more on my plate; give me an engine that works out of the box and I'll be happy. The amount you need to know for a basic low-fidelity 3D project is insane. I really pity anyone venturing into this who didn't learn this stuff as the industry progressed; this is why asset stores are so popular.
Well, duhh. They are focused on actually making the games.
A company like Dice had a whole team exclusively for developing the Frostbite engine. The game devs themselves focused on making the actual game. Frostbite is an exceedingly complex engine designed to be the workhorse for dozens of different games, so having its own team makes sense.
I go to a game programming college and I'm in my second year of 3. Most people do genuinely like to stick to just using engines and try to make games with them. It's fun, and there's always the hope of your game popping off, but there are still people who love learning all the low-level details. I'll be honest: in our courses about OpenGL we barely learned anything. We're basically given a framework and learn how to play around in it without really writing much OpenGL code. I know next year I have a Vulkan course, and I'll see how that goes. Most of my OpenGL and valuable C++ learning came from self-study rather than school; I really had to go out of my way to learn some of this stuff, and I was shocked that there are so many concepts we never even discussed in class. I think that's mainly the reason a lot of people never get into it. Personally though, I love working on everything, lmao. Lately I've been getting into OpenGL a lot more as I got very comfortable with Unity and Unreal. I'll try to "master" OpenGL before I get to my Vulkan course so I'm ready for it. People who want engine programming jobs will have to know this stuff by heart; most people who stick to engines will probably be gameplay programmers and similar roles, which are completely fine to do.
I understand the idea of hard work not being the goal in itself, and I agree, but I also think that results are also not a metric for success all the time.
Sometimes you need people who will try very hard and fail, because someone working less hard would eventually fail too, only after many more months.
Or maybe it is an indication that there needs to be a change in strategy.
Negative results are also results. I think that people too often only consider positive results.
Sometimes you work hard, you did the best that was possible at the time, and the feature that you created sucks. Whether it is a success or a failure depends on the context: Maybe you did a bad job completing the task, or maybe the task was wrong from the beginning. Or maybe you demonstrated the ability to work hard, and you were just given a task that was not your strong suit, but maybe this other one suits you better.
Without proper context, it is hard to evaluate in absolute terms.
Jon Blows
17:54
saving this to come back to when I start learning about HTTP before my Computer Networks study session soon
"How many people have more microservices than customers? ... Throw that in a monolith"
Careful, you're starting to sound like DHH and the people who like Ruby on Rails for that exact reason lol.
You're gonna get us all in trouble haha. When I first heard about HTMX that was my exact reaction 🤣
I definitely feel the same with remote work. It's very convenient but there's always something missing that makes me engage less or zone out during the conversations compared to in person meetings.
sorry ThePrimeTime, but this remote video format is not communicating your thoughts very well. Perhaps you should host the streams in person.
DefCon perhaps?
"It doesn't matter how hard you work." TRUTH
I hate this mythology of companies always choosing what's best for the customer so much... How can you spell this out while talking about a situation where companies literally hurt progress because they are busy hoarding money?
23:26 - ayooo, shout out to Infinity Wars. Nobody plays it, but I still love that game.
I am a software engineer but not in the gaming industry. To me, WFH affects work a lot because communication is very different imho. It's not necessarily about meetings but about the fact that it makes people communicate only when there is a strong purpose. We feel bad about calling coworkers just to share a random idea, feeling, or discovery. When in an office, we just do it. Remote creates a much more utilitarian ambiance, less creative and less innovative. That's my experience, and I started WFH 2 years before COVID. I am happy to now be back in an office 2 days a week to work with my colleagues IRL, cracking jokes, trolling, and sharing random thoughts about what we do without feeling that I disturb/distract them from their task. I can naturally see when it's a good time to talk.
My coworkers and I used to sit in Discord all day and chat, listen to music, and work. It was great but I doubt people would be willing to do that in most companies.
I wish I could've been there to watch this live, because around the 24:00 mark I could've interjected a bit of information about that game and the company behind it. It's from 2010 and I never thought I'd see it again, let alone on Steam and in a primetime video.
I don't have a lot of pro game dev experience because that's not my main thing, but the code quality and standards are way worse in the gaming industry than in normal backend or even web dev.
What they don't realize is that it actually makes the devs way slower than developers with better standards and code quality.
Dev time, sure, but not speed. A lot of game dev code is hacky and unreadable as shit because it's fast, e.g. the fast inverse square root. Games ultimately care about code execution way more than readability.
Try to code a game in "clean code" Python and tell me how that goes for you LOL.
@@JathraDH you missed the point (or are you trolling?). I meant game dev time. Faster devs means more features or a higher-quality game (if they care about their job, that is).
@@captainnoyaux Gamers do not want a game that comes out in 1 year and is full of bugs and runs like shit. They would much rather wait as long as it takes for the game to be done and released finished.
You really sound like you aren't a gamer at all. Who is trolling here?
There are 2 big things that make it much harder to write "clean code" in game development.
1) You usually don't have a very clear idea of what the game should be when you start making it, much moreso than other software. You usually go through a ton of different iterations, and see what does and doesn't work well - features get added and removed all the time, and the thing you're trying to build is a moving target which makes it hard to plan out in advance.
2) On average games care a lot more about performance, which of course means that when there's a tradeoff between maintainability and performance they're a lot more likely to pick performance (depending on what they're working on of course).
It's also pretty difficult to test any part of the game in a vacuum so to speak - unit testing is very difficult because nearly everything in the game is interconnected and you can't really test one part of the game without setting everything else up too a lot of the time.
@@asdfqwerty14587 Basically what I said but elaborated upon. Thank you.
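The fast inverse square root mentioned in this thread is the canonical example of that trade-off. A sketch of the well-known Quake III version (magic constant and all), written in C for illustration:

```c
#include <stdint.h>

/* Approximate 1/sqrt(x): the Quake III "fast inverse square root".
 * Fast and famously unreadable -- exactly the speed-over-clarity
 * trade-off described above. */
float q_rsqrt(float number)
{
    union { float f; uint32_t i; } conv = { .f = number };
    conv.i = 0x5f3759df - (conv.i >> 1);                 /* bit-level initial guess */
    conv.f *= 1.5f - (number * 0.5f * conv.f * conv.f);  /* one Newton-Raphson step */
    return conv.f;
}
```

One Newton step keeps the relative error under about 0.2%, which was plenty for lighting normals; on modern hardware a dedicated `rsqrtss`-style instruction usually beats this trick anyway.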
I second that all of my best ideas have come from grabbing coffee with a fellow engineer. You grab a napkin or a white board and start drawing lines and boxes... And then someone erases one line and says "what if..."
I like your videos, but what kind of sociopath overlays himself on top of the author of the video they're reacting to? 😂 It's like he's peeking over your shoulder!
Honestly I'm so bad at this.
Please forgive me, I do move myself later on
@ThePrimeTimeagen Haha, it's all good, it just looked so funny! Maybe future streaming programs will be gpt integrated, send a frame or two across and get an overlay color and position back in json. 🙂
My experience of working remotely comes down to 2 things:
1: A feeling of isolation: this can happen more in dev because engineers are often "left to get on with things".
2: Inefficient communication: when communicating in person there is so much subconscious information, body language etc., that when you communicate remotely it is either missing or very difficult to pick up on.
It's a fairly free market. I'm not saying it's easy, but anyone can be a competitor.
12:15 you underestimate how bad some people are at communicating in person (those ain't your Netflix programmers). Text messages give them a little more time to think, and you'll actually get better results that way.
Prime : " *_WHERE'S MY FREEDOM PHONE?!_* "
Librem 5 USA : " Allow me to introduce myself "
Copyright and patents only exist because of the government; they do not exist in a free market. The law is pretty explicit in saying they are for monopolization of a product, and if you ask me, copyright and patent law does not help incentivize invention.
Consumer choice isn't a thing. Same argument as "Americans are fat because they choose to be." If consumer choice were a thing we would have to suppose there are no monopolies, which isn't how capitalism works, because if there are only a few companies monopolizing a sector, they can dictate price and consumer choice; that is to say, you have the illusion of choice.
And on the dichotomy: the government should work for the people but it works for capital, so "capitalist government or personal choice" isn't a dichotomy, it's the same thing, both dictated by the monopolies that control the government.
Also, I would much prefer a democratically elected government be given a monopoly rather than a bunch of despotic capital owners with diametrically opposed interests to 99% of the population.
"Apple isn't a monopoly" oh my bad, they are part of an oligopoly of 5 companies.
About that "more microservices than customers" is quite interesting. There is this game that blew up, Helldivers 2. The first one had peak player count of 10k, the second one got up to 400k in a week. And they didn't design their backend to be THAT scalable, and now are fighting to keep it working. I can't wait for some kind of a blog post from Arrowhead about that. They are hardcoding hard cap for total player amount, because they cannot just "buy more servers".
Also, the game is so good, that even if people have to wait 10-15 minutes to get in, they still do, and as soon they increase the cap by another 100k, it gets filled in one hour. Defeated by your own success :D
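A hard cap like the one described often boils down to an atomic counter in front of the login path. A purely illustrative sketch in C (the cap value and function names here are hypothetical, and have nothing to do with Arrowhead's actual backend):

```c
#include <stdatomic.h>
#include <stdbool.h>

/* Hypothetical global cap on concurrent players -- the kind of
 * stopgap used when a backend can't simply scale horizontally. */
#define MAX_PLAYERS 450000

static atomic_int g_players;

/* Returns true if the player got a slot, false if they must queue. */
bool try_join(void)
{
    int cur = atomic_load(&g_players);
    while (cur < MAX_PLAYERS) {
        /* On failure, cur is reloaded and the cap is rechecked. */
        if (atomic_compare_exchange_weak(&g_players, &cur, cur + 1))
            return true;
    }
    return false;
}

void leave(void)
{
    atomic_fetch_sub(&g_players, 1);
}
```

Callers that get `false` back would be pushed into the login queue the comment mentions; the compare-and-swap loop keeps the count exact even under heavy concurrent logins.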
The struggle of dealing with endless layers of frameworks as a frontend developer is too real. Let's bring back the days of writing pure JS!
Spam bot. Report it
First off, obvious bot account with bot upvotes. So annoying that people can push whatever BS narrative they want these days and gain visibility by using bots.
But since you are now the top comment, I'll address your point. Plain JS does not scale with modern app expectations. It's only viable if your "app" is just HTML with a little JS sprinkled in, and not very dynamic. Otherwise it turns into a mess of spaghetti, or you end up creating your own framework. And yes, we can fantasize about going back to that, but it's not going to happen. Honestly the only place I ever see that is in the r/webdev circle-jerk anyway, or with newbies who are intimidated by frameworks and don't understand their purpose.
Start thinking like a programmer and you won't even have any issues. Yes, npm is crap, JS is crap, and the ecosystem is overwhelming, but it's really not that bad considering how much crap programmers have had to deal with for 40 years.
9:00 Games haven't gotten worse.
AAA games have gotten worse.
And that is because they are a purely transactional product. They are made from the safest checklist to guarantee economic success.
You cannot make transactional pieces of art, except when the art is just a money laundering scheme.
AAA games do not have a spark because they are not made by gamers and artists for the fans. They are organized by suits who have very good studies on what the maximum microtransaction price can be before people start to revolt, according to market polls.
The Witcher III and Baldur's Gate III are two examples of games done by smaller companies, with smaller budgets, but done with the intent of being good games first, and let the art attract people who will pay money for that art.
But it is the same with big blockbusters. Especially now with Marvel movies, which are basically exactly the same movie with a different skin on top, changing the locations and the characters, but otherwise exactly the same movie.
When the goal is not to make a good movie but "this movie cost $300 million to produce and another $300 million to promote, so it either makes $600 million or the studio goes bankrupt", then people play it safe. And safe is boring. Safe is the same thing done before, because before it worked. Safe kills the spark of imagination and innovation that is fundamental to a good piece of art.
I think that Netflix and other streaming platforms are doing a speed run of this. They all have had very bad shows, but every now and then they produce a "modern classic". And that only happens because there was no fear of trying new things.
However, as shows have gotten more and more expensive, they are suddenly going back the movie route, and staying safer and safer.
Gaming companies went from "those nerds who play video games as adults, ahah" to "actually they make more money than films?? WE NEED MARKET ESTIMATIONS NOW!!".
It is simply a change of priorities from the top down. It has nothing to do with working from home.
"Current government bad therefore governance bad" is middle school tier Libertarian baby bathwater tossing
Noooooo we just have to voot for the good guys this time!!!!!!!!!1111
All of my greatest ideas, and by far my greatest work, have come while working at home alone, or after everyone left the office at night. I don't even know what Prime is talking about here. I can't even fathom it. I've never had a meeting where I walked away with anything I could use, unless I was coding with a single person at a computer, bouncing ideas off each other at speed. Any group meeting? Nothing.
4:33 Your point is way more valid than you probably think, it's like maximally valid. One example of hard work that nobody properly appreciates; women that take care of the house and the kids. They work all day. They work hard. Is their work excruciatingly vital to society? YES. Are they paid? ZERO. NONE. Are they the source of all life? YEP. Why are we not paying them for it? COMPLETE AND UTTER NEGLIGENCE AND BLINDNESS.
Taking care of your own house and family is not a "job" that deserves to be paid, unless cleaning my own room and taking care of myself is a job too. That comparison is a consequence of hating to do simple responsibilities and housekeeping. I mean, I need to take care of myself in order to be useful to society, so "Is my work excruciatingly vital to society?" YES.
I love remote, it allowed me to have a life. I didn't have it at all before. But. There is something about sharing space with other people in the same mission that helps. I still feel 400 times more productive at home.
First 25 seconds made me lol already
I love watching your videos. Your takes on things are spot on. Keep up the good work, I love to see it!
"why does my cake have a 500$ red dot on it? anyway..."
I'm in enterprise web app development. I've been working fully remote for about 9 years now. It's not for everyone, but it is for me. Some people want all in office, I think most will be good with some manner of hybrid. I've worked at remote first companies before, and you really do have to attract the right people and build the right culture to make the most of the advantages it gives you. Additionally, you have to be okay with letting go of the wrong people as and when needed - and to not dilly dally about it.
One chatter mentioned that in-office is necessary because some employees need to be "watched" to be productive. I'd simply counter that by saying if you have an employee like that, in any situation, then you really should work on having them _not_ be your employee. Particularly at the level of a skilled craft that SWE is considered to be.
Just use Game Maker lol
even better, just use love2d
About remote communication - it’s good to have an established baseline in person before going full remote or mostly remote and when remote put in the extra effort to be explicit.
johnathan blowjhob doesn't recognize that the problems people solve today are harder than yesteryear's.
@@skellington2000 Hardware and software don't get faster and scale better by getting simpler. Ivory towers aren't reality. And yes, the problems people solve today are genuinely harder. There was no need to serve webpages to millions of clients in the 90s. Realtime raytracing was impossible in the past, no thanks to simplicity. Transformer architectures are more complex than the simple linear regression models of the past. Etc.
They are not. Most of the problems code monkeys solve are trivial, but code monkeys cannot think, so they make them orders of magnitude harder than they should have been. Code monkeys cannot build things from scratch; they combine dozens of third-party dependencies and then get lost tackling the emergent complexity of the Frankenstein monster they created out of a pile of necrotic pieces.
@@thomassynths lol, you're funny. Raytracing is algorithmically *simpler* than the traditional rendering. Modern unified shader GPU architectures are simpler than the old specialised pipelines GPUs. You clearly have no idea of what you're talking about.
@@skellington2000 try doing it fast in real time using your naive code
@@vitalyl1327 I do know a bit of what I’m talking about. I wrote CAD software for a decade and wrote GPU mesh emulation for Apple’s Metal shading language.
Prime, it would be beneficial to learn a bit about VAX and DEC. OpenVMS was groundbreaking for distributed systems. A datacenter was destroyed on 9/11 and the cluster stayed up.