► Thanks to ProtoArc for sponsoring! Grab the ProtoArc XKM01 CaseUp travel solution from their official website and use code “UFD25” to enjoy 25% off early for BFCM at
www.protoarc.com/products/xkm01-caseup-combo?ref=UFDTECH
#ProtoArc #ProtoArcXKM01CaseUp #ProtoArcFoldableKeyboard
🍓
To be fair, AMD kinda messed up with gaming. They should have kept the 5800X3D/7800X3D in stock at a lower price and recalled the 9000 series. I'm still laughing away on my 7000 series.
I will not.
WTF ARE YOU WEARING DAWG 😭
Hot nacho cheese wizard
Crawfishes? lol
humanz skin suit
He's a Dorito 🤔
RED ROCKET😂❤
If AMD is a massive collapse, then what the hell is Intel?
😂😂😂😂 Exactly, I don't know what they're talking about, because AMD is literally on fire 🔥🔥🔥
Could be more on the GPU side and why they might be dropping high-end GPUs. That would be my guess.
@alandajones Fair, but I wouldn't consider it a collapse, since they're focusing on the biggest demographic of gamers, who mostly play at 1080p and 1440p, which makes those graphics cards great in that regard, just like with the RX 400/500 series as well as the RX 5000 series.
Intel controls most of the CPU market. Cope.
@iequalsnoob That was true a decade ago, but it went downhill after the release of the Ryzen 1000 series. Nowadays Intel is failing to the point that there are rumors of other companies considering buying it. How the mighty have fallen.
If AI meant Absolute Idiot, then nearly all work done at my workplace is being done by AI... Unfortunately I am the prime contributor
PCMR is one hell of an echo chamber.
Most people don't mind AI as long as it isn't AI art.
Are you automated AI?
@DeepThinker193 being an idiot is pretty much automatic for me
It can also mean Another Indian
@dqskatt AI is freaking awesome! Once AI can replace our braindead corrupt government, we will ALL be better off as a society.
So the whole industry except Nvidia is losing? Just great...
Maybe as far as the gaming market goes, but other than that AMD just reported a quarter with record revenues and a very healthy profit.
@swdev245 AMD is still far from Nvidia.
If the PC market goes down, everyone goes down, except Nvidia, because of AI stuff.
@diomedes7971 Doubtful. Console sales only account for something like 10% of their current revenue, and since console revenue has been declining each year (typically these contracts build in a yearly per-part cost decrease, so that's to be expected), I doubt they make much off them anymore. Most of their income has historically come from their data center and embedded divisions.
It's not the industry losing, it's the economy collapsing. If people don't have money, they won't buy anything. Nvidia is more of a necessity for modern computing, so the richest people will still be buying; everyone else? Holding on to whatever they have and withering until the economy stabilizes.
Well, I helped AMD this quarter by buying an RX 7900 XT.
I did my part and got a 7800 XT.
I did my part and bought a 7700 XT.
@veltriix Guys, let's not shill for a big company. I'm on an AMD GPU too, but I don't really give a fuck beyond the better value.
@ I agree. I had been on Nvidia since 2010, but this year I couldn't do it; I had to buy the better value, and I don't care about RT.
Cringe AMD fanboys, Jesus Christ.
I got so effed by AMD. Literally at 4:10 PM I bought $50 in stock. The crash happened at 4:18 PM XD. I lost a whole pizza's worth of money! I'm ruined!
Weird time to buy isn't it? Unless you were expecting the announcement and hoping it would go up instead because of it?
Time for some vegetables
😶
It will come back. Let it ride.
@MoonBunnyLovers Weird time to buy? The quarterly earnings were the reason to buy. EPS was positive and earnings were positive; just a shame there wasn't more, I suppose.
Did they expect sales not to crater when they didn't release any new gaming products and didn't drastically lower prices during the quarter?
It's funny hearing about Apple's memory bandwidth "improvements" this gen, considering they cut the M3 Pro's bandwidth down 25% from the M2 Pro. Now they're touting a 75% gain over the M3 Pro. Sounds to me like they never needed the cut in the first place.
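For context, a quick back-of-the-envelope check of that claim, using the commonly cited spec-sheet figures (assumed here, not taken from the video) of roughly 200 GB/s for M2 Pro, 150 GB/s for M3 Pro, and 273 GB/s for M4 Pro:

```python
# Rough arithmetic behind the comment above. The bandwidth figures are the
# commonly cited spec-sheet numbers and are assumptions, not from the video.
m2_pro, m3_pro, m4_pro = 200, 150, 273  # GB/s

cut = (m2_pro - m3_pro) / m2_pro        # ~0.25 -> the "25% cut"
gain = (m4_pro - m3_pro) / m3_pro       # ~0.82 -> in the ballpark of the marketed ~75%
net = (m4_pro - m2_pro) / m2_pro        # ~0.37 -> the actual gain across two generations

print(f"M2 Pro -> M3 Pro cut:  {cut:.0%}")
print(f"M3 Pro -> M4 Pro gain: {gain:.0%}")
print(f"M2 Pro -> M4 Pro gain: {net:.0%}")
```

In other words, the headline generational gain looks much bigger partly because the previous generation's cut lowered the baseline.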
Next to 'enshittification', there is the 'encheapenization'.....
@@GroggyGrognard lol the power couple for large companies.
The term "AI" has become such a source of fatigue now.
Just like crypto and NFTs, it will die down. Most of it is just hype, and only a few companies like Nvidia will benefit from the craze until something revolutionary comes along.
@GurksTheGamer Nope, it isn't nearly the same as NFTs or crypto, because it has genuine use cases. Still annoying though, terribly annoying.
@@yassir-5605 we shall see
@GurksTheGamer There is no "we shall see". It has real use cases that millions of companies are incorporating now and that all their users want. There is nothing "we shall see" about this.
@@onthegrid6933 give examples of such use cases
The problem with not having USB-A is that you can still buy off-the-shelf external storage that's USB-A for a fraction of the price of what's USB-C out of the box. Even cameras like GoPros ship with a USB-C-to-A cable. Products need to support USB-C computers, and computers need more USB-C ports.
This. There are so many options that presume USB-A by default that it's not even funny. I have a recently built system; it only comes with 2 USB-C ports, while it has tons of USB-A. Same with the products that plug into those ports.
I think Reece just lives there now. In his Elmo outfit. Eating a giant flaming Cheeto.
Brett: checks watch
Me: he's actually wearing a watch!
I've always said it this way: if AMD wants to gain some market share, they need to target not "gamers" (their cards are already good for that purpose) but "other users". For a small example, NVIDIA has a stranglehold on streaming and VTubing: a lot of tools actually only support NVIDIA cards, and even when they have AMD counterparts, the NVIDIA ones tend to just run better overall. This is what "forced" me to grab a 3080 Ti when I actually would've been okay switching to an equivalent AMD card and saving a ton of money.
No wonder Google eats so much RAM; AI tends to make really unoptimized code.
Brett Looking like a Hot Cheeto Pharaoh @13:20
I think AMD's gaming sector is just a reflection of the general consumer's wallet. I feel like most people that buy AMD are mid to low income, which is the demographic that's hurting the most right now. In the US, anyway.
That's how the gaming market is in general. In the past, even if GPU sales were low, console sales would save AMD's revenue. Now even in the console space gamers aren't buying much. This also affects Nvidia, but since Nvidia has CUDA and AI and properly supports its gaming GPUs with those features, it can offset the low sales from gamers with people buying GeForce cards for work-related purposes.
People who own PCs with AMD processors are mostly on low to mid incomes. They thought Intel would be completely crushed by AMD forever, but the truth is AMD's market share doesn't really hurt Intel at all; Intel still dominates the x86 market. Intel's new fabs are still under construction and not yet ready to build future generations of processors, which is why Intel has no choice but to temporarily use TSMC chips until its own new fabs are complete.
I understand the hate for generative AI artwork in a professional setting, but that's about it. I use multiple AI models to create tables out of PDF documents and reshape data into readable formats all day, and holy cow, it saves so much time.
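As a rough illustration of that kind of workflow, here's a minimal sketch, not the commenter's actual setup: it assumes the pypdf and openai packages, an API key in the environment, and a placeholder model name.

```python
# Minimal sketch: pull the text of one PDF page and ask an LLM to
# restructure it as CSV. Assumes `pypdf` and `openai` are installed and
# OPENAI_API_KEY is set; the model name is a placeholder.
from pypdf import PdfReader
from openai import OpenAI

def pdf_page_to_csv(path: str, page: int = 0) -> str:
    # Extract the raw (often messy) text of a single PDF page.
    raw_text = PdfReader(path).pages[page].extract_text()

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Convert the table in the user's text to CSV. "
                        "Output only the CSV, no commentary."},
            {"role": "user", "content": raw_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(pdf_page_to_csv("quarterly_report.pdf"))  # hypothetical file
```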
AMD would see a big jump in adoption if they finally got a native CUDA-style system. Like ZLUDA, but built into the real drivers.
AMD unfortunately has great hardware and lackluster software
AMD already has that. It's called ROCm. In a way, the existence of ZLUDA kind of mocks what AMD has been doing with ROCm.
🙏
@arenzricodexd4409 ROCm still isn't well supported by other software. I used Blender, and the only way to get it to render with my GPU was ZLUDA. Until a normal consumer can buy an AMD GPU, download the normal drivers, and run all the software NVIDIA can, they will stay behind.
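For anyone curious how that support gap shows up outside Blender, a minimal sketch (assuming a ROCm build of PyTorch is installed): on ROCm builds, AMD GPUs are exposed through the same torch.cuda API, so this is one quick way to check whether the stack is actually wired up.

```python
# Sanity check for a ROCm PyTorch setup. Assumes a ROCm build of PyTorch;
# on CUDA builds torch.version.hip is None.
import torch

if torch.version.hip is not None:
    print("ROCm/HIP build detected:", torch.version.hip)
else:
    print("This is not a ROCm build of PyTorch.")

if torch.cuda.is_available():  # also True for ROCm-visible AMD GPUs
    print("GPU visible:", torch.cuda.get_device_name(0))
else:
    print("No GPU visible; falling back to CPU.")
```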
Well, if they expect us to buy a new GPU every year, they better start selling them 75% cheaper.
I think he meant they are generating code that then has to be revised and fixed by humans. So it's probably a lie. I remember them saying all the audio was transcribed by software, and then people figured out they had a lot of cheap workers doing it somewhere around the world. You don't even have to explain what you are wearing, that's right, I don't care, but Reese looks so sweet actually.
7:57 Maybe reconsider picking that monitor up for HDR, since the Q&A on the product page says it only has 384 dimming zones, which might not be enough for decent HDR.
It has 1000 nits. If the 384 dimming zones are implemented well, it's fine.
I'm not making a review here, just pointing out it could be good, just as easily as it could be disappointing.
I have an i9-12900F. I'm not planning to fork out 500-600 euros for either of the new Intel or AMD CPUs; definitely not worth it.
I agree. My 5900X has served me well for the last 4 years and will continue to do so for several more years to come. No point in upgrading every year, especially when it's very obvious we're in the middle of a transition period right now. Chips in 2-3 years will be very different from what they are now, with everything becoming AI.
You can literally sleep on that for like 10 years or more.
WTF is a 12900F? Some sort of Fisher-Price "premium" CPU? You can't even OC that crap to 14th-gen levels.
$540 for a 12900F? Jesus, this thing stinks.
@MrIdiot The 12900F is the OEM SKU: 100 MHz less than the K on turbo/boost. The Alder Lake SKUs do just as well as Raptor Lake once you take the steps necessary to minimize the Raptor Lake defects. Also, no idea where you found that price. Even the 12900K can be found around $200 USD new.
My ex-girlfriend replaced me with a 14-inch vibrator... does that count as A.I.? 😂
Well... if the machine was trained, yes, it's A.I.; if it was not trained, then it's a normal machine.
In any case, the machine did a better job than you, since she decided to keep the machine instead of you.
No. That's AE, artificial electronics, not AI. Doesn't count as AI.
The amount of memory and the memory bandwidth are key enablers of the NPU.
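A back-of-the-envelope sketch of why that is, with assumed numbers (none of these figures come from the video): generating one token of a local LLM reads roughly the whole weight set once, so memory bandwidth, not raw TOPS, often sets the ceiling.

```python
# Rough estimate of the bandwidth ceiling on local LLM token rate.
# All three figures below are assumptions for illustration only.
params_billion = 8        # assumed model size
bytes_per_param = 0.5     # assumed 4-bit quantization
bandwidth_gb_s = 120      # assumed unified-memory bandwidth

bytes_per_token = params_billion * 1e9 * bytes_per_param
tokens_per_second = bandwidth_gb_s * 1e9 / bytes_per_token
print(f"~{tokens_per_second:.0f} tokens/s upper bound from bandwidth alone")
```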
Reese are you a chili pepper for Halloween 😂😭 Best Tech News yet, Happy Halloween UFD Crew! 🎃
Reese should be promoted to co-host, in my opinion. You guys work awesome together, and it could be done even when Reese goes back home to South Africa. Please consider it. Love you guys.
I also hope Intel gets some sales; Intel needs to survive for us to get good CPUs!
I'm an AMD fanboy and I agree.
Intel will be fine. They still sell a massive number of computers to businesses and prebuilt buyers, just like they always do.
The likely reason those CPUs didn't sell is that the retailers weren't actually selling them. They have the listings, but no stock. Same in my country: I can see listings for a 285K at 635 or 645 euros at different sellers, and none of them have stock.
From a sales perspective, Intel actually generates more revenue than AMD.
@RoxyYTP I use a Ryzen CPU and a 7600, but I really want Intel to make more competitive CPUs, like AMD did when they were the underdog.
The way Reese yelled out "AI" 😂
As a dev, I use a lot of AI in my work, but mostly in little bits. I don't generate entire programs, but the code completion from Copilot definitely saves me a lot of time because I'm a slow typer. It's never things I wouldn't have written myself, and never more than 2 or 3 lines. But yeah, we use AI and it does boost productivity, as long as you don't use it to do all your work just to debug it later because it doesn't work.
Helmo from Sesamean Street hahaha love it !! 😁🤣😁🤣
I use Copilot at work for code and I do find it quite useful.
It depends a little bit on what you mean by writing the code, though... It's more like a really good autocomplete.
Like, I need to write an implementation of an interface that I've already written different versions of like four times, so it'll make that way faster.
Maybe it'll help me write the tests.
It's definitely not like I'm telling the AI to go do something and it just goes and does it; it would do a terrible job at that.
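As a hedged illustration of the kind of repetitive interface implementation being described (hypothetical names, not from the video), this is the sort of boilerplate an autocomplete-style assistant tends to fill in after you type the class line:

```python
# A trivial "fifth version of the same interface": predictable, low-risk
# code where autocomplete-style suggestions shine. Names are hypothetical.
from abc import ABC, abstractmethod

class Exporter(ABC):
    @abstractmethod
    def export(self, rows: list[dict]) -> str: ...

class CsvExporter(Exporter):
    def export(self, rows: list[dict]) -> str:
        # Join the dict keys as a header, then one comma-separated line per row.
        if not rows:
            return ""
        header = ",".join(rows[0].keys())
        lines = [",".join(str(v) for v in row.values()) for row in rows]
        return "\n".join([header, *lines])

print(CsvExporter().export([{"gpu": "7800 XT", "price": 499}]))
```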
I work in telecoms, and as a pricing manager I can't use ChatGPT or other LLMs at work. Sometimes I'll throw a complicated formula at it, but generally speaking I use it more privately.
When I see your "hotdog" costume, I'm just imagining it with Egyptian Pharaoh headdress textures.
Intel: Nobody is buying chips for DIY Gaming
AMD: Gaming Revenue is down
Is anyone buying anything gaming related?
The entire industry except for Nvidia is down.
Hell Mo and Sesamean Street are just a great set of opening jokes. Happy to have a break from Brettfast for some Hot Cheetos.
I think Brett said sesameme street 😅
@jimbodee4043 Just take the context and you'll get the Sesamean Street bit. You heard Reese say Hell Mo, right?
14:40 Looking like a hot Cheeto Pharaoh!
I'm a truck driver. 0% AI. Not that they're not trying.
_'Full USB'_ will mean USB A to a lot of people for a long time
Yeah, pretty sure that's what the commenter meant. And I was just being a jackass with the dongle comment 😉 I'm satisfied with lots of USB-C ports, honestly.
I love my base MacBook Air M1 with only an upgrade to 16 GB of RAM. It's the perfect light work machine. Now that all of the Macs get a minimum of 16 GB, I'm likely to actually buy a mini, or poke the people around me who have Intel MacBooks to finally upgrade.
I'm a janitor/handyman. So... I'm pretty safe unless companies want to buy an automated zamboni thing and then neglect everything else... Oh god I'm screwed.
I love the concept of USB-C, but the implementation has been so terrible. And it's the kind of terrible that was easily avoidable.
People are looking for value.
Too much selection. Like cars. Tesla builds five: the S, 3, X, Y ("S3XY") and the Cybertruck. That is all. Everyone knows what they need/want. AMD needs to do the same: just a few choices, depending on what you need.
I really hope it wasn't a paper launch, however likely, and instead there are very sad scalpers crying over piles of 285K boxes they can't sell.
If scalpers are buying high end Intel, they're just not paying attention. Whatever bad happens to their bottom line is well deserved.
75% building design, yard design, data distillation, MTG deck building, system optimization.
I work as a mental health professional, so I don't want to speak too soon, but I don't think they can replace us with AI. Unless people who struggle with mental-health-related problems don't care whether they're talking to a real person. If so, I may be in trouble.
Sorry Brett, you get a 👎 for saying Apple more than three times. It's like Beetlejuice, you'll have Tim Skeletor Cook showing up, and that's just trouble.
My dad is a computer programmer, and they use AI to generate code; then all they need to do is proofread it, and it speeds up the coding process by a huge margin.
I’m not gonna lie, the inconsistency with the morning news is starting to wear me down
I never leave any part of my programs or scripts to AI. I find doing it myself easier than telling the AI a series of actions and conditions it has to fulfill in code.
The only time I use AI is to understand a concept I don't know how to work with, in which case I'll learn the process from the AI and then do it myself, knowing what I previously didn't.
The only thing that keeps me from going Apple at the moment is not having a 2-in-1. As an artist, I need a laptop and a tablet, and I do not want to carry multiple devices (be it a tablet and a laptop, or a laptop and a pen tablet). Also, I'm really not a fan of iOS. Until then, I will enjoy my ThinkPad.
15:42 Correction... that's not capitalism, that is consumerism.
Where I work, this quarter we actively stopped most AI usage. I think some server stuff is still using it but only for highly specific tasks.
Their gaming is down because they spent money on CPU and GPU development and they feel those are worth their weight in gold and diamonds.
That 4k monitor seems pretty good for watching video. I really want to have a 27" 4k above my primary display for videos.
I just hope AMD remembers that a lot of their current growth comes indirectly from gaming innovation, and that Nvidia would be nowhere without it.
Well, considering they make more from data centers, they will most likely give that sector priority.
Maybe for Nvidia, but not for AMD. In fact, buying ATI is considered one of the biggest mistakes in AMD's history, one that eventually led to a decade of financial woe for the company. If anything, gaming almost ruined AMD.
@arenzricodexd4409 But now it's one of their biggest strengths. Without AMD buying ATI, we would not have great APUs, especially in laptops. The mobile market would continue to stagnate; instead, it's evolving fast. That may not be true for desktop CPUs, but given that desktop PCs are a niche between servers, workstations, and laptops, I don't think they have to worry.
@arenzricodexd4409 AMD iGPUs are still an important differentiating factor vs. Intel CPUs; that's why they got the console market and why they successfully entered the handheld market, and in the era of AI, that GPU expertise might turn into a cash cow if they play it right.
@grandsome1 Yes, they got that market, but my point is that no matter how good AMD's GPUs are, the console and handheld market can't really generate big revenue. And over the last few quarters, AMD's gaming revenue has been getting worse. AMD usually pulled in something like $1.5 billion on average each quarter, maybe $1.2 billion in a bad one. But in Q1 this year the revenue dropped to only $922 million. People thought that was bad, and yet the revenue continued to drop in Q2 ($648M) and Q3 ($462M).
Love this CHEETOS 🤣🤣🤣
Sesa-mean Street 💀
I wonder if we're also seeing the failed 9000 series launch reflected in those sales figures?
Good morning and Thank you for the information Elmo and Mr. Boneless buffalo chicken wing.
Nah man, not a single sale at Germany's largest retailer is embarrassing.
Ryzen is Client, Gaming is GPUs and Consoles.
The Cheetos man, turned into a Spanish inquisition nun man at 13:19
I had to stop 5 seconds in so I could get in a laugh session. That Elmo hoodie or onesie is hilarious.
15:49 If you can afford that computer, you can afford an adapter to turn a Thunderbolt 5 port into USB-A, or get a dock. Lose a little function? Maybe, but that's happening with whatever device it would be anyway.
4K 60 Hz mini-LED for $250 is insane, despite the low refresh rate.
Running a 12600KF; was running an R7 5700G (changed due to audio issues, thinking it was the APU causing them). Honestly, performance-wise, I hardly notice. As always, go for the best price-to-performance.
Won't be upgrading the CPU for another 3-5 years.
Do want my 3060 swapped out for a 5060/5070, though.
I still want more dongles. More dongles for all! 🔌👈😎👉🔌
If people weren't excited enough to buy RDNA3, I don't see RDNA4 being much different since they're going to be missing out on the high end. I have to wonder if they're even going to have an 800 series or if this is going to be like RDNA1 and the best we get is the 8700 XT. I think they're going to have to have some impressive increase in perf to get people excited to try to shift that number the other direction. And it's not like the PS5 Pro is going to sell a ton of units since that's only going to appeal to a niche segment of console gamers.
Gamers in general aren't excited to buy anything. Those who already own something like an RX 6600 or RTX 3060 will be good for a few years.
@@arenzricodexd4409 Sure, but it's not really slowing NVIDIA down. That doesn't really explain the lack of AMD GPU sales.
@TheGameBench Because Nvidia gaming GPUs are useful for much more than just gaming. Don't be surprised if 70% of GeForce buyers aren't gamers at all. Now even companies are buying Nvidia GPUs for their official workstation solutions, not just semi-pro work. In China, for example, cloud compute providers ended up buying 4080s and 4090s and modding them with more memory instead of buying the significantly more expensive H20. That's why there was a GDDR6X shortage, which pushed Nvidia to release a 4070 with regular GDDR6. AMD saw Nvidia's success with this, hence why UDNA is happening.
@arenzricodexd4409 I'm only speaking to their gaming GPUs. I'm not even referring to their workstation or AI cards. That being said, AMD GPUs are useful outside of gaming as well. They trade blows with NVIDIA in productivity workloads. It really depends on what you're doing.
😢
I play around with AI code generation, but I have NEVER straight copy-pasted from ChatGPT into a project.
It is great for drumming up ideas or alternate approaches, though.
16:44 A good Thunderbolt dock costs over $100 where I'm from, and I'd need to buy one to use most of my accessories, even a mouse or keyboard. It's not people being stupid and not accepting USB-C; it's not wanting to make a separate purchase just to use a Mac mini I would already have to spend a minimum of $1000 on.
Dude. AMD's video cards have never been better.
I bought a 6950 XT and all my homies switched from Team Green this year. The drivers are awesome and the performance is great.
It's not an issue of how good AMD GPUs are at gaming. Gamers in general aren't buying, be it on PC or console.
@arenzricodexd4409 and?
I'm simply saying it's sad more people aren't aware of how good their cards are right now. Their drivers have come so far, and overclocking is such a nice process now that it's native to the driver.
@backwoods357 And? The news is about AMD's gaming revenue, roughly 70% of which usually comes from semi-custom chips (consoles, handhelds). It's not an issue of low sales because of AMD GPU performance.
@@arenzricodexd4409 in the video he addresses the console situation, as well as talking about the gaming cards. I never said the issue with their sales numbers is entirely or even largely related to enthusiast GPU sales. I merely wanted to point out how good their cards are at the moment as well as how far the drivers have come. A lot of people just blindly buy green because they are conditioned to think it's the only option.
11:00 Only in gaming... When you can 10x your profits by moving from gaming to AI customers... you kinda have to. AMD's biggest problem is that CUDA rules and the open-source framework version is just nowhere near as good. It will take a few more years to catch up and a few more to get people to prefer it.
AMD next needs to get in on ARM CPUs/GPUs... Qualcomm is having licensing issues... Some AMD magic on some ARM IP could make for some real contenders. MS is in the way, though, with their lacklustre, minimum-effort ARM offerings for Windows. Windows is borked. A global system registry? lolz. Self-contained apps? lolz. It could all be done so much better with a new approach. How about UI consistency? How about just one way to do configs? Windows needs to split: Classic & New. Classic for people who 100% must have LTS; New for customers who need cheaper-to-own/run/administer machines.
So big tech is laying off in droves while expanding their data centers. Explains why gaming software/hardware sales have dropped. We broke as hell dawg....
That’s probably why AMD is hiking the price of the 9800X3D by $30
Elmo is the OG hot cheeto
NONE>>> AM I THE ONLY PERSON WHO REMEMBERS WHAT HAPPENED IN THE TERMINATOR MOVIES?
I'm waiting for Google to owe a Googol tbh
Much of the Nvidia "consumer" segment is still smaller data centers buying consumer gaming GPUs. Remove that aspect and Nvidia is doing worse than AMD.
Lmao ya no, we don't want the War of the Five Kings with Valve as the battleground.
No one is going to buy anything until the Black Friday super sales, especially when we are all waiting for new stuff at CES next year, which is... 2 months away.
I mean most of the code is basically just boilerplate in most scenarios.
AMD was doing GREAT in Q4 of 2023. Nvidia was spitting BLOOD and selling NOTHING in the midrange. At Germany's Mindfactory, AMD was outselling Nvidia in both UNITS and AVERAGE SELLING PRICE!! In January, Nvidia was forced to effectively lower prices by about 20%: the 4070 Super ≈ the 4070 Ti, the 4070 Ti Super ≈ a slightly cut-down 4080, and the 4080 Super = the 4080 at $200 less!! With these changes, Nvidia turned the tables on AMD and made the 7900 XT, the 7900 GRE, and the 7800 XT irrelevant, and therefore AMD has been hurting for sales since February of 2024.
I... have the same Halloween costume!
It would seem that nobody is buying an MS or Sony console by the looks of things, which really isn't surprising considering their prices. AMD needs to drop prices considerably instead of trying to match Nvidia on price, and they also need to get their power usage down.
It's funny, I thought the new Intel CPUs were sold out in the US because of the whole thing about them being the first desktop CPUs to have an integrated NPU, but I guess it was just because of the very low stock in the shops 😂
The next person to care about NPUs will probably be the first. AI and NPUs are being pushed way too hard on consumers who don't give two shits.
happy halloween to the cheeto and larry david's op
I'll boost their sales when they finally release the 9950X3D.
I'm both wary about AI being thrown onto Mac and iOS with abandon, and really pleased that the base model MacBook Pro, Air, and Mac mini are the best bang for the buck for productivity and general use. 16 GB of RAM across the board is great.
I know there's the walled garden Apple has wrapped around AI models to run them on-device, but even then, image gen and agent models just make me uncomfortable, top to bottom.
An AMD Ryzen 9 and an NVIDIA RTX 4070 is an absolute beast of a mix.
These two are absolute clowns.
9:00 Why didn't Intel take the opportunity to shake things up and release an iGPU that soundly beats a 780M? If they knew their CPU was nothing 'special', why not give people a reason to hop on the train?
Sea Sam mean Street 😂😂😂
1/4 AI: my bio teacher lets us make up to 1/4 of our paper with the help of AI.
You know, I can't switch from the Steam Deck until any one of these "more powerful" handhelds has touchpads; they're just way too useful.
I just can't play FPS games with sticks. Is that an unpopular opinion?
AI-generated code is great for unit tests. That can easily account for 25%.
And boilerplate code. Even before AI, a lot of code was already done by code gen.
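A sketch of the kind of low-risk, boilerplate-heavy test code people tend to let AI generate (hypothetical function and cases, nothing here is from the video; assumes pytest as the runner):

```python
# Example of boilerplate-style unit tests: a tiny function plus a table of
# parametrized cases. The function and values are hypothetical.
import pytest

def apply_discount(price: float, percent: float) -> float:
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

@pytest.mark.parametrize(
    "price, percent, expected",
    [
        (100.0, 0, 100.0),
        (100.0, 25, 75.0),
        (49.99, 10, 44.99),
        (0.0, 10, 0.0),
    ],
)
def test_apply_discount(price, percent, expected):
    assert apply_discount(price, percent) == expected

def test_apply_discount_rejects_bad_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```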
Before he said Hot Cheeto my brain instantly went to Patrick from Spongebob. But red for some reason. Maybe a skin condition.
Next quarter will be much better for AMD. A lot of people held off buying anything whilst waiting for Zen 5 and Arrow Lake (myself included) and now that both flopped, last gen AMD CPUs and AMD parts in general are flying off the shelves. I finally built my PC last month after seeing Zen 5% due to the 6 free games bundle from AMD and cashback from ASUS.
16:29 can we keep saying dongle? Just dongle all day guys
A single USB-A port (and an SD slot) is all the Mac mini needs.
I have 3 things I need to plug into my Mac:
My network
My display
And my cheap $50 dock so I can connect my keyboard, mouse, and card reader, and no, I refuse to use Bluetooth
Honestly, this is more ridiculous than marketing this for photographers and not including a fast SD slot like the Mac Studio.
Apple is all about taking the research out of the buying experience, and now you expect your users to research which external card readers are good enough for their needs? Honestly, I've been mad about the removal of this slot; it's so easy to have 2 chassis, one with and one without, if it's an aesthetics thing.
The iMac, though: why wasn't the SD card reader on the bottom of the screen on those models?
At least they moved some ports to the front, absolutely love that, but now that the power button is on the bottom, why isn't it on the front?
Heck, Apple, you're so keen on putting important things on the bottom, why not put an SD card reader there, so that when there isn't a card you can't see it, but when you plug one in it sits flush with the front of the aluminium shell?
Yall are too awesome
If only laptops had a keyboard
Laptop keyboards kind of suck, and you can't put the screen at that angle and use the laptop keyboard at the same time. Laptops have trackpads too, but I'd take a mouse over those any time.