TL;DR: It's a stopgap. If an old R9 290 were fitted with the same GDDR5X as a 1080, it'd have 640GB/s of memory bandwidth. As it is, a lowly R9 390 has to make do with only 384GB/s.
Ohhhhh, so 10 gigabits per PIN, not total. That makes a lot more sense instead of listing the memory bandwidth at 10 Gbps. lol I got confused in the benchmark vid. So 450 GB/s makes a lot more sense.
Why should GDDR5X be better than the 512-bit GDDR5 used by AMD? Nvidia never used more than 384-bit in recent generations because it's cheaper that way, but is GDDR5X better than AMD's VRAM in some way?
I'm thinking this should be on the forum, but why does increasing the memory clock speed give better results than increasing the core clock speed in certain games? (Happened in Payday 2.)
You should do a video explaining graphics card instability. I have a crappy Gigabyte GTX 960 and it can't be overclocked AT ALL (if I try, it immediately gets artifacts and crashes), and I've always wondered why.
Maybe this 960 is factory overclocked. My Zotac GTX 770 AMP! is heavily overclocked, and on my specific card there is no fucking way to increase the core clock, but my memory can take an offset of 300MHz. Did you try increasing your voltage level? In MSI Afterburner you have to enable this function ;)
You spelled extreme wrong. Any child from the 90's knows that it is spelled XTREME!
I'm a kid from the 90s and I approve this message
I'm a message from the approve and I 90s this kid
I hurt my potato and I approve this taco
Bryan A
I'm a 9gager and I approve your potato
I'm a child from the 90's and I approve this comment stream
Luke's hand movements aren't smooth, LTT needs 60fps
But that's not cinematic
But for that we go to the movies.
+Lati Sullivan tell that to the console peasants
Zylixel I don't speak to those :P
+potato at least he can afford nice things. Nicer than a Clinton supporter like you...
What is this thing called "counter strike global offensive". Pssshhh must be some weird Nintendo game.
"How to learn Russian in 1 day"
I'm a virgin and this comment triggers me
It's a minigame for candy crush
+Tomshotz im a comment and this virgin triggers me
Its an amazing xbox game, it changed my life
Made the transcript readable. Too bad I can't add it as a subtitle tho...
0:00 When Nvidia announced that its new Pascal consumer GPUs wouldn't be using
0:04 HBM video memory,
0:06 some folks were left feeling a little confused.
0:09 However, that isn't to say that they didn't change anything about their VRAM,
0:14 as these cards are using the also newly released GDDR5X. So what's special about
0:21 it, other than the fact that it has an X in its name, which might be trying to
0:26 remind people of words like exciting,
0:29 extreme or too expensive?
0:33 Well, that might depend on who you buy your components from. So instead let's
0:38 talk about what GDDR5X actually is.
0:41 It's a type of video RAM which does things like hold textures, store images
0:46 in frame buffers and provide the GPU with lighting information. In short, it holds
0:51 the data that your GPU actually has to process so that your computer will
0:55 display your game properly. But the real question is: what's special about
1:00 GDDR5X in particular, and is it going to make your games look any better and run
1:05 faster?
1:06 Well, usually VRAM tends to matter more the higher the resolution your games are
1:10 running at, making it especially important for multi-monitor setups or if
1:14 you're trying to run things with aggressive anti-aliasing or high-res
1:19 texture settings, as these things require more stuff to be held in memory, and with
1:25 games becoming more detailed than ever and higher resolution monitors becoming more
1:29 common and cheaper, quicker memory is also becoming more important.
1:34 Graphics memory has placed an emphasis on wide bandwidth for a long time, but,
1:39 GDDR5X improves upon the existing GDDR5 standard by allowing twice as much data
1:47 to be accessed at one time.
1:49 64 bytes instead of 32, and, while this may not sound like much, we should see
1:54 overall transfer rates of around 10 gigabits per second per pin or even higher,
1:59 and nearly 450 gigabytes per second of overall bandwidth, while
2:04 drawing less power than its predecessor. Pretty sweet, but slow down just a minute:
2:09 what about that fancy new HBM that AMD
2:12 came out with? Why is GDDR5X a big deal then?
2:17 While it's true that HBM, which you can learn more about here, has higher
2:21 throughput and a smaller footprint than GDDR5X, which are both great things on your
2:25 graphics card specifically, GDDR5X should be quite a bit easier, not
2:32 to mention cheaper, to implement than HBM, since it's still quite similar to the
2:37 ubiquitous GDDR5. And although both Nvidia and AMD are heavily rumored to be
2:43 moving towards HBM2 for some of their upcoming architectures, GDDR5X
2:49 might appear on more mid-range graphics cards in the future as well, as the greater
2:54 accessibility of high-res displays and the growing interest in virtual reality
2:58 headsets will make high-throughput VRAM increasingly important. Currently GDDR5X
3:06 is only available on Nvidia's new flagship: the GeForce GTX 1080.
3:10 But don't be surprised to see it paired with more GPUs down the road, and whether
3:15 you're going for GDDR5X or HBM, pay attention to what kind of VRAM is in
3:20 your next card if you want to get into surround gaming, hook up an Oculus Rift or
3:24 HTC Vive, or just really, really like things with X's in them.
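The transcript's bandwidth numbers can be sanity-checked with simple arithmetic: total bandwidth is the per-pin data rate times the bus width. A quick sketch (the 256-bit bus is from public GTX 1080 spec sheets, and the 10-14 Gb/s range is what the GDDR5X standard targets; treat the exact figures as illustrative):

```python
def bandwidth_gb_per_s(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Total bandwidth = per-pin rate (Gb/s) * number of pins / 8 bits per byte."""
    return per_pin_gbps * bus_width_bits / 8

# GTX 1080: 10 Gb/s GDDR5X on a 256-bit bus
print(bandwidth_gb_per_s(10, 256))   # 320.0 GB/s
# The "nearly 450 GB/s" in the video matches the standard's
# higher target rates on the same bus, e.g. ~14 Gb/s per pin:
print(bandwidth_gb_per_s(14, 256))   # 448.0 GB/s
```

So the ~10 Gb/s per-pin figure and the ~450 GB/s figure describe different configurations: the launch card versus what the standard allows.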
ñ
thank you lol
why...just why would you do this
Great, the YouTube sub box has broken again. I got to this video purely by luck.
Worked for me, so idk, YouTube is weird :D
Same, geez, I had to visit their channel. The newest video in my subscription feed is 21 hours old even though there should probably be like 30 new videos there.
+Sezze's Stuff same for me
TF2Maps?
Am I the only one who looks through their subscriptions to find vids because half of them don't show up?
03:50 Woah Luke t-shirt changing color
LinusShirtChangingTips
he also had a haircut
3:30*
How many times did he change shirts and haircut, lol?
HIS FUCKING NIPPLES.
*AMD's HDR as fast as possible.*
+kowaletzki think he meant HBM
I think he really meant HDR, which is part of the new DisplayPort interface, 1.4. The AMD RX 480, for example, is an upcoming GPU which supports this.
***** i did not mean HBM and did indeed refer to High Dynamic Range (HDR)
as I want to know if I need an HDR monitor or if all monitors with a good colour range benefit? (IPS)
***** "as far as i know" that's why i want a as fast as possible :)
+Redline You need a monitor with DisplayPort 1.3+ or HDMI 2.0a/b+, and the monitor must support 10-bit colour depth per channel (over 1 billion colours instead of today's 8-bit depth, over 16 million colours). It increases the colour contrast and brightness, too.
That's it.
you do not put NVIDIA and CHEAPER into the same sentence.........
You just did...
It is 2019 and everyone abandoned GDDR5X for GDDR6
lol
Just thought about that XD
And now there's GDDR6X 😅
@@tylerdurden3722 wonder if GDDR6X will be abandoned for GDDR7 in one or two years since it is not JEDEC standard but made by Micron specifically for Nvidia (like GDDR5X back in 2016/2017)
@@tHeWasTeDYouTh eventually, yes.
GDDR5 is based on DDR3
GDDR6 is based on DDR4
Intel already has CPUs with DDR5 memory controllers in the pipeline.
So my guess would be, that GDDR7 is not far behind DDR5. It would most likely have independent read and write, just like DDR5.
I personally think that Samsung's HBM, that can work without an interposer, will eventually take root in mobile devices and then dominate everywhere else.
GDDR6X is actually quite special. It doesn't really function with just 1's and 0's. It's not binary.
It functions with 0, 1, 2 and 3. It has more states than just "On" or "Off". It uses a quaternary number system. Called PAM4.
Ethernet and SSDs are already using such higher number systems (that are not really binary). GDDR6X takes inspiration from those technologies. This allows more information to be stored/transferred per clock. It would be a shame if this is abandoned.
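The PAM4 point above can be made concrete: each PAM4 symbol takes one of four levels and so carries two bits, doubling the data per symbol versus binary (NRZ) signaling. A toy encoder/decoder (the MSB-first bit-to-level mapping here is invented for illustration and is not Micron's actual line code):

```python
# Toy PAM4 encoder: pack a byte stream into 4-level symbols (2 bits each).
def pam4_encode(data: bytes) -> list[int]:
    symbols = []
    for byte in data:
        for shift in (6, 4, 2, 0):                  # MSB-first, 2 bits at a time
            symbols.append((byte >> shift) & 0b11)  # each symbol is 0, 1, 2 or 3
    return symbols

def pam4_decode(symbols: list[int]) -> bytes:
    out = bytearray()
    for i in range(0, len(symbols), 4):             # 4 symbols rebuild one byte
        byte = 0
        for s in symbols[i:i + 4]:
            byte = (byte << 2) | s
        out.append(byte)
    return bytes(out)

msg = b"GDDR6X"
enc = pam4_encode(msg)
assert pam4_decode(enc) == msg
# 4 symbols per byte instead of 8 binary bits per byte:
print(len(enc), len(msg) * 8)  # 24 48
```

Same symbol rate, twice the bits: that's the whole trick, at the cost of needing to distinguish four voltage levels instead of two.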
So we're slowly transcending to luketechtips?
Video editor on point today!
Higher resolution monitors becoming cheaper?? lmao that's a good joke Luke, you're a funny guy!
When you get everything for free it's easy to be disconnected.
Yes it's getting cheaper, compare today's models with last year's
+bathrobeheroo I don't consider $350 for a monitor cheaper.
For a 4k monitor, $350-ish is very acceptable seeing as they're usually $500+
Cheap: "Costing little money"
Cheaper: "Costing less than usual or expected"
AMD's Gameworks As Fast as Possible.
GDDR6X in 2020 be like : Hello there
Damn, Luke is quick with changing his shirt!
Great timing for this video to be released since my 6 year old graphics card finally died last night 😂
So from 6 year old to highest possible new end? Great choice! xD
Isaax Ikr xD
OK something tells me that the 6 year old CPU will bottleneck your GPU
Dustin M no shit. But it's my main rig right now and I need a dedicated GPU
+Ttomisabeast I figured that you would know already but I posted it just in case
it's awesome to see Luke here :D
3 SHIRTS IN 1 VIDEO? EXTRAORDINARY!
I guess next year's April Fools video will be a Techquickie on an extremely specific topic to spoof videos like this one
Would be interesting to see a techquickie about SSH
Hey! That's some damn good editing and lighting in this video on top of Luke's ever increasingly improved articulation. You guys just keep getting better and better at this. Thumbs up!
yeah, compared to last year's Luke, who was like stored in a freezer, stiff as chopsticks, now he looks more alive. Didn't like him before because of the stiffness
Why do all of the videos that you shoot in front of the green screen still have so much green in the picture? It really irritates me how green Luke's skin looks on the "edges" of his body... LMG... pls fix :D
1080 uses the slowest and cheapest gddr5x there is.
It uses the ONLY gddr5x there is, so its technically the fastest and most expensive version there is fuck boy.
+Joe Bob please stop embarrassing yourself... There are multiple manufacturers of GDDR5X and ofc they have different characteristics. Nvidia uses Micron's version which is currently the slowest there is on the market (10Gb/s).
Luke looks so seasoned.. He's like fuck my life
He explained everything except what actually is GDDR5X 😬
1:39
that knuckle crack 1:14
make a video about " what is cpu bottleneck "
I'm sure they have made one; it might not be called that, but something like it
It's when your CPU is far weaker than your GPU, and thus "bottlenecking" it, which prevents your GPU from ever reaching its full potential.
You could fix that either by getting a new CPU, or just overclocking the hell out of it :P
+I'ma Rob'Oat i think he knows that already
I'ma Rob'Oat thanks alot
+Adon Wullschleger then why would he ask?
I like how Luke turned green when a background was applied to the white surface behind him :)
I know it's a green screen, still funny though.
Anyone else notice how baked he was in this video?
AOC 4k monitor with Freesync and 1ms reaction time only sells for 350 bucks? How is that even possible??
The GDDR5X SGRAM (synchronous graphics random access memory) standard is based on the GDDR5 technology introduced in 2007 and first used in 2008. The GDDR5X standard brings three key improvements to the well-established GDDR5: it increases data-rates by up to a factor of two, it improves energy efficiency of high-end memory, and it defines new capacities of memory chips to enable denser memory configurations of add-in graphics boards or other devices. What is very important for developers of chips and makers of graphics cards is that the GDDR5X should not require drastic changes to designs of graphics cards, and the general feature-set of GDDR5 remains unchanged (and hence why it is not being called GDDR6).
great video! I thought all the new GPU were coming with HBM and this GDDR5x was so unexpected! 8/8 Luke
This episode looks much, much better than the old ones. The animation and video effects are awesome. Did Techquickie hire some new guys?
isn't hbm 2.0 available to amd only due to their partnership?
I am from the future and GDDR6 is available in mid-range PCs
WHAT, Nvidia can use HBM technology? Why would AMD let its competition use a tech that they created????
That doesn't make sense. They could take the lead in market share next year in both the CPU AND GPU sectors with Ryzen and Vega, and having HBM2 be an AMD-exclusive thing would be a much better economic and business decision for them. But hey, I guess they don't seem to be doing so hot
Why did Luke change his T-shirt mid-way, as fast as possible!!
I love the mentions of other LMG videos; the moment they come up I open the link in another tab for when I'm done with the current video. Beats playlists where I've seen some vids already, and I get into the zone of enjoying non-stop LMG content.
Why didn't the 1070 get GDDR5X?
GDDR5X is more expensive than GDDR5. Given this is the first "generation" of GPU they have chosen to implement this on, they had to choose the highest-end single-GPU variant. The next gen will probably have GDDR5X on the xx70 and xx60 models while the xx80 and xx80 Ti have HBM or HBM2.
on gtx 1070 with custom pcb and extra power phases to the vrm, you'll be able to overclock the gddr5 ram to near gddr5x ram speeds
and then you can overclock the GDDR5X to negate that and get the lead again.
+_ Skylake _ oc gddr5 to 11ghz?? dafuq no way. the 8ghz on the 1070 is already near max for gddr5.
+Omar Abdel Aziz nope +700mhz on these chips are doable. that puts it at 9400mhz and g5x is at 10000mhz so the gap isnt that big :)
The editing omg
0:28 someone please make a reaction gif of this
What is it? Nvidia's way to charge more and cheap out xD Waiting on that RX 480 and 490
He can still wait for the 580 Vega GPU coming next year.
EDIT: I'm mistaken.
And HBM isn't at this moment? It will get important within a few years, soooo yeah... It's good that there is production today, because "the new of today is the mainstream of tomorrow" to quote Linus (sortof)
Those prices are very reasonable considering performance, I mean, if you have a job and all. Even on minimum wage you can save up in literally a single month working 25-30 hours a week and have extra left over.
+CaptainZuzlike isnt Vega the 490 and up?
+CaptainZuzlike 580 won't be vega. Vega will be this gen's equivalent of a Fury and Fury X
I have done programming in the CUDA environment. GDDR memory and the like is actually considered the slower tier. A GPU has multiple stages of memory:
1) Global memory, which is what you see in the headlines: 2GB... 3GB and all
2) L2 cache
3) L1 cache
4) Register memory
The fastest is register memory, which is near-instantaneous, and the slowest is global memory. There is also another bottleneck in the overall GPU picture: the transfer from system RAM to GPU global memory. The GPU's architecture is not independent like the CPU's; it depends on the CPU, and it only runs once the CPU transfers data and instructions to it. This PCI Express link is the biggest bottleneck in GPU performance. But when I heard that hybrid memory is arriving, where the CPU and GPU share the same memory address space, it sounded great, as it addresses this problem, though I don't know how far that technology has evolved. I hope it does. Code needs to be rewritten to take advantage of it, but that can be managed in the libraries alone so that the end dev is not affected.
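The PCIe bottleneck described in this comment is easy to put into rough numbers: moving a buffer from system RAM to global memory over PCIe takes an order of magnitude longer than reading it from VRAM. A back-of-the-envelope sketch (the ~16 GB/s PCIe 3.0 x16 and ~320 GB/s GTX 1080 VRAM figures are ballpark assumptions, and real transfers add latency and overhead on top):

```python
def transfer_ms(size_gb: float, bandwidth_gb_s: float) -> float:
    """Idealized transfer time in milliseconds, ignoring latency and overhead."""
    return size_gb / bandwidth_gb_s * 1000

buffer_gb = 1.0
pcie_ms = transfer_ms(buffer_gb, 16)    # host -> device over PCIe 3.0 x16
vram_ms = transfer_ms(buffer_gb, 320)   # device-local GDDR5X read
print(f"PCIe: {pcie_ms:.1f} ms, VRAM: {vram_ms:.1f} ms")  # PCIe: 62.5 ms, VRAM: 3.1 ms
```

Which is why CUDA programs try to keep data resident on the device and overlap transfers with compute, rather than shuttling buffers back and forth every kernel launch.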
this comment tho ! this is totally AFAP
hbm = hearty bowel movement.
Why do YouTube ads come to me in French? I am not a Frenchman, and I don't live in France.
+C Smith O-O
+Stuart Mortimore yes
YouTube is not completely reliable sometimes حيدر الشامي
I really liked that keyboard. Too bad I need to register to surf on Massdrop. Not doing that.
Do AMD modules (like with the FX8350 8-core CPU) !
what?
What are you saying?.....
+Tyler Weigand He wants him to talk about the modules AMD CPUs use, which essentially is two small cores making a large core, making AMD 8 core CPUs really 4 core CPUs. It's similar to hyperthreading but theoretically better.
Tyler Weigand At the time, I didn't quite understand what you were saying.
Now I realise you were asking that they explain AMD's module architecture with Bulldozer.
Long story short: one floating point unit shared between two integer cores. Slightly longer: that FPU is designed so it can achieve higher throughput by taking a single wide command, splitting it into two halves, processing them separately, and then recombining them; not the conventional idea of multithreading.
Video starts at 0:00
Thank me later.
My prediction: the 1080 Ti will have HBM2 memory. IF NOT, then the new Titan X of the 1000 series will. Either way, if the 1080 Ti does not have HBM2 memory, I won't buy it, unless the performance of the 1080 Ti is powerful enough regardless of the HBM2 memory.
Luke has become a way better performer! Look how he has improved! Great performance, mate! I am not even talking about Sebastian. You would think that Linus had the ability to be natural on camera from birth, though he also developed that ability along the way.
anyone realize luke's shirt is changed?
While you guys were talking about how GDDR5X is better than HBM, the pic of a GPU shown in the background has 2 PCIe sticks or whatever they're called.
What GPU is that which uses 2 PCIe slots??
um, I'll help you, the ports in the upper side are SLI ports, not PCIe. So the card has a single PCIe port.
OVR ohhh... okay... so that's where the SLI bridge is connected?
Yes indeed.
OVR thanks a lot brother :)
lmao
3:28. shirt changed.
lumminatiu confirmws.
If the transfer rate doubled, shouldn't it then be called GDDR6?
That would require a new memory type (node, transfer signals, etc.). This is GDDR5 but with 64 bytes sent instead of 32 bytes during an access, similar to PCIe versions.
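That 32-byte vs 64-byte figure follows from the prefetch architecture: bytes per access = prefetch depth × per-chip interface width. GDDR5 uses an 8n prefetch on a 32-bit chip interface; GDDR5X doubles that to 16n. A quick sanity check (a sketch of the arithmetic, not vendor code):

```python
def access_granularity_bytes(prefetch_n: int, interface_bits: int) -> int:
    """Bytes fetched per memory access = prefetch depth * interface width / 8."""
    return prefetch_n * interface_bits // 8

print(access_granularity_bytes(8, 32))   # GDDR5:  8n prefetch -> 32 bytes per access
print(access_granularity_bytes(16, 32))  # GDDR5X: 16n prefetch -> 64 bytes per access
```

Doubling the prefetch is what lets the per-pin data rate double without the memory core itself running twice as fast, which is also why the rest of the GDDR5 design could stay largely unchanged.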
Here's a question and idea for a possible future video
Is it possible to limit the VRAM of a video card and/or the speed of the VRAM? The reason I ask is because I'm wondering how much faster the GPU itself of the 1080 Ti is vs the 1080. We already know the 1080 Ti uses GDDR5X VRAM and there's a bit more of it on the Ti vs the standard 1080. My goal is to see how much faster the actual GPU die is.
To be clear, I already own a 1080 Ti (Gigabyte Aorus Gaming, non-extreme edition cuz money), but I'm just curious how it would stack up on a more level playing field vs its non-Ti counterpart.
On HBM: "Yeah, the R9 Fury series already has that, but it is rumored to be on Pascal in the next generation and, you know, the high end of today is the mainstream of tomorrow!"
On GDDR5X: "Look at the Nvidia graphics cards featuring this all-new and nice memory type!! We even give you a spec sheet of the GTX 1080, because you need that to know what GDDR5X is all about."
Talk about bias.
it's Rudolph!
Too bad that particular monitor he showed has horrible input lag....
You could have improved the organization of your presentation; it may be confusing for people who aren't tech-savvy and watch your videos. In the video, you kept asking the questions (What is GDDR5X? What's special about GDDR5X in particular? etc.), but then the next thing you talked about was a definition of regular VRAM and what regular VRAM does before actually answering the question. Viewers expect the answer after the question, not after a whole bunch of info dump. The video would have been a lot better if you had first described VRAM and then gone into talking about GDDR5X specifically.
Graphics cards with GDDR5X, GDDR5, or GDDR6 all use PCI Express, so they work in the same motherboard.
For example, an ASUS Crosshair V Formula-Z works with anything from a GTX 1080 to a GeForce RTX 2080 Ti or a GeForce RTX 2060.
There's a GTX 1080 for sale on Newegg (US) for $480. Can anyone tell me if there's any information on whether AMD will release a GTX 1080 (non-Ti) competitor? I know they will have Vega to compete with the 1080 Ti, but honestly that's a price range I have zero interest in entering.
Same socket though, right? You won't have to change the motherboard?
This man's voice is extremely irritating, especially when your Bluetooth speaker is at the absolute furthest of range from your laptop and his hoarse, halting, raspy, drunk-step-father-trying-to-talk-nice-to-the-step-daughter-he's-abusing-nightly-esque vocal cords are pushing your ear drums and patience to the max. It's not just the speaker, either - play this anywhere, close your eyes and listen. Good luck making out what he's trying to say as he drowns in each word he spits. Something tells me that if the camera isn't a good 15 feet away it will get soaked in gin-infused spittle.
TL:DR It's a stopgap.
If an old R9 290 were fitted with the same GDDR5X as a 1080, it'd have 640 GB/s of memory bandwidth. As it is, a lowly R9 390 has to make do with only 384 GB/s.
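Those numbers check out if you do the math: bandwidth is just bus width times per-pin data rate. A quick sketch, assuming the 512-bit bus of the R9 290/390, ~10 Gb/s per pin for GDDR5X, and 6 Gb/s per pin for the 390's GDDR5 (the function name is made up for the example):

```python
# Memory bandwidth (GB/s) = bus width (bits) x per-pin rate (Gb/s) / 8.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

# Hypothetical R9 290 with GDDR5X: 512-bit bus at 10 Gb/s per pin.
print(bandwidth_gbs(512, 10))  # 640.0 GB/s
# Actual R9 390: 512-bit bus of GDDR5 at 6 Gb/s per pin.
print(bandwidth_gbs(512, 6))   # 384.0 GB/s
# GTX 1080 for comparison: 256-bit bus at 10 Gb/s per pin.
print(bandwidth_gbs(256, 10))  # 320.0 GB/s
```

Which also shows how AMD's wide 512-bit bus partly made up for slower memory, as another comment here asks about.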
What is blast processing? My Sega Genesis has it
Ohhhhh, so 10 gigabits per PIN, not total. That makes a lot more sense than listing the memory bandwidth at 10 Gbps. lol, I got confused in the benchmark vid. So 450 GB/s makes a lot more sense.
I like Xs as much as NCI. hold on, they put an X there just because it sounds cool. NCIX!
I think the 1080 Ti will have GDDR5X but the Titan(?) will have HBM 2.0 to set it apart. I may be wrong, but a man can speculate, can't he...
As slow As safety
by
techslowly
And where is the answer to the video's topic? What is GDDR5X?
You only explained what regular RAM is doing...
Why should GDDR5X be better than the 512-bit GDDR5 used by AMD? Nvidia never used more than 384-bit in recent generations because it's cheaper that way, but is GDDR5X better than AMD's VRAM in some way?
Luke's Rudolph nose. New t-shirt though
So if you play a game with low graphical demands, like Minecraft, on GDDR5X, will it run higher than 1080p60?
So it does NOT have High Bandwidth Memory, but it does have higher-bandwidth memory... Seems legit.
Wow, Luke actually said "architecture" instead of "architexture". Celebrate, folks!!
0:33 - EXPLORE! EXPAND! EXPLOIT! EXTERMINATE! EXTRUDE! EXPLAIN! EXCLAMATION MARK!
I'm thinking this should be on the forum, but why does increasing the memory clock speed give better results than increasing the core clock speed in certain games? (Happened in Payday 2.)
I still have 4 GB of GDDR3.
1:01 it will make your game look better :D
Seriously, Anything below 60 FPS is just peasantly....
I can haz more FPS pls?
That's great and all, but PCI-e 3.0 at 16x is still only ~15GB/s. That's where your bottleneck will be.
I think those LED lights you use make him look more pinkish, like his nose, mouth and palms.
This should have been called GDDR5Xplained.
You should do a video explaining graphics card instability.
I have a crappy Gigabyte GTX 960 and it can't be overclocked AT ALL (if I try, it immediately gets artifacts and crashes) and I've always wondered why.
Maybe this 960 is factory overclocked. My Zotac GTX 770 AMP! is heavily overclocked, and on my specific card there is no fucking way to increase the core clock, but my memory can take an offset of 300 MHz.
Did you try increasing your voltage level? In MSI Afterburner you have to enable this function ;)
The only benefit of GDDR5X is that it is bad for mining performance and the miners avoid cards with it.
I just noticed that Techquickie is a separate channel. I always thought that it was just a playlist in the LTT channel.
I think Techquickie would generally be a much better channel if there was some background music.
What's headphone sparkle?? Please make a video about this topic.
I swear either they're not sleeping well. Or they're on drugs. >.> just looking out for you guys.
screwed up the color correction?
You wanna know something ironic? The ad I got before the video featured Linus...
Why is everyone so hyped about HBM2?
if the core is shit the memory won't make any difference
Wow they barely talked about AMD. Totally sponsored by nvidia.
Those CS:GO Asiimov key caps look really cool
People weren't "confused" that we didn't get HBM2. We were disappointed!
FIRST.
..............................
Well shit! I just bought a 980ti in November
nah ill wait for custom msi
Ummmmm no
A rip-off edition 1080 is $700, not $500. Why would you even buy the rip-off (Founders) edition? It's just stupid on every possible level.
Dude I got my GTX970 in February
The 980 Ti overclocks more than the new cards anyway... if you're watercooled, you're better off with the 980 Ti... at this stage.
+redback209 Uhmmm, no. A heavily OC'd 1080 is much better than a heavily OC'd 980 Ti.
Isn't Luke kind of yellowish today or is it just my monitor fucking with my eyes?
What's GDR5 and GDR5X that you mentioned a few times in the middle of the video? ;)
Bro! Sleep much! Dem bags under your eyes, you gettin worked by massa!
Rudolph makes a good host. Also press 2 for butts.