Correction: It's Gb/s not GB/s (silly me...) ;)
The Tech Chap Bro, do you even review tech?
ikr! amateur hour over here hah
At least we learnt something today ;)
I find your lack of knowledge disturbing. Unsubscribed, blocked, flagged and reported to the Police.
😜 I'm not kidding.
Thank you for clarifying
Thought this video was like uploaded today, and turns out the industry has been talking about 2.1 for years now...wow I'm late
I hear ya'. I was just looking at TV reviews and they were talking about HDMI 2.1 for "future-proofing", now it seems I'm also late 🤦🏽♂️😅😷✌🏽
GB/s = Gigabyte per second
Gb/s = Gigabit per second
terribly sorry :P
also 1 Byte = 8 bits
apologiz tome I don’t see why not
Boat Hengtrakool noticed that
apologiz tome I know what, stop playing that shit game lol, enter 2017 buddy. Rainbow Six Siege exists as a way better competitive FPS
Surprisingly interesting stuff 😂
Mrwhosetheboss Didn't expect to see you here. Love you and your videos keep it up!
Joosua Anttila :D
Boss is here!
More interesting than your click bait videos of course.. so called "boss"
Hey make video on 2.1 monitors plz 🤩
Oh Tom, stop it ... I am going to burst from all the excitement !!! :P
This video is almost 3 years old and there still isn’t any 4K 120Hz 1000-nit HDMI 2.1 monitor for the XSX and PS5... technology is sooo slow
They need to clear out the junk 😆
USB-C DisplayPort will be over 80Gbps and blow HDMI 2.1 out of the water, but I'm sure we'll see that standard in 10 years
Just give companies time.
Also, there are barely any 4K 120Hz games for XSX/PS5; that would need a ton of power, and the monitor would carry a big price too
I think I read that some high-end 2017 TVs come with HDMI 2.1 ports. Is that the same standard? Are they already supporting the 2.1 cables?
Sounds cool, but I don’t think we’ll notice this until the sending hardware (the console or the PC) is 2.1, the cable is 2.1, and the monitor can interpret 2.1 correctly. All three would have to be 2.1 compatible to really notice a difference
Thanks for the details. That 4K@120fps seems to me like it will be quite important for next-gen VR. PSVR currently has a 1080p screen inside that does 120fps; I imagine that if it's made, PSVR2 will likely replace that with a 4K screen at the same 120Hz. With the resolutions and framerates that things like foveated rendering and checkerboard rendering would allow, along with temporal reprojection, 4K120 signals would be, if not mandatory, at least possible.
Once cable boxes or stream boxes start pushing more live events in 4k, 30 fps is gonna look shitty with live sports
Second half of 2018 is VERY premature given the standard has JUST been agreed upon, so there are no displays, A/V receivers, or graphics/video hardware that will support it. There's likely a new HDCP standard to come with it, rendering the thousands of dollars everyone just spent on fancy 4K gear useless, etc. etc.
Just bought a new 4K RCA TV, but my old VCR/DVD player has NO HDMI. Please tell me how I can use it, as I have no more money
0:57 - this video is totally wrong. You are talking about CHIPSET bandwidth, not CABLE bandwidth. The last cable revision was HDMI 1.4 High Speed. I do HDMI engineering for a living. They have had the same 3 TMDS channels for years and years. Look it up. Cable and chipsets have different revisions. Make a follow up video explaining all your errors.
So I get what you're getting at with the chipsets; however, would there be any benefit in getting a 2.1 cable to swap onto a 1.4 setup, in terms of any transfer speed at all?
Appreciate it, cheers!
@@charlesb7831 no benefit. As op mentioned this vid has so many things wrong.
wtf are u talking about
There will be a new ultra high speed cable for 2.1. I believe that is in the standard.
@@charlesb7831 no, because your TV/console/Blu-ray player needs to be able to process it as well.
Subscribed, despite the MAJOR Mb/MB hiccup. Good on you to address it after. Stay sharp. I'm excited for the future of this channel.
+Kao Saechao thanks 🙂
This is hands down, the best explanation of these cables that I've heard thus far! Thank you.
You forgot to mention that HDMI spec changes are for the ports, not the cable. The cable is backwards and forwards compatible, but the devices with the ports have to be upgraded on both ends for new standards.
This is what I was thinking. The cables are judged by the "high speed" and "ethernet" names, aren't they?
8K movies? I think producers will need a whole server room to edit movies and do post-production
Even MKBHD shoots in 8K once in a while I guess.. maybe I'm wrong.
Damodar Keny Lol, Christopher Nolan shoots higher than 8K
8K and 6K are the two normal (popular) RED camera resolutions. And they're used by top YouTubers. 8K or higher footage allows for cleaner images when scenes need to be cropped or reformatted. When we get our Blu-ray discs, they are always converted versions of the originals, even in 4K UHD.
Filming in 4K and editing in 4K doesn't leave much room for correction. [At least that's what I've heard professionals say]
Why do you think films cost so much to make these days?
Michael Bay destroyed a $250,000 camera filming a scene in "The Island".
That was over a decade ago, when we were just getting exposed to 1080p.
compression moves along with resolution increases. better compression = less data
He is just playing smartass. 1080p was a thing that would never be real some years ago, and now you edit 4K on phones. Ignore smartass comments.
I hate the term "future proof". When tech is ready for me, it's much cheaper. Remember how much HDMI 1.4 cables cost when they first came out? Remember how much 4K TVs cost at first? When the tech was ready for me, it was way cheaper.
Richard Recupero yeah because ur broke as fuck 😂 doesn't mean everyone feels that way or can relate to ur specific financial situation
I wouldn't mind just getting some of the present right now; my 2018 4K i7 laptop with a 1060 doesn't even have HDMI 2.0..
_"i hate the term future proof. "_ .. RIGHT, because when the future arrives it's based on a whole new crop of technologies that no one even remotely anticipated.
@@KakaOfTheRealMadrid Even when you have thousands and thousands, tens of thousands, in your bank, it's still hard to buy these things; you feel empty for some reason after you buy them.
Remember when it was just a damn cable?
Will these 2.1 cables work with standard HDMI inputs or do I need to buy a new monitor?
The cables will be backward compatible, but you do not need to buy 2.1 cables if your monitor or TV does not support it.
I just bought the LG C1 55" but still only have a PS4... Do I need, or should I get, the HDMI 2.1 cable?
No, the PS4 only does 1080p 60fps, or 4K 60fps at max, and the console already comes with a 2.0 HDMI cable (or 1.4 if you didn't get the PS4 Pro).
And don't worry, the PS5 already comes with a 2.1 HDMI cable, and HDMI 2.1 is not needed for the PS4.
Sorry to waffle on. But I have a PS4 Pro and a 4K HDR TV with a 4K HDR AV receiver (5.1). Whenever I'd put 4K HDR on I'd notice a faint flickering on the picture, and went through about half a dozen HDMI cables. Anyway, I bought the 8K 2.1 HDMI cables and the flickering was gone, the picture slightly more consistent, and panning the camera around in games was smoother. So basically, if you have an all-4K PS4 setup, get the 8K 2.1 HDMI cables. The normal 2.0 ones are fine for most, but because of 2.1's much higher bandwidth it can handle full resolution and HDR much more easily than the normal HDMIs, as the normal ones are "maxed out" so to speak and can therefore stutter now and again. Sorry to waffle, just thought I'd give my experience with them 👍 cheers
One of the BIG reasons why I purchased the LG C9.
I would do the same but I'm broke for oled tv lol
Yes, same here! Haven't bothered buying the HDMI 2.1 cables themselves yet, as my 2080 Ti doesn't support it and there is no way around that at the moment! Waiting for next-gen GPUs and hoping for HDMI 2.1 support. 4K 120Hz, baby
Haykall I just ordered one myself too... how is it holding up??? I got a 5-year warranty on it because of the possibility of burn-in
Lahiru Kudaligamage I didn’t get a warranty. I did notice serious bleeding and color issues. I called LG and they told me to take it back. No issues after returning the first one. They say to run it through its paces once you get it, and I'm glad I did that with the first one, and now the second one. Zero issues with the second one so far. I absolutely love it. Burn-in is sort of a thing of the past. If you vary your viewing habits (gaming, Netflix etc), you won't have any issues. Plus there are mitigation features built in that help with burn-in. I don't buy warranties as they are a waste of money. I usually just put the full balance on my cc to get an extra year of warranty through my cc. A rule of thumb is that if you don't see any issues during the first year of owning any electronics, especially TVs, you should be perfectly fine.
Haykall ah nice...thanks for the reply...can’t wait to have it in my house
Great channel. I've been watching your channel for over a year now; I went back and watched some of your old videos, but you really are killing it and the quality is amazing. Great job
+jordanmonaghan8 thanks! 😀
Why don't they just call it HDMI 3.0, then? Is it just a change in bandwidth so it's considered the same technology?
Very helpful: short and simple and straight to the point. Will look forward to more vids
So it doesn't improve anything when you're using a Full HD or HD Ready TV with this cable?
Man that’s a skill
and what about HDMI 2.0a and 2.0b?
2.0 is 2.0a, and 2.0b is 2.1; people just like saying 2.0 because it's much easier, and Microsoft called it 2.0b because the cable wasn't finalized yet at the time of the Xbox One X launch.
The HDMI 2.0b standard is not 2.1. The 2.0b standard is an extension of 2.0a, adding support for HDR HLG. HLG is a broadcast variant of HDR, developed by the BBC and Japan's NHK.
I don't know why movies on Blu-ray couldn't all be 1080p @ 60fps, instead of the inconsistent shit we are accustomed to...
Chill Bill exactly
I expected you would change the 2.0 : 48gbps to 2.1 : 48gbps on the transition. You delivered!
Can't wait to buy a 1m cable in Currys for about £200
I doubt it'll be much more expensive for the average consumer
Tell that to the sales staff at Currys, who try to sell their vastly overpriced HDMI cables when selling their TVs
Any high-speed HDMI cable will carry this spec; it's got very little to do with the cables. Sure, don't pay a pittance for your cables, but don't pay through the nose either.
Heh, I will wait for the Aldi version at 4 dollars... same quality... it's digital...
kpatrickm 😂😂😂
Why did you use the cables, as they aren't different? It's the ports, until we get to 8K. Then we need a cable upgrade.
Does HDMI 2.1 have enough bandwidth for 4K, 120Hz, 4:4:4 chroma, 10-bit HDR and variable refresh all at the same time?
Most likely
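A back-of-envelope check suggests it does fit. This is a sketch, not an official figure: it ignores blanking intervals (which add a few percent) and assumes HDMI 2.1's 48 Gb/s FRL link with 16b/18b coding overhead.

```python
# Rough bandwidth check for 4K @ 120 Hz, 4:4:4 chroma, 10 bits per channel.
# Assumptions: 48 Gb/s FRL link rate with 16b/18b coding overhead;
# blanking intervals ignored, so real figures run a few percent higher.

def raw_bitrate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Active-pixel video bitrate in Gb/s, no blanking, no compression."""
    return width * height * fps * bits_per_channel * channels / 1e9

rate = raw_bitrate_gbps(3840, 2160, 120, 10)  # 4K120, 10-bit, 4:4:4
payload = 48 * 16 / 18                        # usable payload after 16b/18b coding
print(f"{rate:.2f} Gb/s needed, {payload:.2f} Gb/s available")
# -> 29.86 Gb/s needed, 42.67 Gb/s available
```

Even with blanking added back in, that fits with room to spare, and VRR itself costs no extra bandwidth, so the answer should be yes.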
One of the best YouTube channels, keep up the good work bro.
I really hope everyone pushes for HFR in movies, especially action movies
Awesome video, thanks for the update brother 🤙🏿 keep up the solid content 😁
Avery Ross how did you make the hand brown?
It's finally here with the new GPUs and LG TVs in 2019.. good stuff
Strange to be talking about 8K when Freeview, Sky and Virgin broadcast at 1080p here in the UK.
So will DisplayPort get an upgrade, or will it just fade away?
All I care about is: will HDMI 2.1 allow 4K HDR @ 120fps in 4:4:4 chroma?
You're mixing acquisition (4:4:4) with playback
Or is the angle deceiving? I have a 55" TV, and it's not that wide! Please tell me what the hardware was, if it's still available, or have the laptops taken over?
Nice, when do pre-orders start?
How wide is your TV, or is it old IT hardware?
So it still cannot support 144Hz?
xXSilentAgent47Xx only 120
It should work, depending on color depth and chroma subsampling. According to Wikipedia, compression potentially cuts the bitrate into thirds, and 4320p30 with 8 bits per channel and no chroma subsampling takes 24.48Gbps, which does not need to be compressed. As such, a compressed 4320p144 would take 40.88Gbps.
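The arithmetic above can be sketched in a couple of lines. The 24.48 Gb/s base figure and the roughly 3:1 DSC compression ratio are taken from the comment (via Wikipedia), not measured:

```python
# Scale the cited 4320p30 figure (24.48 Gb/s: 8K, 30 Hz, 8 bpc, no chroma
# subsampling) up to 144 Hz, then apply DSC's roughly 3:1 compression.
# The small gap vs the 40.88 Gb/s quoted above is because DSC's effective
# ratio isn't exactly 3:1.
BASE_4320P30 = 24.48                          # Gb/s, figure cited above
uncompressed_144 = BASE_4320P30 * 144 / 30    # scale frame rate 30 -> 144 Hz
compressed_144 = uncompressed_144 / 3         # assume ~3:1 DSC
print(f"uncompressed: {uncompressed_144:.2f} Gb/s")  # -> 117.50
print(f"compressed:   {compressed_144:.2f} Gb/s")    # -> 39.17
```

Either way the compressed figure lands under the 48 Gb/s link rate, so 4320p144 with DSC looks plausible.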
I know your initial message was sarcasm, but most people into display tech have been awaiting this and DP 1.4 for YEARS; it's almost ridiculous how long this has taken, and the certification centers will take months to get up and running for FULL HDMI 2.1 product certification. The spec has been published, yes, but how are they achieving any of the other things outside of the expected gains from the bandwidth expansion? Things like ALLM/QFT/QMS... an explanation of these and how they're going to work exactly would be very helpful.
Oh, and dynamic-metadata-driven HDR isn't "like" DolbyVision. The DolbyVision spec itself demands dynamic metadata for fully certified DolbyVision products (of which not one is even CLOSE to existing, seeing as the baseline 12-bit panels don't exist today outside of Dolby labs, probably).
The Xbox One X has an HDMI 2.1 port, not a 2.0, doesn't it? I'm sure FreeSync already works with a few games.
Thanks for the explanation. I was looking at a new Yamaha receiver that didn't have 2.1, but I didn't know the difference. Now I want 2.1.
Didn't you say in the video that HDMI 2.1 cables aren't currently available? I searched on Amazon and there are results for them; unless they're lying about their products?
Wondering the same thing; can't find a certified one yet though, so they might be false specs
Scouse ATOM Belkin is the only one so far
Great edit: @2:10 shows HDMI 2.0: 48 GB/s, @2:22 shows HDMI 2.1: 48 GB/s. Nice save.
Hello. I have an Onkyo 636 audio receiver with ARC. I also have an LG TV that supports ARC. I've connected everything properly and set up all software options to the proper settings. Still, the TV is not recognizing the Onkyo or any other devices plugged into the Onkyo. I tried another TV with ARC (a Vizio) and still no go. Is it possible that my HDMI cable doesn't support ARC? Is there a special HDMI cable I need? The cable functions properly transmitting video and audio from source to Onkyo and source to TV, but for the life of me I can't get ARC to work. Hopefully it's the cable? If not, maybe the ARC functionality of the Onkyo is toast? Any feedback you could give would be great.
What do you think about Thunderbolt 3? How does it compare with HDMI 2.1?
Thunderbolt 3 can support up to 40Gbps.
Great video, thanks for sharing.
Is my 7115B PS4 Pro even capable of, or patchable to, 2.1?
It's not a big deal for all of us laptop owners who have yet to see HDMI 2.0 going mainstream even though it was released years ago now. Thunderbolt and DisplayPort have far more to offer in the laptop market...
Very few movies are shot in 8K, actually. RED makes an 8K sensor and I think Sony has one too, but neither of these is most popular for movie making. Arri Alexas are the go-to cameras for many projects, and many of them are not even shooting at 4K resolution. That said, shooting in 6K and down-rezzing to 4K or 2K is quite common, both for the purpose of reframing and for being able to deal with low-end exposure and reducing noise in low-light shots, etc.
So even without any hardware change, the cable will make a difference in frame transport and latency from the Xbox One X's 2.0 port to a 2.0 TV?
Actually, the Xbox One X does support HDMI 2.0b/HDMI 2.1
My experience was that my previous HDMI 2.0 cables (for the previous-version Apple TV and 4K TV) had audio/video stuttering problems with the new Apple TV 4K and my 4K TV. Bought the Belkin HDMI 2.1 cables and the problem went away.
What about using USB-C to USB-C over HDMI, like most computers are slowly moving to?
So is it worth waiting for a 2.1 monitor to come out for the PS5?
Hey man, quality tech videos, subscribed
Is the Rocketfish good for the Apple TV 4K connecting to a Samsung QLED?
Is there a true 4K HDMI 2.1 cable you can recommend?
Nice keyboard you got, which one is that?
Please can you make an in-depth review of the iPad Pro 10.5 vs Samsung Tab S4, to know which one is better for me. Thanks
I've been around computers since the early 70s and the upper case 'B' was for Bytes and the lower case 'b' was for bits. So you say bits but you write Bytes. Which is it?
I think some clarification on the cable and the port should be your next video. I remember when 2.0 hit the market and read that the port determines the bandwidth not the cable and a 1.4 cable will support the bandwidth of a 2.0 port.
So all we need is just the cable? It will work in any HDMI slot?
Many thanks for such an informative and concise review. If you have a high-end Ultra 4K set, you need the best! Fantastic uploads, splendid channel 👌
I couldn't find any 2.1 HDMI cables on the US Amazon website??? Which ones do you suggest, please?
Hopefully we will be able to buy HDMI 2.1 converters that keep sync with the GPU, for people who already have 4K sets
Ultra we will
So the problem is, some cables claim to be 2.0; how can we be sure we are getting 2.0 or 2.1 cables? Can we count the pins or something?
When will it come out?
Or is it already out, and where can I buy it?
You got many things wrong: HDMI 2.0 is an adapter spec, not a cable spec; it's defined by the adapters within a device. You also answered a question about 2.0a vs 2.0b: a was the first gen with HDR, and b was the second spec with HLG. All HDMI 2.x needs is an HDMI 1.4 cable and it's fine; HDMI 2.1 needs a new cable, but only for higher than 4K60. Do some research, Muppet.
Why don't you make the video
Lmao, salty over an HDMI cable
He's a Teppum, a backward muppet
Is the Promate HDMI 2.1 good? Anyone have an idea?
You kept saying gigabits but wrote GB, which means gigabytes (lower-case b = bits), which is 8 times as much. Which is it?
Am I right in thinking the Samsung 2017 K8000 will get this with an update, and the XB1X already supports it...
lddlmurphy I believe so; definitely on the X, yes
How can you tell the difference of what cable you have?
So the question is: is DisplayPort still good for PC gaming? With HDMI 2.1 coming out at 48Gbps, that's a good question.
Thank you Tom, much appreciated
I've never actually seen the HDMI version mentioned on the cables or their packaging, on any brand, when I've looked them over in stores. And I did specifically look for versions. Bastards are hiding them, so if you need the higher version numbers you're left guessing and hoping it will be the more expensive cables.
Just wondering which HDCP we'll get with this, messing with your setup? And, as usual, you can't upgrade your setup for this. So it basically means: buy all new stuff.
Catch 22: they make no promises with HDCP.
Will a new Motherboard be required to make use of HDMI 2.1? Thanks for the Video.
What impact does this new standard have on data transmission distance limitations? I ask because it would be nice to run a cable like this 100 feet or more when wiring equipment far away from sources such as a native 4K projector. Thanks.
Do you need your monitor/TV to support 2.1 for it to work, or just the cable? Assuming the TV already supports 120 frames
A cable is just a cable: higher throughput, but still just a cable. So yes, whatever hardware you have, it will have to support the standard.
It would have been nice for you to have shown us how to identify what kind of cable we have, by showing us what to look for. I've got a bunch of HDMI cables, some gold-plated, and I want to know what I've got. For us less tech-savvy folks.
Any difference between a cheap HDMI cable and an expensive HDMI cable for a monitor?
So how will this affect DisplayPort? This may have been answered already.
You said 2.1 supports variable refresh rates, so is it like the Mcable, having a controller in the cable, or what?
Is there any connection between the two, perhaps?
What monitor are you using?
You're a very good presenter, subscribed!
So is there anything out now that supports HDMI 2.1? TVs, monitors?
What's the width of your desk? Ordered an ultrawide, an X34 to be exact. Don't wanna sit close to it.
Is there an improvement in cable length?
Does this have any effect on 4k TV's being able to display movies and games in 3D? Or possibly even 8k TV's if they become standard sometime in the future? I don't know if there are any bottlenecks preventing that now but I am wondering.
You're a good old blighty, Tech Chap! Good video (save for the GB/Gb thingy).
How do I know if my TV will support a 2.1 HDMI cable or only 2.0, or does that not make a difference?
What about USB-C taking over HDMI? Would that be something that could possibly happen, as I've heard the speeds are very high?
So what’s the difference in the cables?
Is 10/100 Cat 5e not used anymore, or is it just for PCs and not for these gaming devices?
Can we convert USB-A to HDMI 2.1 or DP?