To be fair, modern graphics cards usually come with two DisplayPort and two HDMI ports, so unless you want a ton of monitors you should be good without any adapters
@25566 Usually three DisplayPort and one HDMI in my experience, at least currently. There was a short window where it was an even split, but DisplayPort took over years ago on the PC side, with the single HDMI port being there for compatibility and, to an extent, VR headsets. Every so often I'll see DisplayPort + HDMI + DVI on lower-end cards, but that's the exception, not the rule
i give you a thumbs up for making this a 2 minute video instead of 20 mins with swooshing title intros and bloaty narration. you are like this port. awesome.
Don't forget about the begging for likes, subscriptions, and even insisting that people use YouTube's obnoxious notification system. That type of behavior is usually enough to get me to not subscribe in the first place.
the reason this isn't done more often is cuz HDMI sucks (the people that run it, that is): they make you pay a license fee for every HDMI port, and they also blocked AMD's open-source HDMI implementation
It's probably electrically a DP++ port, i.e. one of those common DP ports that you can use a cheap passive adapter with to get HDMI. The problem with those is that you generally only get a pretty low-end level of HDMI, so 1920x1200@60 max, so not really anything you want outside of common office tasks.
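As a rough sanity check of that resolution ceiling: the original dual-mode (Type 1) DP++ spec tops out at a 165 MHz TMDS pixel clock, and a quick back-of-the-envelope calculation shows why 1920x1200@60 is about the limit. A minimal sketch; the blanking totals below are CVT reduced-blanking figures quoted from memory, so treat them as illustrative rather than authoritative:

```python
# Rough pixel-clock check against the ~165 MHz TMDS limit of
# single-mode (Type 1) DP++. Frame totals are CVT reduced-blanking
# timings quoted from memory -- illustrative, not authoritative.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame x frames per second."""
    return h_total * v_total * refresh_hz / 1e6

# 1920x1200@60 (CVT-RB total: 2080 x 1235) -- just fits
clk = pixel_clock_mhz(2080, 1235, 60)
print(f"{clk:.1f} MHz, fits: {clk <= 165}")   # ~154 MHz, fits: True

# 2560x1440@60 (CVT-RB total: 2720 x 1481) -- too fast for Type 1
clk_1440 = pixel_clock_mhz(2720, 1481, 60)
print(f"{clk_1440:.1f} MHz, fits: {clk_1440 <= 165}")  # ~242 MHz, fits: False
```

So 1920x1200@60 sits comfortably under the Type 1 ceiling, while anything like 1440p@60 blows past it, which matches the "office tasks only" characterization above.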
That "pretty low level HDMI" is 1000% on the HDMI consortium. It must be proprietary, it may not deviate from any part of anything, they have you by the balls if you signed the NDAs, etc. You CANNOT produce a DisplayPort connector that outputs HDMI at full 2.1 specs. You CANNOT produce code that implements the spec if it's not proprietary (the way Intel and Nvidia do it). If you break the contract you signed … HDMI is basically awful and needs to go away.
This is my guess too. It just can do the level shifting without an adapter.
@knghtbrd Well, you probably can write such code or make such hardware if you're not in the HDMI Consortium. The thing is, you'd have to reverse engineer everything, which is a bit hard. And then you'd have to survive the lawyers' assault and prove that your reverse-engineered solution doesn't use their classified documentation. You'd also not be able to use the HDMI name and symbols (trademark violation), you might need to skirt around some patents, and court fees might still end you even if you win. tl;dr - you can in China.
@knghtbrd Just get the NDA signed by a hobo, grab the code, then use a third-world company to produce the thing and flood the market with it. The consortium will be able to find the NDA violator begging next to a station somewhere, but that's all. Their lawyers will be screwed, and all the specs and sources will be open; products will come to consumers through channels like Temu, Wish or Ali. The other way is to hire an employee they fired, who knows all about how the newer specs work, then develop a "for use with" implementation...
HDMI and DisplayPort are electrically compatible with simple adapters. Heck, I even have cables that are HDMI on one end and DP on the other. So while this isn't that mind-blowing, it is really cool to see that someone found a way to make a port that accepts both connectors.
HDMI and DVI are electrically compatible (same kind of signal, just a different plug), but DP is NOT using the same kind of signal as DVI/HDMI. The socket in this video has a switch inside that can detect the type of plug and output the right signal
about a year ago i was working for an automation company that used industrial mini PCs from a company called DFI. they had these little combo ports and i was just as surprised! only bummer is it didn't work with certain monitors; never figured out if it was a DFI-exclusive problem… good video!
DP is natively compatible with HDMI, btw. Any DP port can do HDMI, as long as you give it a passive pin-to-pin adapter (with a resistor on the adapter-detect pin to tell the GPU the port is in HDMI mode). So what I assume is different about this one, besides having shaved off some of the plastic of a DP port to fit an HDMI cable, is that it has a switch that detects if the cable plugged in is slightly smaller (aka an HDMI cable) and automatically does the switching that a passive adapter would normally have to do... Kinda funky tbh. Tho as others mentioned, DP is an open standard while HDMI requires licensing fees, so the passive HDMI compatibility on a DP port always depends on what the GPU manufacturer/driver has built in and isn't necessarily linked to the DP spec of the same port... (and some might opt to drop HDMI compat to save costs, in which case an active adapter is needed)
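The dual-mode behavior described above can be sketched as a toy model: the source watches one detect line, and flips its output signaling accordingly. All names here are invented for illustration; a real implementation lives in the GPU's PHY and driver, not in Python:

```python
# Toy model of dual-mode (DP++) source behavior: if the adapter-detect
# line is pulled (as a passive DP-to-HDMI adapter's resistor, or this
# combo socket's mechanical switch, would do), the source drives
# TMDS (DVI/HDMI-style) signaling over the main lanes instead of
# native DisplayPort. Names are invented placeholders.

from dataclasses import dataclass

@dataclass
class PortPins:
    adapter_detect: bool  # True = detect line pulled by adapter/switch

def select_signaling(pins: PortPins) -> str:
    if pins.adapter_detect:
        return "TMDS"   # level-shift and reuse the main lanes for HDMI
    return "DP"         # native DisplayPort link training

print(select_signaling(PortPins(adapter_detect=True)))   # TMDS
print(select_signaling(PortPins(adapter_detect=False)))  # DP
```

The point of the sketch is just that the "adapter" in a passive cable is mostly a signal to the source; the actual level shifting and mode switch happen on the GPU side.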
Not so fast. Not ALL DP has that functionality! The very first and the cheapest DP ports needed ACTIVE adapters to convert to HDMI, as the standard did not include those features initially.
@pepeshopping I did mention that towards the end. The DP spec says that the port *can* carry HDMI directly, but because HDMI is a paid standard and DP is not, they cannot require support for HDMI, so to save costs cheaper devices can opt to only support active adapters
you need to specifically implement the optional dual-mode DisplayPort feature for compatibility with passive adapters. at this point it's so ubiquitous that manufacturers don't add the logos for it anymore, but it is not something DisplayPort has by default, and it does require extra hardware
actually, Type-C to DisplayPort adapters (the ones that use alt mode, rather than the weird DisplayLink ones that don't work right except on Windows) don't actually provide this, so if you then daisy-chain them with a passive DisplayPort to HDMI adapter or cable (aka all the DisplayPort to HDMI cables in existence), they don't work
If the laptop already has DP, it would be more expensive to implement this, as you'd have to pay a fee. If the laptop had HDMI, implementing this would limit the HDMI to an older version of HDMI, meaning max 1920x1200@60 Hz (I'm pretty sure).
DisplayPort is an open standard and HDMI isn't. My guess is the group/company that controls HDMI and collects money from device manufacturers for any device they sell with HDMI would be very against this. They won't allow AMD to implement the latest HDMI standards in Linux because they are more worried about keeping the actual spec secret than having something that works. Intel and Nvidia got around this by putting a DP-to-HDMI chip behind the HDMI port on their GPUs/motherboards, so the GPU outputs DP and the chip converts it to HDMI before it gets to the port. So the only reason this probably exists is that the company that made it was small enough that no one noticed what they were doing before products like this were released.
I am almost sure the HDMI consortium forbids any of its members to even think about it if they want to have HDMI on any of their future products. HDMI is a cancer, just like Volkswagen: they slow down progress so they can deliver the bare minimum for ages
It looks like maybe this sacrifices pin 20 of the DP connector. That pin only provides power to electronics embedded in the cable itself (e.g. active converters) and is unused (like, not even connected to anything) in passive cables. Maybe all the pins in DP++'s HDMI mode are in the same order as HDMI pins?
the ML lanes are at the same location as HDMI's channels, but at the end of the cable the DDC vs AUX and a few other signals swap positions with grounds, which likely makes the autodetection possible. The pin 20 sacrifice is correct: that carries the power to the DP++ adapter, which in this configuration is embedded into the host board. A few muxes and it's done. Very clever indeed! I would say HDMI stole the DVI signalling... and DP stole the HDMI pin layout... until they got mad at each other and the notches were introduced to key the DP plug away (like DP-only, not bi-protocol...) EDIT: looking at the REGO connector linked below by a commenter - they use a mechanical switch for cable type detection :D
as you can see, the pins inside are wider. A small microcontroller can detect which hot-plug pin is active (DP or HDMI), and then you can just flip a multiplexer switch.
0:28 that's just DisplayPort: it can passively adapt to HDMI and DVI, as can HDMI to DVI (with DVI to HDMI being compatible on ports specifically allowed for it). It will pretty much send the signal straight over the wire; you can even get audio out of a DVI port in HDMI mode lmao. I'm pretty sure they're running DisplayPort in HDMI mode with this funky connector, and disabling auto switching turns the feature off. That's why DP is king, being able to act as 3 different connectors with cheap adapters and adapter cables. edit: nvm, this isn't in the core DP spec but it's something very common in anything remotely modern, called DP dual mode
I'll correct you a bit about the DVI port, because there are 3 types of them: the analog DVI-A (the oldest), the combined DVI-I, and the digital DVI-D. They carry different types of signals and take different passive adapters: DVI-D and DVI-I can have passive HDMI/DisplayPort adapters, while DVI-A and DVI-I can have a passive VGA adapter because they output an analog signal. Basically you can have a passive adapter if the source outputs the same type of signal, doesn't matter if it's DP, HDMI, VGA, DVI-x or even Thunderbolt, because these can be directly translated to DP or HDMI or DVI.
eSATA in a combo port with USB are simply 2 different standards with their own different pins, that happen to be similar in size so they can be combined in the same space, but each still has their own format and signals.
The Brenda in finance jab was hilarious, we all know it's true. I say this as a computer engineer; it's my job to make fun of finance/business majors. Well, that and make computers, I guess... but mainly the first thing
This should be on all monitors. If they made the connector twice as high, with one half HDMI and one half DP, that would be good as well. The main point is that if you use one type of connection you can't use the other, so both connectors would feed the "same" monitor input, but you could freely decide whether to use DP or HDMI. Put this on monitors and TVs as well to make it more convenient. (If I happen to use a TV as a monitor I'd use DP instead of HDMI, especially with more than one TV, as the video card has only one HDMI out.)
After thinking about it: it makes sense to differentiate the two connectors, as people tend to try to plug in anything that seems to match (even USB into an Ethernet RJ45...) and then wonder why it doesn't work, just because they don't understand certain differences. At the moment, USB-C falls into the trap of one universal plug type that could do pretty much anything from simple charging, analog audio, USB 2, USB 3.x, HDMI, and Thunderbolt 3/USB 4... The point is: there's a dozen subtle symbols, barely recognizable on the devices and unknown to most customers; and on the other hand, in the stores you find a million USB-C cables, 90% of them stupid outdated "high-speed 480 Mbit/s USB 2 charger" cables, with no hints on the cable itself about what it can do or why some (TB3/4, 100W etc.) are 10x more expensive than the funky colorful ones... I mean... WTF? The people behind USB did a perfect job of multilateral confusion. DisplayPort cables do exactly what their second ends (big DP, miniDP, HDMI) look like they do, without confusion or misunderstandings, while USB-C implies non-existent compatibility and raises continuous questions.
That's one of the reasons I dislike USB-C. Although it *can* be great for interoperability, it's poorly executed and leaves most people confused. A colourcoding system or some system with symbols should be implemented to differentiate between the different types (just like how "normal" USB was colourcoded).
@lajawi. Exactly. Color codes for supported protocols (USB2/3/TB), speed, and data/charge mode, plus clear, distinguishable symbols on the plugs AND the cable, and maybe LEDs on the cable or plug showing which mode has been negotiated. Also, no current OS offers an info dialog that flags probably suboptimal, restricted cabling (or hubs) after plugging in another device. Another feature they could have added: a button for "I want to eject this device" plus the general driver/OS function behind it. The 21st century didn't quite reach all developers, so far. They had all the options, but failed again. I miss FireWire, which just plain worked.
@@jnzooger TB1 never made it into the wild. The early prototypes were called Lightpeak and featured optical cables, before switching to copper transceivers.
Fun fact, for a long time as a teenager I used to think you could just plug HDMI into DP because the school computers had these cursed ports in them, and all of the TVs and monitors in my house still used component and VGA.
the funny thing is the only thing they did was break the nub off. DP ports and HDMI ports are little different from plugging a mouse and a flash drive into the same USB port: it doesn't care what the label is, because it's a digital output that sends what the device wants. Hence why HDMI/DP adapters will output what the monitor expects no matter which way it goes. Despite the wrong plug being used, it's not even a weird wiring job; it's basically just the same cable slightly modified
As someone who's never heard of a display port before, this video seems kinda funny haha. If I see something that says it's a HDMI port, the first thing I'm gonna do is stick a HDMI cable in it
This computer was awesome but kinda obsolete when it came out. They are works of art to me for what they tried to be, so it is super cool that the engineer still talks about this. I've heard that HDMI and DisplayPort are the same protocol. Monitors don't usually need sound, but honestly it would be nice if I could connect speakers to a monitor and use the HDMI cable to put the computer further away
I think I may be the only person who was fully aware this was coming and is not surprised at all... the first time I saw a DP port I thought: I bet they will do exactly this.
If you noticed, the regular DP port has two rails on the outer end of the copper while HDMI does not. I'm sure there is funky switching going on internally for the pins to work correctly, but one could surmise that if you lopped those off a regular DP port, an HDMI plug would fit... just not work. I think that's what makes this interesting, really.
After reviewing its datasheet, it appears that the entire pin configuration is altered based on the state of pin 21 (presumably the metal tab on the right side of the plug). If pin 21 is open, the HDMI pin configuration is used, and vice versa. Essentially, it is two ports sharing the same connector. In other words, you could actually have 2 ports (1 DP and 1 HDMI) instead of 1 combo port like this, which might explain why it didn't become popular.
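If that datasheet reading is right, the connector is effectively two pinouts multiplexed by one sense pin. A hypothetical sketch of the idea; the pin assignments below are invented placeholders, not copied from any datasheet:

```python
# Hypothetical illustration of one connector carrying two pin maps,
# selected by a single sense pin (pin 21 in the datasheet reading
# above). Pin assignments are invented placeholders, showing only
# that the same physical contacts get different roles per mode.

DP_PINOUT   = {1: "ML0+", 2: "GND",          3: "ML0-",   20: "DP_PWR"}
HDMI_PINOUT = {1: "TMDS2+", 2: "TMDS2_SHLD", 3: "TMDS2-", 20: "+5V"}

def active_pinout(pin21_open: bool) -> dict:
    # Pin 21 open -> HDMI plug inserted; otherwise -> DP plug.
    return HDMI_PINOUT if pin21_open else DP_PINOUT

assert active_pinout(True)[1] == "TMDS2+"   # HDMI mode
assert active_pinout(False)[1] == "ML0+"    # DP mode
print("pin roles switch on pin 21")
```

Which also illustrates the commenter's point: it really is two ports in one shell, with the sense pin deciding which one you get.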
Well, the technologies are very similar, which you can tell from the existence of simple converters/cables. However, the industries that back each technology are different: the computer industry supports DP while the TV/entertainment industry supports HDMI. They were not exactly the same in terms of business and price models. However, they've recently crossed paths and more discussions are on the way.
Holy shit... It actually works. Thank you for showing us this! I still don't think I fully understand how, but I wish this became the standard! Did he mention any limitations?
Okay, HDMI and DP being pin compatible is news even to me. Putting aside the keying in standard DP, they are also electrically incompatible, BUT I know that DP was designed with HDMI compatibility in mind: your usual GPUs out there can switch from DP signalling to HDMI signalling when a device is connected via a passive DP to HDMI cable (or DP to DVI; DVI and HDMI are even electrically compatible, since HDMI is just a buffed version of DVI with a different physical connector).
fyi, DisplayPort fits HDMI pretty well anyhow; the sizes of the connectors are a lot closer than most people think. It's loose unless you do what the Xi3 did, of course, but the big thing is: DisplayPort is keyed. Remove the keys and it's actually straight-up compatible, like you just found out!
Considering the HDMI forum's block on AMD's ability to offer better refresh rates for higher resolutions, this SHOULD be more common just so DisplayPort can be more accessible for everyone.
DP has a key at both ends of the connector strip; HDMI has none. That's the only reason an HDMI cable won't plug into a regular DP port. Signaling, however, is another matter.
@mjetektman9313 DP video cards detect and switch to DVI signaling with the HDMI protocol when you plug in an HDMI cable. They are definitely not compatible signal-wise. There's no such thing as a truly passive adapter; it only appears passive because of the expensive switching circuitry at the DP jack.
I bet it's down to the logo licensing. DP people won't sign off as meeting the spec if the socket doesn't meet the spec, and HDMI folk will say the same. We may see this in niche applications but if DP/HDMI allows anyone to go off-spec or just carry on anyway then sooner or later manufacturers will stop paying for certification, corners will be cut and the world will end in a puff of magic smoke.
Well, considering the kerfuffle that AMD cards are having with the HDMI standards forum (or whatever they are called), where they just WILL NOT let them have full, up-to-date HDMI compatibility on Linux, I am assuming that it is their fault and it won't happen any time soon.
They don’t do it because HDMI and the forums want their license / royalty fees. It’s why enterprise machines don’t use HDMI really and it’s generally 2x DP. That $0.15 per unit/etc adds up.
If you look at the PCB part, it doesn't have the notches that a normal DP port has, which hints at why it just fits ;) There's a part of the DP spec that says the PCB tongue needs 90-degree angles on the end. When you remove those two notches, it will fit an HDMI plug
I would assume it's to deter people from connecting HDMI cables to DP ports and expecting them to work. People plug cables into whatever fits, which is pretty logical: since connectors come in different sizes, if it fits, it's most likely correct
It's pretty easy to tell the difference... the piece in the middle doesn't have the little notch on the left side that regular DP ports have... at least that's what it looks like
It wouldn't need to be labeled if it worked with passive adapters, because DisplayPort is entirely backwards compatible with HDMI. Passive adapters will work with all DP ports
They don't do it because (at least at the moment) it is more effort and more costly. It's not too surprising they can do it, though: with tighter tolerances and some detection (likely based on the more basic pins like power rails) you can probably just switch between one signaling scheme and another. There's probably a physical switch/detector there so it knows when to redo the detection, too.
one difference from the Xi3 port i noticed immediately is that the card edge isn't a U shape like it is on the other computer, thus allowing HDMI to fit. without that, DisplayPort connectors *MIGHT* just fit HDMI ones fine, but this is speculation at this point; i don't know for sure.
This would be so useful on some monitors, because some (like mine) only have DisplayPort connectors. If I could plug my Xbox into the monitor using HDMI, that would be so awesome (also DP is better than HDMI anyway)
My guess is that because HDMI is a proprietary standard, there may be licensing issues with creating an HDMI port that isn't specifically how HDMI wants it to be.
I believe I heard that in some videos about the HDMI Forum not letting AMD put HDMI 2.1+ support in their open-source Linux drivers. Kinda unrelated, but if the HDMI entity doesn't like your HDMI thing, you can't put HDMI logos on anything related to your HDMI product
@hohohodrigues yeah, that's why some Ali stuff has multiple HDMI out ports: 'cuz they definitely ain't paying license fees
Yeah especially when you dare to make a Linux driver with HDMI support
the HDMI port itself is royalty-free. the signals, communication, and protocols using that port might be licensed
Yeah, you need to pay license fees for HDMI. You don't need that for DisplayPort, so it's a lot easier to add to a system. Fun fact: the main reason HDMI is used for TVs is because it supports DRM. The only problem is that its implementation is hilariously bad; you can remove the DRM by just using a cheap Chinese HDMI splitter. A company I worked for used to sell devices that took HDMI as input but didn't support the DRM features, so the easy fix was just chucking a splitter in between. Industrial equipment didn't need DRM, so it wasn't an issue for their customers.
This just in: Xi3 is now banned from the HDMI working group
They got their ass 😭😭😭
FK HDMI, worst bs standard ever...
@@pawkitz USB-C is far, far worse and it destroys equipment.
@SmallSpoonBrigade Aside from silly rumours, do you have any proof to back up that outlandish claim?
I have used dozens of USB C cables over the years, and not once did I ever have equipment die because of it.
The only issue I can see is buying cheap USB C cables from untrustworthy suppliers, but that's usually a terrible idea for electronics in general.
@StardustLegacyFighter Not sure if it's a joke, but it seems like something the HDMI Forum would do. AMD cards can't output a high-speed HDMI signal on Linux because the HDMI group wouldn't let them release an open-source driver that implements the new HDMI spec
No. This is illegal. I'm calling the cops.
HDMI Forum already called their lawyers
*dials monitor* HDMI, what you OLED?
Cry harder bigrig
Nah, we need the FBI or Interpol.
there's a part of the DisplayPort specification called DP++ (dual mode) that allows a DisplayPort source to detect an HDMI display and output a native HDMI signal.
Whoever designed this device was clearly a fan of this sort of thing, because it also includes a whole bunch of those wacky dual eSATA/USB ports too. Honestly surprised it doesn't have some of those combo 3.5mm/SPDIF ports that put the optical transceiver in the back of the analog jack.
My Samsung TV has this thing.
In fact, all my TV has is like 7x 3.5mm connectors, and it comes with a box full of adapters. It also has 3D, so you can tell it is something like 15 years old. But it also has HDMI and supports 12-bit (not often seen nowadays) 1080p 60 Hz
I remember seeing these and asking questions of Xi3 at CES in the early 2010s. They really wanted to fit as much USB on these as they could because they saw a future in which literally everything was plugged in through USB because Bluetooth was still wonky. Bt got a lot better but I still use a powered USB hub to plug in all my years of USB stuff to my 2021 gaming rig.
@rhettigan Bluetooth got a little better, but not by much.
Keyboard and mouse polling rate is 125 Hz; it's possible to make a 250/500/1000 Hz polling-rate HID device, but nobody makes one. Only the PS4/PS5 gamepads have a 500 Hz polling rate.
Even now BT doesn't have a standard protocol for a headset with stereo + mic working at the same time. No zero-latency or lossless audio protocol either.
Sad story
I loved the eSATA+P port on my old Lenovo X1 (no carbon).
eSATAp is an awesome port to connect 2.5" (5V) drives
I might be wrong on this, but DP doesn't require a license, unlike HDMI. Hence DP being more common nowadays on PC hardware.
I wonder if the license costs more per port. Cuz on GPUs, they use both HDMI and displayport. It would be cool if all 4 or 5 ports on a GPU were just these, so you could pick either one instead of only having 1x HDMI and 3x displayport or something
@JonBringus AMD just got shut down by the HDMI asshats for trying to include part of the HDMI 2.1 spec in their open-source Linux drivers. Intel and maybe Nvidia have gotten around this by simply including a DP-to-HDMI converter on board.
@HululusLabs I don't remember where, but I swore I read that Intel does it that way but Nvidia does it some other way where it's on the GPU die. Can't remember where tho, so take it with a grain of salt
@Wileybot2004 Intel = built-in adapter on the GPU
Nvidia = full proprietary stack so they get greenlit by HDMI forum
@@Yamzeebruh intel is clever
I swear 15 years ago I was selling PCs at a store that had that combo port. It got me in trouble as I was new to it all and then assumed every DP port was HDMI compatible so would tell people to buy whatever cable they wanted for their DP ports "yeah bro, HDMI fits 100% maybe you just have to force it a bit".
Lol If I was your customer I'd just say. Well do you want to test drive that theory and do it 😂
@_SYDGAMING_ why do you think I got in trouble
Ooooof
@@Siktah lol 😂🤣 fair enough man
Based
Displayport has interoperability modes for DVI and HDMI. I didn't think they were pin compatible but the DP to HDMI and DP to DVI-I adapters are passive. The DisplayPort does all the magic. For DVI-A or VGA Displayport has those analog signals as well where HDMI doesn't.
DVI and HDMI are electrically identical. Some gfx cards even have the ability to drive sound through their DVI ports if the other end is detected as a HDMI display
@buddy19134 They can push sound through the DVI port, but they can't have the DVI logo on them, because audio isn't part of the DVI standard.
@@pootispiker2866 to the everyday Joe it doesn't matter if it has a logo or not. Only that it works
@@buddy19134 Oh no I wasn't saying anything like that, only what the implications are.
Another human that actually knows!
These ports should be on everything, they would help send HDMI to the dustbin of history where it belongs.
I swear I can't understand how a cable that just sits there can break out of freaking nowhere. HDMI ones do here. Never understood it.
Got my chunky DVI here running plain 1080p 60 Hz beautifully as it should, for years same cable. Got some sad tugs on it and still fine. Look at an HDMI cable with a frowning face and it dies of sadness.
I (perhaps weirdly) always liked HDMI, and I bought the cheapest cables haha. What was so bad about it? I am on DisplayPort now, but always thought HDMI was a great port personally.
@@incubrian It's terrible. It's a pain to get RGB out of it, it always wants to do stupid chroma subsampling crap because it sucks for bandwidth. If you were getting away with buying the cheapest cables you must have been running 1080p. Also there's a really obnoxious licensing fee to use HDMI. DP is better in every way.
Just invest in Thunderbolt/USB-C 🫤
@@incubrian ... *how*
I remember buying the cheapest HDMI cable I could find a good few years ago to use with my PC
The amount of issues I had with that cable was insane: dead pixels EVERYWHERE, incorrect Hz support, everything was just *wrong*.
Had to go back and buy a cable that cost 50% to twice as much, just to get something that worked at all.
Ever since I got my hands on displayport cables I've never looked back... No issues of any kind whatsoever, only thing I can really complain about would be the locking mechanism, as sometimes on some devices it'll be almost impossible to get at, and on some cables it'll be flimsy or "Soft" making it all the harder to unlock, would still rather have that than a cable that doesn't work at all though :v
We require these on modern graphics cards, so you don't have to worry whether the monitor you buy has DisplayPort or HDMI. All you need is an HDMI or DisplayPort cable and you're ready to go. This should become a standard, it's just so smart. Imagine seeing an ad like: "One HDMI and three DisplayPort connectors on your graphics card? Not anymore, because this new (graphics card brand and model) has four 2-in-1 ports that accept both DisplayPort and HDMI. Buy (graphics card brand and model) now for only (price)!"
Better if everyone just ditched HDMI, the only reason it still exists is it's owned by TV manufacturers.
To be fair, modern graphics cards usually come with two DisplayPorts and two HDMI, so unless you want a ton of monitors you should be good without any adapters
@@25566 Usually 3 DisplayPort and one HDMI in my experience, at least currently. There was a short window where it was an even split, but DisplayPort took over years ago on the PC side, with the single HDMI port being there for compatibility and, to an extent, VR headsets.
Every so often I'll see DisplayPort+HDMI+DVI on lower-end cards, but that's the exception, not the rule.
@@25566 I've never seen that configuration; always 3 DP and 1 HDMI
@@kaelwd There isn't any other connector that carries audio right now though, at least to my knowledge
i give you a thumbs up for making this a 2 minute video instead of 20 mins with swooshing title intros and bloaty narration. you are like this port. awesome.
Don't forget about the begging for likes, subscriptions, and even insisting that people use YouTube's obnoxious notification system. That type of behavior is usually enough to get me to not subscribe in the first place.
The reason this isn't done more often is because HDMI (the people that run it) sucks ass; they make you pay a license fee for every HDMI port,
and they also blocked AMD's open-source HDMI implementation
I heard about that
the port itself is not licensed, but the protocol using that port is.
this is better and would work tho, because there is no HDMI port, it's just a DP with HDMI protocol
@@john_doe668 That still requires licensing from HDMI
@@john_doe668 The thing being licensed is the HDMI protocol
DisplayPort Dual-Mode (DP++) is quite common but it requires a passive adapter. This is taking things one step further.
I found what I think is the same connector, from REGO out of Taiwan. They call it "HDMI/Display Port 2 in 1 Connectors"
MANY THANKS! I need to sample these asap :)
It's probably electrically a DP++ port, i.e. those common DP ports that you can use a cheap passive adapter with to get HDMI. The problem with those is that you generally only get pretty low-spec HDMI, so 1920x1200@60 max; not really anything you want outside of common office tasks.
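For a sense of where that ceiling comes from: the original dual-mode ("type 1") adapters top out at a 165 MHz TMDS clock, the same limit as single-link DVI. Here's a rough back-of-the-envelope pixel-clock check; the blanking totals are loose CVT-reduced-blanking-style assumptions, not exact timings:

```python
# Rough pixel-clock estimate per video mode. DP++ type 1 adapters are capped
# at a 165 MHz TMDS clock (same ceiling as single-link DVI).
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
    # Total transmitted pixels per frame = (active + blanking) in each axis.
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

for w, h, hz in [(1920, 1080, 60), (1920, 1200, 60), (2560, 1440, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= 165 else "exceeds"
    print(f"{w}x{h}@{hz}: ~{clk:.0f} MHz ({verdict} the 165 MHz cap)")
```

With these assumed timings, 1920x1200@60 lands around 154 MHz, just under the cap, which is why it's roughly the top mode for those adapters, while 1440p@60 blows well past it.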
Average person wouldn't probably notice tbh
"pretty low level HDMI" is 1000% on the HDMI consortium. It must be propriety, it may not deviate from any part of anything, they have you by the balls if you signed the NDAs, etc. You CANNOT produce a DisplayPort connector that outputs HDMI at full 2.1 specs. You CANNOT produce code that implements the spec if it's not proprietary (the way Intel and Nvidia do.) If you break the contract you signed … HDMI is basically awful and needs to go away.
This is my guess too.
It just can do the level shifting without an adapter.
@@knghtbrd Well, you probably can write such code or make such hardware if you're not in the HDMI Consortium. The thing is, you'd have to reverse engineer everything, which is a bit hard.
And then you'd have to survive the lawyers assault, and prove that your reverse engineered solution doesn't use their classified documentation.
You'd also not be able to use HDMI name and symbols (trademark violation), you might need to skirt around some patents, and court fees might still end you, even if you win
tl;dr - you can in China.
@@knghtbrd Just get the NDA signed by a hobo, grab the code, then use a third-world company to produce the thing and flood the market with it. The consortium will only be able to find the NDA violator begging next to a station somewhere, but that's all; their lawyers will be screwed, all the specs and sources will be open, and products will come to consumers through channels like Temu, Wish, or Ali. The other way is to hire an employee they fired, who knows all about how the newer specs work, then develop "for use with" code...
HDMI and DisplayPort are electrically compatible with simple adapters.
Heck, I even have cables that are HDMI on one end and DP on the other.
So while this isn't that mind blowing, it is really cool to see that someone found a way to make a port that accepts both connectors.
These cheap cables contain an IC in one of the plugs to do the level-shifting work
Going from DP to HDMI is simple like that, but not the other way round.
Not so easy. You need a boost power converter to make 5V from 3.3V. DP only offers 3.3V. HDMI needs 5V
@@markusfischhaber8178 a boost circuit is very simple. You can't do that when going back from HDMI to DP.
HDMI and DVI are electrically compatible (same kind of signal but using a different plug), but DP is NOT using the same kind of signal as DVI/HDMI
The socket in this video has a switch inside that can detect the type of plug and output the right signal
about a year ago i was working for an automation company that used industrial mini PCs, from a company called DFI. they had these little combo ports and i was just as surprised! only bummer is it didn’t work with certain monitors, never figured out if it was a DFI exclusive problem…good video!
It’s really neat, but these days DP-over-USBC is easier and better than a weird combo jack especially on SFF computers and laptops
it feels like "Thats right, the square hole."
DP is natively compatible with HDMI btw. Any dual-mode DP port can do HDMI, as long as you give it a passive pin-to-pin adapter (with a resistor on the adapter-detect pin to tell the GPU this port is in HDMI mode)
So what I assume is different about this one is that, besides having shaved off some of the plastic of a DP port to fit an HDMI cable, it has a switch that detects if the cable plugged in is slightly smaller (aka an HDMI cable) and automatically does the switching that a passive adapter would normally have to do...
Kinda funky tbh. Though as others mentioned, DP is an open standard while HDMI requires licensing fees, so the passive HDMI compatibility on a DP port always depends on what the GPU manufacturer/driver has built in and isn't necessarily linked to the DP spec of the same port... (and some might opt to drop HDMI compat to save costs, in which case an active adapter is needed)
Not so fast. Not ALL DP had that functionality!
The very first and the cheapest DP ports needed ACTIVE adapters to convert to HDMI, as the STANDARD did not include those features initially.
@@pepeshopping I did mention that towards the end. The DP spec says that the port *can* carry HDMI directly, but because HDMI is a paid standard and DP is not, they cannot require support for HDMI, so to save costs cheaper devices can opt to only support active adapters
Both are glorified DVI, so I can believe it.
You need to specifically implement the optional dual-mode DisplayPort feature for compatibility with passive adapters, but at this point that is so ubiquitous that manufacturers don't add the logos for it anymore. It is not something DisplayPort has by default, and it does require extra hardware
Actually, Type-C to DisplayPort adapters (the ones that use alt mode, rather than the weird DisplayLink ones that don't work right except on Windows) don't provide this, so if you then daisy-chain them with a passive DisplayPort-to-HDMI adapter or cable (aka all the DisplayPort-to-HDMI cables in existence) they don't work
Hey, you leave Brenda alone.
She's doing her best, alright.
She's had a lot on her plate since her Dad's surgery.
This would be so great for GPUs; you could drive any mix of 4 HDMI and DisplayPort monitors
WHY DON’T ALL GAMING LAPTOPS HAVE THIS???
If the laptop already has DP, it would be more expensive to implement this as you'd have to pay a fee.
If the laptop had HDMI, implementing this would limit the HDMI to an older version of HDMI, meaning max 1920×1200@60 Hz (I'm pretty sure).
DisplayPort is an open standard and HDMI isn't. My guess is the group/company that controls HDMI and collects money from device manufacturers for any device they sell with HDMI would be very against this. They won't allow AMD to implement the latest HDMI standards on Linux because they are more worried about keeping the actual spec secret than having something that works. Intel and Nvidia got around this by putting a DP-to-HDMI chip behind the HDMI port on their GPUs/motherboards, so the GPU outputs DP and the chip converts it to HDMI before it reaches the port. So the only reason this probably exists is that the company that made it was small enough that no one noticed what they were doing before products like this were released.
If the laptop uses this kind of socket, you cannot plug in two monitors at the same time (one DP monitor and one HDMI monitor)
@@LaugeHeiberg Later, post-2013 dual-mode DisplayPort ports gained HDMI 1.4 support, which gets you 1080p at 120 Hz, 1440p at 60 Hz, or 2160p at 30 Hz
@@tonyli1212 You can daisy chain DP monitors
Would allow for DP on TVs. That would be super nice.
I am almost sure the HDMI consortium forbids any of its members to even think about it if they want to have HDMI on any of their future products.
HDMI is a cancer just like Volkswagen. They slow down progress so they can deliver the bare minimum for ages
You can get DP to HDMI cables
This is somehow both cursed and revolutionary
It looks like maybe this sacrifices Pin 20 of the DP connector. This pin only provides power to electronics embedded in the cable itself (e.g. active converters) and is unused (like, not even connected to anything) in passive cables. Maybe all the pins in DP++’s HDMI mode are in the same order as HDMI pins?
The ML lanes are at the same locations as HDMI's TMDS channels, but toward the end of the connector the DDC vs AUX and a few other signals swap positions with grounds, which likely makes the autodetection possible. The pin 20 sacrifice is correct; that carries the power for the DP++ adapter, which in this configuration is embedded into the host board. A few muxes and it's done. Very clever indeed! I would say HDMI stole the DVI signalling, and DP stole the HDMI pin layout, until they got mad at each other and the notches were introduced to key the DP plug away (like DP-only, not bi-protocol).
EDIT: looking at the REGO connector linked below by a commenter - they use a mechanical switch for cable type detection :D
@@cameramaker cool! The pinouts on the connector’s page are very interesting.
"A DisplayPort Port That You Can Plug HDMI Into"
Careful, he's a hero
This should be standard!
Would be lovely if someone would combine fiber optic and PoE in this way too xD
As you can see, the pins inside are wider.
And a small microcontroller can detect which hot-plug pin is active (DP or HDMI) and then just flip a multiplexer switch.
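Roughly that idea as a toy Python model, not firmware. The pin-21 mechanical contact and the signal names here are assumptions pieced together from the REGO datasheet discussion elsewhere in this thread:

```python
# Toy model of the combo socket: a mechanical contact (pin 21 in the REGO
# part; an assumption here) tells a mux which pinout the seated plug uses.
DP_MAP = {"high_speed": "ML0-ML3", "control": "AUX", "hotplug": "HPD"}
HDMI_MAP = {"high_speed": "TMDS0-2 + CLK", "control": "DDC", "hotplug": "HPD"}

def detect_plug(pin21_closed: bool) -> str:
    # The wider DP plug closes the switch; the narrower HDMI plug leaves it open.
    return "DP" if pin21_closed else "HDMI"

def route(mode: str) -> dict:
    # The high-speed pairs line up between the two standards; the mux mainly
    # has to swap the sideband signals (AUX vs DDC) and handle level shifting.
    return DP_MAP if mode == "DP" else HDMI_MAP

print(route(detect_plug(True)))   # DP plug seated
print(route(detect_plug(False)))  # HDMI plug seated
```

In the real part the "routing" would of course be analog muxes and a level shifter, not a lookup table; this just shows why a single detect pin is enough to pick a pinout.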
That ending gave my poor old Thinkpad some war flashbacks, DP got screwed by previous owner because they thought it was HDMI
this unholy sorcery looks great
0:28 That's just DisplayPort; it can passively adapt to HDMI and DVI, as can HDMI to DVI, with DVI to HDMI being compatible on ports specifically wired for it. It pretty much sends the signal straight over the wire. You can even get audio out of a DVI port in HDMI mode lmao
I'm pretty sure they're running DisplayPort in HDMI mode with this funky-ass connector, and disabling auto-switching turns this feature off
That's why DP is king, being able to act as 3 different connectors with cheap adapters and adapter cables.
Edit: nvm, this isn't base DP spec but just something very common in anything remotely modern; it's called DP dual-mode.
I'll correct you a bit about the DVI port, because there are 3 types: the analog DVI-A, the combined DVI-I, and the digital DVI-D, and they carry different types of signals with different passive adapters. DVI-D and DVI-I can have passive HDMI adapters; DVI-A and DVI-I can have a passive VGA adapter because those output an analog signal. Basically you can have a passive adapter whenever both ends use the same type of signal, doesn't matter if it's DP, HDMI, VGA, or DVI-x.
I don't think the HDMI foundation likes that very much
God, like eSata being a USB as well used to mess me up, but this is nightmare fuel
eSATA does not support USB. eSATAp (sometimes called eSATA + USB or Power over eSATA) does, though.
eSATA in a combo port with USB is simply 2 different standards with their own different pins that happen to be similar in size, so they can be combined in the same space, but each still has its own format and signals.
The Brenda in financial jab was hilarious, we all know it's true. I say this as a computer engineer; it's my job to make fun of finance/business majors. Well, that and make computers, I guess... but mainly the first thing
This should be on all monitors.
If they made the connector twice as high as this one, with one half HDMI and one half DP, it would be good as well. The main point is that if you use one type of connection, you would not be able to use the other one, so both connectors are connected to the "same" monitor output, but you could freely decide whether you want to use DP or HDMI.
Put this on monitors and TVs as well to make it more convenient (If I happen to use a TV as a monitor I would use DP instead of HDMI. Especially with more than one TV, as the video card has only one HDMI out)
It's nice for people that don't understand DisplayPort or HDMI and accidentally plug DP or HDMI into the wrong socket. We need to see more of this
I want this on a GPU
DisplayPort: **exists**
HDMI: *im bouta end this man’s whole career.*
After thinking about it: it makes sense to differentiate the two connectors, as people tend to try to plug in anything that seems to match (even USB into an Ethernet RJ45 jack...) and wonder why it doesn't work, just because they don't understand certain differences.
At the moment, USB-C falls into the trap of one universal plug type that could do pretty much anything, from simple charging, analog audio, USB 2, USB 3.x, and HDMI to Thunderbolt 3/USB 4. The point is: there are a dozen subtle symbols, barely recognizable on the devices and unknown to most customers, and on the other hand, in the stores you find a million USB-C cables, 90% of them stupid outdated "high-speed 480 Mbit/s USB 2 charger" cables, with no hint on the cable itself of what it can do, or why some (TB3/4, 100 W, etc.) are 10x more expensive than the funky colorful ones... I mean... WTF?
The people behind USB did a perfect job of creating universal confusion.
DisplayPort cables do exactly what their second ends (big DP, mini DP, HDMI) look like they do, without confusion or misunderstandings, while USB-C implies non-existent compatibility and invites continuous questions.
That's one of the reasons I dislike USB-C. Although it *can* be great for interoperability, it's poorly executed and leaves most people confused. A colourcoding system or some system with symbols should be implemented to differentiate between the different types (just like how "normal" USB was colourcoded).
@@lajawi. Exactly. Color codes for supported protocols (USB2/3/TB)/speed + data/charge mode + clear, distinguishable symbols on the plugs AND the cable + maybe LEDs on the cable or the plug, which mode has been negotiated. Also, no current OS offers some info dialog, which shows probably suboptimal, restricted cabling (or hubs) after plugging in another device.
Another feature they could have added: a button for "I want to eject this device" + general driver/OS function.
The 21st century didn't quite reach all developers, so far. They had all the options, but failed again.
I miss Firewire, which did just plain work.
Actually, USB-C only supports DP, but DP can have passive HDMI adapters
:points at thunderbolt 1/2 in confusion:
@@jnzooger TB1 barely made it into the wild. The early prototypes were called Light Peak and featured optical cables, before switching to copper transceivers.
This is surprisingly convenient. Almost all the displays I work with use hdmi and displayport to hdmi adapters are too big for handheld devices.
Brenda from Financial! You sir, hit the nail on the head!
That port is a masterpiece bro
Fun fact, for a long time as a teenager I used to think you could just plug HDMI into DP because the school computers had these cursed ports in them, and all of the TVs and monitors in my house still used component and VGA.
The HDMI council is sweating balls now.
The funny thing is, the only thing they did was break the nubs off. DP ports and HDMI ports are little different from plugging a mouse and a flash drive into the same port: the link doesn't care what the label is, because it's a digital output that negotiates what the device wants. Hence why HDMI/DP adapters will output what the monitor expects no matter which way it goes, despite the "wrong" plug being used.
It's not even a weird wiring job, it's basically just the same cable slightly modified.
As someone who's never heard of a display port before, this video seems kinda funny haha. If I see something that says it's a HDMI port, the first thing I'm gonna do is stick a HDMI cable in it
I share your amazement sir. I love this to the maximum. MAKE THIS STANDARD!!!!
This computer was awesome but kinda obsolete when it came out. They are works of art to me for what they tried to be. So it is super cool that the engineer still talks about this.
I've heard that HDMI and DisplayPort are the same protocol. Monitors don't usually need sound, but honestly it would be nice if I could connect speakers to a monitor and use the HDMI cable to put the computer further away
Okay that’s genuinely mind blowing
This would be so good for graphics cards because i could have one port be both without an adapter
Incoming HDMI Forum lawsuit faster than Nintendo be like:
gaming consoles should do this
I think I may be the only person who was fully aware this was coming and is not surprised at all... the first time I saw a DP port I thought: I bet they will do exactly this.
If you noticed, the regular DP connector has two keys at the outer ends of the contact strip while HDMI has none. I'm sure there is funky switching going on internally for the pins to work correctly, but one could surmise that if you lopped the keys off a regular DP port, an HDMI plug would fit... just not work, and I think that's what makes this interesting really.
Imagine being in an hdmi port discord server. “Hold on babe let me just say bye to my hdmi friends”
After reviewing its datasheet, it appears that the entire pin configuration will be altered based on the condition of pin 21 (presumably the metal tab on the right side of the plug). If pin 21 is open, the HDMI pin configuration should be utilized, and vice versa. Essentially, it is two ports sharing the same connector. In other words, you can actually have 2 ports (1 DP and 1 HDMI) instead of 1 combo port like this, which might be the explanation for why it didn't become popular.
TL;DR, DP 𝘢𝘯𝘥 HDMI will become DP 𝘰𝘳 HDMI if you want this funky connector.
The BIOS option shown in the video is also a hint to two different sets of circuitry.
Well, the technologies are very similar which you can tell from the existence of simple converters/cables.
However, the industries that support each technologies are different.
The computer industry supports DP while the TV/entertainment industry supports HDMI.
They were not exactly the same in terms of business and price models.
However, they've recently crossed paths and more discussions are on the way.
Holy shit... It actually works. Thank you for showing us this! I still don't think I fully understand how, but I wish this became the standard! Did he mention any limitations?
Worth noting that most DP to HDMI adapters are passive; dual-mode DP natively supports HDMI-style output.
I always thought it was weird that the connector on DP was keyed, when the housing was already keyed. I guess now we know why.
THIS! This should be the standard on all video cards!
Meanwhile the eSATA/USB combo ports: "Bruh, we've been doing that since 2005!"
Probably HDMI Forum nonsense, and considering that this is a prototype and not an actual product; It makes sense.
Okay, HDMI and DP being pin-compatible is news even to me. Putting aside the keying on standard DP, they are also electrically incompatible, BUT I know that DP was designed with HDMI compatibility in mind: your usual GPUs out there can switch from DP signalling to HDMI signalling whenever a device is connected via a passive DP-to-HDMI cable (or DP-to-DVI, since DVI and HDMI are even electrically compatible; HDMI is just a buffed version of DVI with a different physical connector).
These combo ports have been around for a while. I first saw one a few years ago on a KVM at work.
FYI, DisplayPort generally fits HDMI anyhow. The sizes of the connectors are a lot closer than most people think. It's loose unless you do what the Xi3 did, of course, but the big thing is: DisplayPort is keyed; remove the keys and it's actually straight-up compatible, like you just found out!
collectively refuse HDMI monopoly
I was also blown away by the Sony Ericsson K850i, which supported both microSD and M2 cards. Best childhood phone.
I think that this should be a thing on more computers, mainly laptops and small form factor computers
Considering the HDMI forum's block on AMD's ability to offer better refresh rates for higher resolutions, this SHOULD be more common just so DisplayPort can be more accessible for everyone.
DP has a key at both ends of the contact strip. HDMI has none. That's the only reason an HDMI cable won't plug into a regular DP. Signaling, however, is another matter.
DP is signal compatible with HDMI, that's why there's passive adapters for DP-HDMI conversion
@@mjetektman9313 DP video cards detect and switch to DVI/HDMI-style signalling when you plug in an HDMI cable via an adapter. They are definitely not compatible signal-wise. There's no such thing as a truly passive adapter here; it only appears passive because the switching circuitry sits at the DP jack.
I bet it's down to the logo licensing. DP people won't sign off as meeting the spec if the socket doesn't meet the spec, and HDMI folk will say the same. We may see this in niche applications but if DP/HDMI allows anyone to go off-spec or just carry on anyway then sooner or later manufacturers will stop paying for certification, corners will be cut and the world will end in a puff of magic smoke.
This is absolutely how it should have been all along
Basically a WHY ARE WE NOT FUNDING THIS moment
Well, that's something new...
Today I learned.
"Brenda from financial"
Sides
Orbit
Well, considering the kerfuffle AMD cards are having with the HDMI standards forum (or whatever they are called), where they just WILL NOT let them have full, up-to-date HDMI compatibility on Linux, I am assuming that it is their fault and it won't happen any time soon.
This is like a USB-C to HDMI adapter in the form of a DisplayPort with no microchip required!
HDMI Forum:
(high pitched demonic screech)
These combo ports are on a lot of devices, they're just not always obvious
I recently discovered that VGA is still a thing, and it isn't limited to 800x600 but actually goes even higher than my current monitor resolution!
Welp. So much for sleep tonight. I'm now questioning my experience in the IT workforce......THANKS!
They don’t do it because HDMI and the forums want their license / royalty fees. It’s why enterprise machines don’t use HDMI really and it’s generally 2x DP. That $0.15 per unit/etc adds up.
HDMI is also basically worse than, or at best functionally equivalent to, DisplayPort in every way.
Fight me.
That's right! It goes in the square hole!
If you look at the PCB part, it doesn't have the notches that a normal DP port has, which hints at why it just fits ;)
There is a part of the DP spec that says the contact tongue needs two 90-degree keys on the PCB end. When you remove those two, it will fit an HDMI plug.
Time to harvest these cube pcs for that combo port
I would assume it would be to deter people from connecting HDMI to DP ports and expecting them to work, as people plug cables into whatever fits,
since that's very logical: with multiple connectors in different sizes, if it fits, it most likely is correct
Crazy that the Nintendo GameCube had this kind of technology
DisplayPort can natively pass HDMI through it; that's why passive DP-to-HDMI cables exist
It's pretty easy to tell the difference... the piece in the middle doesn't have the little notch on the left side that regular DP ports have... at least that's what it looks like
You deserve a subscriber! Kudos.
Most likely one of the companies would throw a fit, saying people would pay less for their licensing.
100% it's HDMI license fees and license terms, like how AMD was recently stopped from releasing an open-source implementation of HDMI 2.1 output.
Reminds me of the PowerBook 3400c; it had a unique RJ-11 (dial-up) and RJ-45 (Ethernet) combo port that literally no other device had
It wouldn't need to be labeled if it worked with passive adapters, because dual-mode DisplayPort is backwards compatible with HDMI. Passive adapters will work with all dual-mode DP ports
They don't do it because (at least at the moment) it is more effort and more costly. It's not too surprising they can do it though; with higher tolerances and some detection (likely based on the more basic pins, like the power rails) you can probably just switch between one signaling and the other. There's probably a physical switch/detector there so it knows when to redo the detection, too.
another thing to blow your mind: there are usb-c ports with displayport in 'em
We haven't been doing this because the HDMI forum, the group that controls HDMI and decides who can do what, charges a lot for this.
One difference from the Xi3 port I noticed immediately is that the card edge isn't a U shape like it is on the other computer, thus allowing HDMI to fit it. Without that, DisplayPort sockets *MIGHT* just fit HDMI plugs fine, but that is speculation at this point; I don't know for sure.
Do you have a dremel? Don't leave us hanging go find out!
This would be so useful on some monitors, because some (like mine) only have DisplayPort connectors. If my Xbox could be plugged into the monitor using HDMI, that would be so awesome (also, DP is better than HDMI anyway)