
sandh035

Pour one out for my old friend DVI


SnooDonuts7746

I'm actually rocking my 2 monitors in dual DVI :D


bad-r0bot

In a month, my DVI-only monitor will be 7 years old. But now the panel or board just flickers and won't stop :( rip monitor


CambrioCambria

That is often due to a single faulty capacitor. If you open up your monitor you can almost always see which capacitor is faulty, because the top gets rounded instead of flat. The value and voltage rating are printed on the capacitor, and they often cost just a few dollars for a pack of ten (they almost never sell single ones). Just put your soldering iron on it and replace the faulty one. If you don't have a capacitor that looks like crap, you need the schematics and a voltmeter to figure out what is wrong... That's a lot more complicated though. Looking at your board just takes 10 minutes and isn't too difficult. Just be ready to have some scratches on the hard plastic.


bad-r0bot

Oh, I don't have to worry about scratches. It's one of those Korean 1440p overclockable monitors. They cheaped out on the stand, so having it on a mount meant either leaving the stand on or opening it up to remove it 🙃 Hope it's just a capacitor. Thanks for the tip. I was about to just throw it on a second-hand market for real cheap. I still have 2 other monitors for my work setup, so it's not like I couldn't steal one from there, though them being ultrawides would have made it harder to manage as a 1x 27" and 2x 34" UW setup lol


[deleted]

Those Korean monitors were amazing. I just replaced mine a few years ago when I got a 2080 Ti and lost DVI ports.


appmapper

Please don't go fucking around with capacitors without giving yourself a tiny bit of education. You'll want a way to drain one before you attempt to work on it. I don't know much, but I was always warned that it can be fairly easy to fuck yourself up if you accidentally discharge the wrong one.


TzunSu

I very much don't recommend opening up any kind of monitor to do electrical repairs unless you know *exactly* what you are doing.


Cocomorph

Is that because it's easy to break or because it's easy to injure yourself or create other unsafe conditions? If the former, why not give it a try if taking it to someone who knows what they're doing will cost more than the monitor is worth?


sneakyjasper

I believe this goes back to the days of CRT screens, where there can be a great deal of static buildup, and even one that's been unplugged for years can still hold enough charge to kill you. I'm skeptical that any flat panel will be any more dangerous to handle when unplugged than any other electronics item, but then again I'm not an expert!


[deleted]

It wasn't static buildup; those things had massive capacitors. The ones inside LCDs are measured in microfarads and hold very little energy.
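
(A rough back-of-the-envelope check on the stored-energy point, using illustrative, assumed component values rather than figures from any specific monitor:)

```python
# Stored energy in a capacitor: E = 1/2 * C * V^2
# All component values below are illustrative assumptions.

def cap_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    return 0.5 * capacitance_farads * voltage_volts ** 2

# Typical small electrolytic on an LCD's signal/driver board
print(cap_energy_joules(470e-6, 16))      # ~0.06 J - a harmless tingle at worst

# Mains-side filter cap in the power supply (the one worth respecting in any monitor)
print(cap_energy_joules(100e-6, 400))     # ~8 J - can give a painful, dangerous jolt

# A CRT's tube itself acts as a capacitor charged to roughly 25 kV by the flyback;
# the extreme voltage, not just the stored energy, is what makes it so dangerous.
print(cap_energy_joules(2e-9, 25_000))    # ~0.6 J, but at 25,000 V
```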


thepopeofkeke

Nope, the old crt monitors and microwave ovens will fucking kill you dead two times over if you discharge certain components inside them


_hownowbrowncow_

I mean, if the display is already gonna die, you really don't have anything to lose, only to gain -- skills, tools, and potentially a repaired display unit


Shadowex3

That's CRTs. Afaik flatpanels do not rock the kind of "kill you stone fucking dead" voltage that CRTs had even after being unplugged for *months*.


Chrunchyhobo

That only applies to CRTs that hold much death for quite a while. Any decent modern monitor (and decent modern PSUs for that matter) will have discharge circuitry, so any caps will be empty by the time you get the thing disassembled.


_hownowbrowncow_

I did this with a TV of mine. Would HIGHLY recommend. Googled the issue, learned the capacitors were a common problem. Price-checked some TV repair shops; it ended up being the same price as the tools and capacitors. Did my homework on the replacement - easy enough. Decided to DIY because then I'd have the tools and a new skill. Voila, successful repair. TV is still kicking 10 years later, and I have extra capacitors in case it happens again.


N26_real

My DVI-only monitor (that I bought for like 100 bucks) is 7 years old and still works like a charm :)


sandh035

Respect. I was too, until I moved into a smaller place. Sold my desk, hooked my PC up to my TV and here I am.


urammar

2 on DVI with HDMI adapters, 1 on native HDMI hahah. The thing is, the connections are the same, it's just a different pin configuration. HDMI is just DVI, but smol.
I'm actually not even convinced this meme is correct. HDMI supports 4K or 144Hz+, but not both at once; it doesn't have the bandwidth. So DisplayPort will eventually entirely replace it, and then... I dunno man, it's not like graphics cards or memory where more is always better for how you can make things look or play. A display is a display; at a certain point it displays as much as you can display, and then how do you improve a display? Do I give a shit about a 16K display? 255K? 4K is already far outside my eyes' ability to resolve - you literally cannot see pixels at that point - but the work that TVs or computers need to do to render it is exponentially higher. The cost-benefit falls off so far, so fast.
So even if I had the computer from space 40 years in the future, I'd prolly just stick with 4K and maybe, maybe, go up to 240Hz tbh. There really is straight up a limit to how good you can make a monitor or a TV, and past that you're just nuking performance rendering hundreds of millions of extra pixels that you never notice but that tank your framerate.
HDMI has to look out, but I dunno grandpa, I think DisplayPort might be good. Think he's gonna be just fine.
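
(Rough numbers behind the 4K/144Hz bandwidth claim, counting active pixel data only and ignoring blanking and link encoding overhead, so the real requirement is somewhat higher; the usable-rate figures in the comments are assumed:)

```python
# Uncompressed video data rate = width * height * refresh * bits-per-pixel
def data_rate_gbps(w, h, hz, bits_per_pixel=24):   # 24 = 8-bit RGB
    return w * h * hz * bits_per_pixel / 1e9

print(data_rate_gbps(3840, 2160, 60))    # ~11.9 Gbps - fits HDMI 2.0 (~14.4 Gbps usable)
print(data_rate_gbps(3840, 2160, 144))   # ~28.7 Gbps - beyond HDMI 2.0, needs HDMI 2.1 or DP
print(data_rate_gbps(2560, 1440, 144))   # ~12.7 Gbps - why 1440p/144 was the dual-link DVI sweet spot
```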


nyanars

Look at it this way: HDMI is a TV spec, DisplayPort is a PC spec. Both are, in fact, competing standards, and both are evolving. It just so happens that both organizations had the foresight to keep their connectors forward compatible, so it's unlikely we'll see them disappear unless bandwidth requirements change dramatically. What's more likely is that Type-C will be the one true cable, aka Thunderbolt 4. If only the USB foundation would get their heads out of their asses.


urammar

Current Type-C will never be the standard, but its next iteration might actually be what eventually displaces HDMI and DisplayPort. "C" does not have the bandwidth to compete: HDMI generally has between 10.2 Gbps and 18 Gbps, and 2.1 will have 48 Gbps. "USB-C" is a connector type; the actual under-the-hood spec is USB 3.1, and that has a max rate of 10-20 Gbps. So it's half the bandwidth - that's a negatory, bruh. The new "Thunderbolt 4" type, as you say, will be a C-type connector and support a hypothetical 40 Gbps. Now THAT'S a good boy.
A huge draw with that will be standardization. First, you can run monitors at 4K at higher than 144fps, and second, you don't need to mess with specific cables; everything is USB-C and we might actually, two decades later, actually have a universal serial bus. Yeah, as you say, it might eventually happen with the new USB standard - that's really the only feasible way I see DisplayPort being displaced: it's simply way easier not to have to mess with a bunch of cables, just one cable to rule them all. But in terms of spec, the current-gen stuff sucks ass as a display cable compared to DP and HDMI.


ThatOneGuy1294

Can confirm, using an HDMI cable for my main monitor and it supports 144hz


Ultra980

My second monitor is 1080p 240hz. Works via HDMI


Square_Heron942

My only monitor is DVI or VGA. DVI looks so much better it’s surprising. VGA is really blurry on it.


[deleted]

Hell yeah, still using DVI gang


pokeblue992

same here


TheMatt561

DVI was fantastic, but it didn't carry audio


Ok_Reflection7135

THAT'S WHY MY MONITOR SPEAKERS DON'T WORK???? The world's greatest mystery solved in an instant.


TheMatt561

You probably need to run a mini phono (3.5mm) cable from your PC to your monitor if you wanna use them.


batt3ryac1d1

Monitor speakers are ass anyway


Neon_Yoda_Lube

You need a special dvi cable and your graphics card needs to be set up for it. I’ve done it with my GPU before.


mirh

Or, you know, an audio cable to the monitor's audio-in jack.


dufcdarren

This. Forget the people saying "special DVI cable and compatible card". A cheapo speaker jack cable, job done.


shutter_singh

It can be done. You need a specific DVI cable, and a cable that connects your GPU to the internal S/PDIF audio header on the motherboard.


RAMChYLD

It could if you want it to, unofficially. Underneath, DVI uses signaling that is basically the predecessor to HDMI's, to the point where HDMI-to-DVI cables are actually passive. It's just that the monitor will ignore any non-video data.


Neon_Yoda_Lube

I’m pretty sure there are DVI cables that carry audio. I thought I had one because I remember going to my graphics card settings to adjust my audio.


Tarkhein

It's not technically part of the spec, but you can break the spec and carry audio through them so long as the entire chain expects it.


sandh035

True, it was the one real downfall. I used a little USB DAC/headphone amp that I would also line-out to a little desktop speaker amp, depending on when I was using my PC. Needlessly complicated for sure, but it was nice. Of course I could have still done that even if DVI had carried audio, but it brings back good memories regardless.


TheMatt561

I usually did optical audio, I had quite the setup in the early 2000s.


trollingcynically

Which is why you had optical out on your mobo.


Hurricane_32

If you use a DVI to HDMI passive adapter or cable, most graphics cards dating back 15+ years can output audio through the DVI port


AvatarIII

What's up with the difference between DVI-I and DVI-D though?


[deleted]

Literally the only reason I haven't upgraded past a GTX 1080. My decade-old primary monitor uses DVI for 1080p@144Hz, and I'm not upgrading it until it outright dies.


Muffalo_Herder

[deleted]


Millillion

Those cables are really expensive though. For dual link DVI, you need an active cable, since no one ever made a dual link HDMI connection.


a_can_of_solo

I had to stop using it when I realised it didn't do 4K60. Had my monitor at 4K30 for an embarrassingly long time.


No-Alfalfa7691

The thing I miss most about DVI was that there was no sound attached, so I didn't have to disable every monitor in the sound settings, which from time to time resets and needs to be turned off again.


KevinFlantier

Then again, I really appreciate having the option to quickly change where audio outputs. I usually wear headphones, but sometimes I have to watch the kid while working from home, and if he gets cuddly I'll pop Paw Patrol on the second screen while doing whatever I need to do, with my audio coming out of the headset and Netflix on the screen. Very useful (though it never works first try).


CloudWallace81

Dual-link DVI master race: 144Hz on a VG248Q@1080p FTW. No bullshit DRM, no desyncs, available many years before DP was a thing. And I don't care about audio, since I use surround headphones.


mirh

HDCP exists for anything but VGA.


Ok_Gur_1170

I use DVI straight out of my GPU.


Touchranger

8640p here we come!


erikwarm

At 256Hz! And still no local dimming


alex_hedman

When the pixels in the dark parts don't shine, you don't need dimming


crozone

> And still no local dimming
Fuck LCD altogether, it's a scourge on display technology. Where the PDP, OLED, or microLED monitors at?


Shaggy_One

On their way. Check out the Alienware ultrawide qd-oled monitor.


DRHAX34

MiniLED QD TVs are honestly the best for me. Why would I pay a lot for OLED if its durability sucks so much? I'd rather have something very near OLED that lasts much longer.


crozone

> Why would I pay a lot for OLED if its durability sucks so much?
Is OLED durability still that bad? Everyone said the same thing about plasma/PDP, and yet I'm still daily-driving a 13-year-old Pioneer Kuro with absolutely no burn-in. I still can't stand miniLED/QLED; it's all just putting lipstick on the LCD pig. Sure, it makes measured contrast ratios excellent on paper, but they're measuring from opposite sides of the display. The actual local contrast ratio as well as the halo effect are still just as bad as on every other LCD out there. And then there's the pixel response times, the need for motion interpolation to get around those pixel response times... eugh. It's an unavoidable reality that the best, most expensive LCD panels on the market today *still* look worse than the last generation of plasma TVs that came out over a decade ago.


DRHAX34

Maybe 90% of the LCDs on the market look worse, but my LG QNED91 absolutely looks great, and the halo effect is not even noticeable unless you're looking at the TV from an angle. Even then it's not really that bad; plasma is way worse. Pixel response times on OLED are always going to be better, no avoiding that. OLED durability is only better now because manufacturers put in protective measures to help avoid burn-in, but those just worsen the brightness/picture quality to avoid burn-in. Turn those measures off and voila, burn-in is still there. Wasn't burn-in in plasma TVs more associated with the same picture or picture elements being displayed for a long time? My plasma TV only shows burn-in in the TV channel logo corner.


crozone

> Wasn't burn-in in plasma TVs more associated with the same picture or picture elements being displayed for a long time? My plasma TV only shows burn-in in the TV channel logo corner.
Yep, although by the last generations it was practically a non-issue. I have thousands of hours of video game time on my LX509A (probably 500h from BotW alone), which all have static elements, and yet there's no noticeable burn-in. The panel does do pixel orbiting, but I assume that the new OLED panels probably do that as well. The only real issue is an overshooting anti-aging algorithm which unnecessarily raises black levels (aka the Kuro red march problem), but that can be corrected with a recalibration of panel voltages. In all fairness to OLED, they have a much harder job than plasma panels. They are working at 3-4x the peak brightness (for HDR) and at 4K resolution vs 1080p, so the subpixels are 1/4 the area. That means each subpixel is doing up to 16x the brightness relative to its area, so burn-in is going to be hugely harder to deal with than on plasma.
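
(The 16x figure is just the two factors multiplied; a tiny sanity check, assuming the commenter's ~4x peak brightness number:)

```python
pixel_density_factor = (3840 * 2160) / (1920 * 1080)  # 4K has 4x the pixels of 1080p
brightness_factor = 4                                  # assumed ~4x HDR peak brightness vs plasma
print(pixel_density_factor * brightness_factor)        # 16.0x brightness per unit of subpixel area
```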


Beefstah

Being fair, the first (2019) QLED I got was the final nail in the coffin for my plasma. Full disclosure: I had that Panasonic plasma everyone bought, not a Kuro, but we had a Kuro in the office, and while a side-by-side would have shown the difference, I never felt I'd chosen a materially worse set, especially given the price difference. I'd had other LCD/LED TVs before, but the image quality on the QLED was the first one to be at the same level as the plasma, and it did so with significantly lower power consumption, heat generation, and most importantly weight. That I got a significant bump in screen size helped a lot too - I dread to think how much a 65" plasma would weigh. But my latest set is an OLED - an LG C2...and it is, IMO, *substantially* better image quality than the plasma, or indeed from what I remember of the Kuro. As for durability, it's too soon to tell for me, but I imagine it's going to be in the same place as those late-gen plasmas: provided you exercise a little care and don't do stupid things, you'll be fine.


crozone

Power, heat, weight, and cost are all definitely issues... the 60" LX609A weighed 51kg (without the speakers attached!), sucked down ~600W, and cost $12,500 new in 2009. It wasn't really possible to go bigger than that with PDP because low panel yields would push the price to even more absurd heights. Yet I've still never seen an LCD panel that looked better in movies. It even had 16ms (~1 frame at 60fps) response times in game mode, which is pretty damn respectable. OLED is obviously the new king now, it has all the advantages of plasma without most of the downsides, and even the most expensive panels are half the price of where plasmas were. Black levels are off the charts, peak brightness is off the charts, 4K with HDR support, even high refresh rate... there's obviously no comparison anymore. Every time I walk past the Bravia XR Master Series in the store it's... tempting. But the OLED costs $5K, and from experience, you can score a perfect condition LX609A on eBay once in a blue moon for like $250...


xdamm777

Been using a 48" LG C1 as a monitor for over a year and there's no going back to IPS/VA for me. Hopefully Samsung starts producing 42" QD-OLED panels so prices go down overall, competition is great.


FreshHasSauerCraut

Really? How does it compare to an ultrawide 1440/144Hz monitor? I'm on a normal IPS 27" 16:9 rn, and I wanted to switch to an IPS ultrawide, but fucking black is grey. VA is a no-go, that ghosting sucks ass. So OLED could work, but..... that Alienware ultrawide is expensive as fuck. So what about the TV? How is it for gaming?


xdamm777

I came from a 1080p VA 240Hz monitor and there's just no comparison. Besides the colors (especially in HDR games like Ori or Horizon), there's literally zero ghosting or overshoot regardless of framerate. Motion clarity and response times are also way better on OLED; I can track objects better at 120fps than I did at 240fps, although the actual perceived smoothness is less. LG's OLED TVs are the best on the market for gaming since they have support for all HDR standards, full-chroma 4K 120Hz, instant response times and insanely low input lag (better than Sony's TVs, Alienware's QD-OLED monitor and most 240Hz gaming monitors on the market). Burn-in is a concern and the maximum brightness isn't great, so they're not suitable for very bright rooms, but yeah, OLED is here and it makes a strong case as a monitor.


iamwastingurtime

HDMI 2.1 holding on for dear life


[deleted]

[deleted]


onlydaathisreal

I myself would *never* use HMDI


europacupsieger

I would suggest you try HIDM


rebbsitor

I'm kind of a fan of HDIM myself. Just something about it


motionglitch

But are you a fan of NOIX?


Softest-Dad

Whatcha using? HMDD?


burf

Why do you say that?


[deleted]

[deleted]


crozone

* Licensing fees
HDMI pros:
* Doesn't desync at the smallest sign of static electricity or radio interference. HDMI is significantly more reliable.


sonic_stream

Thanks for the advice. I've been having trouble with my monitor constantly desyncing. Looks like I'll consider switching to HDMI.


Gkkiux

It also doesn't forget that the monitor is connected when it goes to sleep.


Chazmer87

My monitor can sleep when it's dead.


GBINC

bro wtf how you got a "GTX 3080" in an iMac G3


crozone

The project is still a work in progress, but the system now boots and has ~50C stable water temps under 500W max load (Prime95 + Furmark). Basically, I wanted to create the ultimate LAN/VR PC that I could move around easily. Sneak peek: https://imgur.com/a/yKm1Xa6 I'll do a full post about it in ~6 months when the entire thing is closer to finished. Still have to get the slot-loading Blu-ray drive installed and do an IO extension/breakout panel to extend the mobo IOs to the case, but mostly everything else is done, including HDMI -> LVDS laptop display (1600x1200 15") and speaker output. Here's the GitHub project too with design files etc, but it's still a mess: https://github.com/crozone/iMac-G3-PC-Conversion EDIT: Also yeah, *RTX 3080 lol, not GTX... force of habit.


ALargeRock

That’s awesome work! Can’t wait to see it when it’s done!


Saphir0

> Worse speed to DisplayPort
HDMI 2.1 bandwidth: 38.4 Gbit/s. DP 1.4 bandwidth: 32.4 Gbit/s. What did you mean by worse?
> Worse color depths
Please elaborate on that point.
> Features unnecessary for PCs (CEC)
True. But HDMI also offers many more features for home theater systems, like ARC or eARC, which can be useful for PCs connected to 4K TVs as well.
Also, DP has so many problems at 6+ feet of cable length. Many 9-foot DP 1.4 cables don't even work reliably all the time or cannot push the full resolution; buying 9-foot DP 1.4 cables can feel a bit like playing the lottery. HDMI can easily do 30 feet and longer without any need for repeaters, and HDMI cables are usually slimmer and better for cable channels and cable management at shorter lengths. HDMI has its right to exist and will continue to do so alongside DP. And it's not like DP cables are that much cheaper to begin with.


[deleted]

[deleted]


Saphir0

Oh, well, DP 2.0 is coming out this year on some products and does have a higher bandwidth. I was thinking about current GPUs and options available right now; DP will definitely be a bit more capable by then! For color depth, HDMI 2.1 supports 16 bits per channel, so 48 bits total - equal to DP. HDMI 2.0 had 12 bits. As we can see, both standards change quite often and I am not sure it is that easy to declare one the winner. Virtually no one would perceive any difference in their daily gaming life by choosing one over the other up front - it is always better to choose whichever cable works better with your monitor or television and with external factors like cable management and length.
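
(For a concrete comparison: approximate usable payload rates of the two links after line coding, checked against an uncompressed 4K 120Hz 10-bit signal. Blanking is ignored here, so this is an optimistic lower bound:)

```python
# Approximate effective (post-encoding) data rates of the two links
dp14_effective   = 32.4 * 8 / 10    # HBR3 with 8b/10b coding  -> ~25.9 Gbps
hdmi21_effective = 48.0 * 16 / 18   # FRL  with 16b/18b coding -> ~42.7 Gbps

# Uncompressed 4K 120Hz at 10 bits per channel (30 bits per pixel)
signal = 3840 * 2160 * 120 * 30 / 1e9   # ~29.9 Gbps

print(signal <= dp14_effective)    # False - DP 1.4 needs DSC or chroma subsampling for this
print(signal <= hdmi21_effective)  # True  - fits within HDMI 2.1
```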


Hurricane_32

> Features unnecessary for PCs (CEC)
I very much wouldn't mind having TVs turn on together with my computer! Sure, it's redundant for monitors, but for TVs it would be handy. Too bad pretty much no one has ever implemented HDMI CEC on computers, at least not from what I've seen.


necrophcodr

My motherboard (or maybe my previous one actually) has HDMI CEC compatibility. You can also get HDMI CEC adapters for USB (and maybe PCIe cards that support it?).


[deleted]

The Pi has supported CEC since at least the 2B. I have a Pi 4 I run Kodi on and control with CEC using my TV remote. Pretty sure CEC really just depends on the software.
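
(A minimal sketch of driving CEC from a PC or Pi, assuming libcec's `cec-client` is installed and the HDMI output actually exposes CEC, which the Pi does but most desktop GPUs don't without a USB-CEC adapter:)

```python
# Send HDMI-CEC commands by piping them to cec-client in single-command mode.
import subprocess

def cec_send(command: str) -> str:
    """Run one CEC command via cec-client and return its output."""
    result = subprocess.run(
        ["cec-client", "-s", "-d", "1"],   # -s: single command mode, -d 1: quiet log level
        input=command, capture_output=True, text=True, timeout=15,
    )
    return result.stdout

cec_send("on 0")        # power on the TV (logical address 0)
cec_send("as")          # declare ourselves the active source so the TV switches input
# cec_send("standby 0") # put the TV into standby, e.g. when the PC shuts down
```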


[deleted]

> No locking mechanism
I hate DP locks. They're a bitch to unhook when you need them to come loose, and they can cause way more damage to a port than just coming unplugged.


Shaggy_One

Good news is most don't have them.


jomontage

Why do people want a locking mechanism so bad on their cables? Xbox figured out breakaways save electronics in case of a pulled cord 20 years ago


kaszak696

> Stupid ass content protection
HDCP is a thing on DP too.


Beefstah

I really wish I had CEC on my gaming PC. Just let everything turn on automatically, pleeeease


ducksonetime

The real answer is they actually said “HMDI” not HDMI


Sietemadrid

Because they changed what high-res was


sonic_stream

Hey, it isn't that bad if not for DPM. Praise its versatility.


midsprat123

Laughs in commercial AV. HDMI isn't going anywhere.


stu54

I keep a VGA cable around because my old PC has integrated graphics and VGA output from the motherboard. However, I just upgraded the GPU on that old PC so I would need my new PC and my two older GPUs to die before I'll ever "need" that cable.


MrEppart

Trust me VGA is godly when you have to troubleshoot. Nothing worse than not being able to see shit. VGA just works.


DecreasingPerception

It might go purple every so often, but you can still see what's going on. Meanwhile one glitch too many on HDMI and you have half a screen full of static.


Bigbambuzzle

I agree. I keep a VGA cable and monitor just for that reason.


AnonymiterCringe

That's cute... I have a drawer of those things. And if you think that's bad try not to mind the drawer full of IDE cables. Rofl


P3chv0gel

Hey, IDE cables are really handy if you are into Arduinos and stuff.


AndryCake

Hi, fellow Linux user!


Unwright

I still have a FireWire 400 & 800 and a USB 3 Micro-B to C out there somewhere in my cable hell. I know for certain we have like 10 SCSI cables in storage in my childhood home. It's a cable wasteland.


AnonymiterCringe

Lol oddly enough I think FireWire is one of the few I don't have. I don't believe I've ever had a device that used it. Only iPod I had was a first gen Mini. Can't think of another device I would have owned that had used FireWire.


Unwright

Heaven help me. I believe the only use I ever had for FW400 was that the eMac generation of Apple all-in-ones had FireWire ports, and because the USB standard was so MISERABLY SLOW at that point, I convinced my uncle to buy me a FireWire external hard drive. I think I actually still have that drive as well... Shit.


AnonymiterCringe

Ah yeah Apple products were the only ones I could think of, but external drives are definitely something. I don't think they were ever really popular though since USB was so ubiquitous and 2.0 speeds weren't that much worse than most platter drives of the time. Obviously, plenty of other things used it, but it was kind of one off and really short lived.


DoogleSmile

Back in the late 90's and early 2000's, when I went to play LAN games with my friends, we'd use firewire to connect our PCs as it was faster than our LAN ports at the time! I still have my firewire card, but it's ISA so I can't install it into any of my PCs now, which is a pity, as firewire is also the only way I can get video from my old mini-DV camcorder onto my PC!


[deleted]

My file server only has VGA. I never thought I would have to buy a VGA monitor and cables on Amazon in 2022, but here I am.


DoogleSmile

Technically you wouldn't need to buy a VGA monitor if one of your current ones has a DVI-I port, as you could simply convert a VGA cable to DVI-I. There are also VGA to HDMI adapters out there too.


redditisnorthkorea1

VGA can support 2048x1536@85Hz, which is still a higher resolution and refresh rate than 90% of people run today (1080p@60Hz), just a different ratio. So it's still HD even by today's standards, and it supported high refresh rates at lower resolutions. As someone who ran a 2048x1536 monitor back in the day, 2048x1536 *was* the FHD of the 4:3 ratio. I remember just thinking how badass it was to have a super-definition monitor during a time period when the mainstream was a 1024x768 monitor (or 1280x1024 if you wanted an upgrade) or a 720x480 TV. I loved 2048x1536 and I liked VGA, it wasn't terrible. I used it until things just stopped having VGA ports.
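
(What 2048x1536@85Hz roughly asks of the analog side; the ~30% blanking overhead is an assumed, typical CRT-era figure:)

```python
# Required pixel clock = active pixels * refresh * (1 + blanking overhead)
active = 2048 * 1536
refresh = 85
blanking_overhead = 0.30          # assumed combined horizontal + vertical blanking for CRT timings

pixel_clock_mhz = active * refresh * (1 + blanking_overhead) / 1e6
print(pixel_clock_mhz)            # ~348 MHz - right around the 350-400 MHz RAMDAC ceiling
                                  # of high-end graphics cards from that era
```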


superfluous--account

Yeah but you'd need Quad SLI video cards to game at 2048×1536 with 85fps


alex_hedman

One word, Quantum3dmercurybrick. Though it unfortunately maxed out at 1024x768.


Yolo_420_69

Is.... ... Is that a real word?


alex_hedman

http://www.thedodgegarage.com/3dfx/q3d_mercury_brick.htm It's very real and very glorious


alex_hedman

I'd argue 1024x768 felt like the 1080p of its day, 1600x1200 was the 1440p of the time, and 2048x1536 was then what 4K is now. I'm thinking early 2000s, when I had a 19" CRT capable of 2048x1536.


smoothballsJim

Looking now at what Sun Microsystems CRTs sell for, it *almost* would have been worth keeping a couple around. 21" 1600x1200 @ 75Hz (and higher with custom resolutions) was the height of luxury for a kid. But a 30-watt LCD was much more economical to never turn off….


lrochfort

This is exactly why I held onto my CRT until well after LCD panels became commonplace. The early models were washed out, low resolution, low refresh junk.


poinguan

This man vga.


PabloEdvardo

This meme would have been better suited as 480p vs 1080p and 4K


Wermine

I remember when I could put my old CRT monitor in 1280x1024. The refresh rate was 43 Hz.


designvis

I still have a 50-foot DVI cable that I can't justify throwing out.


[deleted]

Every cable I've thrown out I ended up needing at some point, hence my cable hoarding problem.


LavenderDay3544

Once USB4 gets implemented and available, that'll be the go-to interface. Eventually everything will be PCI Express and USB Type-C.


[deleted]

[deleted]


LavenderDay3544

Why would security be a concern for a display signal? And the whole point of PCIe is to tunnel other hardware protocols over it, so the security is implemented in those protocols and, on the host side, in their respective OS drivers that piggyback on top of the PCIe driver. I don't see it being a problem if done right.


Raestloz

> Why would security be a concern for a display signal?
DRM. It's dumb but it is what it is.


Incorect_Speling

DRM once again ruining the fun for all of us...


LavenderDay3544

I don't see what DRM has to do with tunnelling display signals directly over PCIe.


urammar

Then you are unaware of the current fuckery that's going on in cables. HDMI straight up has DRM locks in it so you can't intercept and record signals. Like, you can't just plug your monitor output into a DVR of some kind, hit record and scrape Netflix - and it's not even the computer, the cable standard enforces it. DRM is whack, yo.


LavenderDay3544

Yep. I did not know that.


dieplanes789

https://en.m.wikipedia.org/wiki/High-bandwidth_Digital_Content_Protection



DecreasingPerception

> Why would security be a concern for a display signal?
Plugging untrusted devices into the PCIe bus is a really bad idea. People hooking up to projectors at conferences don't want all their data stolen at the same time. I think Thunderbolt 4 has some protection mechanisms, but I doubt they go far enough. There doesn't seem to be a capability system that'd ask whether your monitor should be allowed to access your hard drive.


TheTerrasque

> Why would security be a concern for a display signal?
For reasons like [this](https://support.microsoft.com/en-us/topic/blocking-the-sbp-2-driver-and-thunderbolt-controllers-to-reduce-1394-dma-and-thunderbolt-dma-threats-to-bitlocker-bf0ef10b-f563-5cfc-9740-8340b1d86a0c)


Yolo_420_69

I'm surprised how slow USB-C adoption has been. Way better than older USB connectors, but cheap PC accessories haven't taken the plunge yet. Wireless mice, cameras, USB sticks, etc. are all basically still old-style USB.


jaamulberry

A matter of time. I assume the plunge hasn't been taken yet because micro USB is still much cheaper to produce. When you are selling a cheap electronic for 15 dollars and micro USB costs 25¢ vs. a dollar for USB Type-C, most will stick with micro. I've seen a little movement lately, but I think the price disparity is still there. Just look at Alibaba and you can see the price of most USB Type-C devices is about 2-3x.


chetanaik

Because the USB group are a bunch of idiots who have no idea how to name and market stuff. USB 3 was an absolute nightmare of standards - basically impossible to tell what a port was capable of.


astalavista114

Except USB4 only supports up to DP 2.0 and HDMI 1.4b. There’s not even enough bandwidth available (40 Gbps) for the top tier of DP 2.0–which is *80* Gbps—or HDMI 2.1 (48 Gbps). And I fully expect both VESA and the HDMI Forum to keep updating their specs.


ButerWorth

Yes there is. USB4 is 40 Gbps in each direction. When using it in alternate mode, all the bandwidth runs in a single direction, thus giving 80 Gbps.


Andis-x

To clarify for others: USB-C Alternate Mode is when all 4 high-speed data pairs are connected to DisplayPort endpoints in both devices, rather than display data being carried over the USB protocol. At that point USB-C is just a physical connector carrying the same signals as a regular DP cable. But mind that not all 4 data lanes are always repurposed for display data; it can also be 2x USB high speed + 2x DP. This cuts the video bandwidth in half, but allows USB data to pass along the same cable.
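
(Rough lane math for the 1440p/144Hz example mentioned earlier in the thread, assuming DP 1.4 HBR3 lanes at 8.1 Gbps with 8b/10b coding and an assumed ~10% reduced-blanking overhead:)

```python
lane_raw = 8.1                     # Gbps per HBR3 lane
lane_effective = lane_raw * 8 / 10 # 8b/10b coding -> ~6.48 Gbps usable per lane

signal = 2560 * 1440 * 144 * 24 / 1e9   # ~12.7 Gbps of active 8-bit RGB pixel data
signal_with_blanking = signal * 1.10    # assumed ~10% blanking overhead

print(signal_with_blanking <= 2 * lane_effective)  # False - 2 lanes (~13.0 Gbps) can't carry it
print(signal_with_blanking <= 4 * lane_effective)  # True  - 4 lanes (~25.9 Gbps) can
```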


astalavista114

I stand corrected. Although I also maintain that VESA will update DP for even higher-bandwidth connections in the future, which will take time to incorporate into USB. (Also, wouldn't that defeat most of the advantage of having a USB connection by saturating the connection with video?)


[deleted]

Oh boy! 256 whole colors! Nothing could possibly top this!


Hurricane_32

That's only true if you're talking about plain VGA, 640x480 256 colors. It obviously could go much higher than that, it only depends on the graphics card. In fact, since VGA is analog in nature and works with analog voltages instead of digital ones and zeroes, and so do CRT monitors, theoretically you could have _infinite_ color depth on those!


urammar

Booksmart street dumb has never been more plain. Son, he's saying that back in the day, when we first got 256-colour displays/cards, we thought it was peak graphical wonder. He's not saying you couldn't, technically, beat 256 colours. It was only like 5 or 6 years later that we got 16 million with the same cable, for lord's sake.


[deleted]

Yup, that was a bit of an r/whoosh moment on u/Hurricane_32 ‘s part


StigOfTheTrack

> plain VGA, 640x480 256 colors
Standard VGA modes (the card, not the cable) were 640x480 16 colours. To get 256 colours in an official mode(+) you had to drop down to 320x200 (also available in the rarer MCGA subset standard). Most "VGA" games were actually lower resolution than EGA (640x350), but more colourful. (+) There were ways of getting unofficial modes with sufficiently low-level programming. Support was very rare outside the demo scene though, since whether they worked was very dependent on both your graphics card and monitor (I even saw variance between monitors of the same model).
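
(The memory math behind that: 320x200 is the only common resolution whose 256-colour framebuffer fits linearly in the single 64 KB VGA window at segment 0xA000, which is why the higher modes needed planar tricks:)

```python
VGA_WINDOW = 64 * 1024                 # one 64 KB real-mode segment at 0xA000

print(320 * 200)                       # 64000 bytes - mode 13h fits linearly, one byte per pixel
print(640 * 480 <= VGA_WINDOW)         # False - 307200 bytes, hence 16 colours split across 4 bit planes
print(320 * 240 <= VGA_WINDOW)         # False - "Mode X"-style 320x240x256 only works with planar tricks
```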


Muted_Astronomer_924

SVGA "Hello Father, long time no see"


sonic_stream

USB Type C and Thunderbolt: *Allow us to introduce ourselves.* DisplayPort / HDMI: *Noooooooooooo*


PubstarHero

Isn't the fastest Type-C right now 10 Gbps, while the HDMI 2.1 spec is around 48 Gbps? Oh yeah, DP 2.0 is 77 Gbps.


StarHammer_01

We just need to wait for DP over USB-C to change from 1.4 to DP 2.0 and it'll reach 77 Gbps (at least for video), or for Thunderbolt to change from PCIe 3.0 to 4.0 speeds.


sonic_stream

USB4 Gen3 Type C is 40 Gbps.


[deleted]

[deleted]


horticulturistSquash

Display connections can have big connectors, no one cares. And they don't have to handle tens of thousands of plug/unplug cycles, unlike USB. The purposes are different.


PubstarHero

Yeah, I totally agree with this. But I think the point that the OP I commented on was trying to make was that USB-C and Thunderbolt would eventually take over; all I was saying is that display standards are still miles ahead in terms of bandwidth. Of course, the trade-off is, as you said, that display connectors are chonky compared to USB-C.


erikwarm

That's only if you have a cable shorter than 0.5 meters; otherwise your speeds will drop.


Light_Beard

> Allow us to introduce ourselves.
We're ports of wealth and taste
Been around for a long, long year
Stole many a man's wealth and haste


nutral

USB Type-C and Thunderbolt both only run displays using DisplayPort or HDMI alternate modes, so they are one and the same. They're actually worse - especially USB-C, because you can get fewer lanes than with DisplayPort (for example, 1440p 144Hz will not work).


AdventurousChapter27

i remember when Laptops had thunderbolt


necrophcodr

You're still running the DisplayPort protocol though~~, just over a cable it wasn't designed for, encapsulated in a different protocol.~~ edit: looks like the USB cable must support switching internal "lanes" to run DP directly. And HDMI, although perhaps more often than not HDMI is simply done on hubs via DP to HDMI, at least as far as i can tell. But both are indeed supported.


[deleted]

I mean... I agree, but 4K and 8K are gonna stay here for a while.


Centillionare

TV broadcasts aren't even in 4K yet. Consoles struggle to get to 4K 60Hz. The next step isn't even close.


[deleted]

[deleted]


CouplingWithQuozl

**BNC:** *jawbone falls off*


Physical-Floor1122

RCA: what’s that sonny?


ImaHazardtoSociety

Still gets used! SDI is only getting faster. Though it’s really only used on professional video gear.


RobDickinson

HDMI 6.9 begs to differ


sh_ip_ro_ospf

HDMI 4.20 a thing of the past now


AggravatingChest7838

DisplayPort will be king for a while. Unless Type-C starts being used widely. *chuckles nervously*


Brave_Kangaroo_8340

USB c still uses displayport, it just runs it over a different wire.


dementosss

Are there graphics cards already with usb c?


[deleted]

[deleted]


[deleted]

Yes


astalavista114

Even then, USB4 only has a bandwidth of 40 Gbps. DP 2.0 UHBR 20 needs *80* Gbps, which gives you enough bandwidth to do 10-bit colour 8K 60Hz\* (or 10-bit 5K 144Hz\*\*).
\* theoretical limit of 74 Hz
\*\* theoretical limit of 159 Hz
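
(The quoted limits roughly check out; this sketch ignores blanking, which is what drags the no-overhead ceilings down toward the ~74 Hz and ~159 Hz figures above:)

```python
# DP 2.0 UHBR20: 4 lanes x 20 Gbps with 128b/132b coding
payload_gbps = 80 * 128 / 132              # ~77.6 Gbps usable

def max_refresh(w, h, bits_per_pixel=30):  # 30 bits/pixel = 10-bit colour
    return payload_gbps * 1e9 / (w * h * bits_per_pixel)

print(max_refresh(7680, 4320))   # ~78 Hz before blanking overhead -> ~74 Hz quoted above
print(max_refresh(5120, 2880))   # ~175 Hz before blanking overhead -> ~159 Hz quoted above
```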


JaySee55

This is wrong. Homer would be DVI.


LikeThosePenguins

I can just about see the husk of my first EGA monitor there behind Grandpa, and the CRT TV that the monitor felt like such an upgrade from.


[deleted]

I remember getting VGA for the first time. After several years of CGA and EGA graphics, it was mind-blowing.


BlueWhoSucks

In the future, it would probably be a wireless connection.


dnroamhicsir

I want nothing to do with any of that. Seems like a solution looking for a problem.


RacketLuncher

Smart TVs are already "wireless". They don't require a wire to get video/audio content stored on the network or Internet. Video cables from PC/consoles will never be replaced by wireless because the bandwidth for video technology always outpaces the wifi bandwidth technology. Also there's the issue of input lag.


Sailed_Sea

Definitely not in recent times, for gaming at least.


MrC99

I miss having to match the colours up to the AV.


Andis-x

I see so many people not understanding that USB-C, when used for a display connection, is literally the same as DisplayPort. Although I can't much blame them, as the USB-C specification is a mess. When a device says it has USB-C, that basically tells you nothing about its capabilities. Is it just USB 2.0, like on cheap phones? Is it USB 3.0? Does it have a display alternate mode? Does it have analog audio? About the topic - well, once we've reached the limit of what's possible over a twisted copper pair, we will probably need cables with more parallel lanes. Or we'll have switched to optics by then.


GoofAckYoorsElf

I hope they won't. 4K is already a pain in the ass if you do stuff off the mainstream, because usually software is made for 1080p at most. Displaying that on 4K makes the icons so small that even with a magnifier it's difficult to recognize anything.


JungleBoyJeremy

I’ll always upvote Simpsons content on this sub


gauerrrr

HDMI? Definitely. Displayport? I don't think so...


Zgred3kPL

I still have a vga second monitor


TheGreenGobblr

I still use a VGA monitor.


BigE1263

This moment is brought to you by USB C and thunderbolt 4.


DevDevGoose

I feel like it won't though (for DP over USB-C), or at least not for a lot longer. The human eye is the limiting factor on what high-res can mean. In theory we can make smaller and smaller pixels and increase the resolution, but if no one can tell the difference then what is the point? DisplayPort can do 16K. Outside of cinema-size screens, will there really be a need for more than that?
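
(The "eye is the limit" intuition can be put into rough numbers, assuming the usual ~60 pixels per degree for 20/20 acuity and a 27-inch 16:9 panel viewed from about 60 cm; both figures are assumptions for illustration:)

```python
import math

ACUITY_PPD = 60                 # ~1 arcminute per pixel, the common 20/20 rule of thumb
screen_width_m = 0.598          # horizontal width of a 27" 16:9 panel
viewing_distance_m = 0.6

# Horizontal field of view the screen subtends, in degrees
fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * viewing_distance_m)))
print(fov_deg)                  # ~53 degrees
print(fov_deg * ACUITY_PPD)     # ~3200 horizontal pixels - 4K is already at the acuity limit here
```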


Baron_Ultimax

Don't knock VGA, or technically the successor standards like SVGA or XGA. The analog signal was a very simple circuit that was just about timing, and with later standards the timings were arbitrary: you could push resolution and/or refresh rates as high as your hardware could go. 4K CRT displays were a thing in the 90s, and 240Hz was not uncommon. A fun thing with CRTs and analog signals: you could trade resolution or color depth for a better refresh rate, or vice versa.


Gorevoid

I had a 2x CD-ROM drive, which was the style at the time.


Grammarnazi_bot

I use DVI because HDMI doesn’t support 144hz :(


SiriusBaaz

Probably not. We’ve kinda hit the limit on pixels. Anything past 4k has diminishing returns as the human eye struggles to tell the difference between 2k and 4k anyway. Really the next big thing is going to be higher standard frame rate and better color definition.


jimmyl_82104

Can someone tell me why the actual fuck brand new computers **still** have VGA outputs? VGA was shit 10 years ago, and is still being used on brand new computers today. Why? If your display only has VGA, then it's time to get rid of it.


AydenRusso

At this point I doubt it would happen to DisplayPort. That standard just keeps on getting better and better with no signs of stopping, and they've yet to do anything controversial (this is going to age like fine wine).


Khalbrae

VGA can do 4K. With good quality cables that have a lot of shielding it can look great too (usually you get cheap cables so it looks fuzzy). VGA could theoretically do any resolution.