I don’t get why they couldn’t just do more 8-pin connectors. Three is common enough on things like 6900XTs. Four would give you a 600 W envelope too (quick math below).
Then you don’t have to redesign the PSU and they’re a known quantity. It’s not like this lets you make the card meaningfully smaller, heatsink and all.
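Quick math on that envelope point, using the PCIe spec allocations (150 W per 8-pin, 75 W through the slot; the connectors themselves are physically rated higher than the spec allocation):

```python
# Back-of-the-envelope power budget for stacked 8-pin PCIe connectors.
# 150 W per 8-pin and 75 W from the slot are the PCIe spec allocations.

PCIE_8PIN_W = 150   # spec allocation per 8-pin connector
PCIE_SLOT_W = 75    # spec allocation through the motherboard slot

for n in (2, 3, 4):
    total = n * PCIE_8PIN_W + PCIE_SLOT_W
    print(f"{n} x 8-pin + slot -> {total} W board power budget")
# 4 x 8-pin + slot -> 675 W, comfortably covering a 600 W card without
# any new connector standard.
```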
The 8 pin connectors are very bulky. So a new connector is definitely needed if they are gonna keep increasing the power demands, but this one ain’t it.
This connector is possibly ok. But really they should have put two on instead of running one at its limits.
I haven’t followed the recent failures, but the earlier failures were IIRC caused by improper insertion.
The connector should be redesigned to make something like that impossible. For example, it shouldn’t click if it’s not fully inserted, and the sense pins should be shorter so as not to deliver current until full insertion (roughly the gating sketched below).
IIRC it’s also pretty damn fragile and slight stress on the connector can start fires.
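For what it’s worth, the revised 12V-2x6 connector does shorten the sense pins along these lines. Here’s a minimal sketch of the card-side gating idea; the sense-pin-to-wattage mapping is from memory of the ATX 3.x / PCIe CEM spec and should be double-checked against the actual document:

```python
# Sketch of card-side power gating keyed off the 12V-2x6 sideband sense
# pins. The wattage encoding below is my recollection, not a verified table.

def allowed_power_w(sense0_grounded: bool, sense1_grounded: bool,
                    fully_seated: bool) -> int:
    """Return the power (in watts) the card may request from this cable."""
    if not fully_seated:
        # With shortened sense pins, a partially inserted plug leaves the
        # sense contacts open, so the card should refuse to draw power.
        return 0
    capability = {  # (sense0, sense1) -> advertised cable capability
        (True, True): 600,
        (True, False): 450,
        (False, True): 300,
        (False, False): 150,
    }
    return capability[(sense0_grounded, sense1_grounded)]

print(allowed_power_w(True, True, fully_seated=False))  # 0: not seated, stay off
print(allowed_power_w(True, True, fully_seated=True))   # 600
```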
It’s becoming harder and harder to argue against people stating idiocracy was a documentary…
Why use such shitty connectors? There are so many better options for this, like the XT90, which can support 90 amps.
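Rough current numbers behind that comparison, assuming the whole 600 W comes off the 12 V rail (the ~90 A figure is the XT90’s commonly quoted continuous rating):

```python
# Rough current math: total draw at 600 W, and how 12VHPWR / 12V-2x6
# spreads it across its six 12 V pins. Illustrative only.

BOARD_POWER_W = 600
RAIL_V = 12
total_a = BOARD_POWER_W / RAIL_V      # ~50 A total
per_pin_a = total_a / 6               # six 12 V pins in the connector

print(f"Total current at {BOARD_POWER_W} W: {total_a:.0f} A")
print(f"Per-pin current across six 12 V pins: {per_pin_a:.1f} A")
# ~50 A total and ~8.3 A per pin -- a single XT90-style connector rated
# around 90 A (or simply bigger pins) would carry that with far more margin.
```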
Because it doesn’t look like all the other cables and it’s not proprietary. Why not reinvent the wheel while we have the chance?
/s
Those would cost .04 cents more! Think of the shareholders!
I went with AMD for my most recent build out of pure knee-jerk spite and that’s looking like a better and better decision with each passing day.
I’m not going to say I called it, but I called it. I watched the 50xx series launch with only vague interest since I have no intention of buying an nVidia card again, ever. I saw that tiny little power connector kicked off at a weird angle like that on the 5080 in particular and thought, “Boy, that looks stupid as fuck. I wonder how many minutes it’ll be before those start catching on fire.” The 40xx’s getup was already a well-known issue, as we can see from the gymnastics apparently required, as illustrated in the article. It seems that they learned nothing other than that consumers will buy the damn things anyway. And neither did we, since people are still clamoring to buy them.
Mankind has known for a hundred years how to make an electrical connector big enough to handle the amount of current it’s supposed to pass. And it’s not like they didn’t have any goddamned room on the vast acreage of the 40xx and 50xx series cards to work with, either.
You know what? I haven’t had a single problem with my RX7900XTX, except that I wound up with an Asus one and I need to use Asus’ stupid Armoury Crate doohickey to mess with its single RGB light, as it does not appear to be compatible with iCue or OpenRGB. But you know what, if the thing is stuck in its default Unicorn Barf mode until an update or two comes down the pipeline and it hasn’t, like, set my computer on fire, I think I can deal with that.
5000 series has largely been subpar IMO. In a competitive market they would have been forced to keep the 4000 series in production, but we don’t have a competitive market for GPUs.
Yeah, I saw the Gamers Nexus benchmarks on it and for the money I’m really not impressed. With what these cost, a couple of percentage points don’t excite me. I already have a plenty fast graphics card.
The only thing I’m “missing out” on is nVidia’s attempted near-monopoly on raytracing, which is not a technology in which I’m the slightest bit interested because, gee-whiz factor aside, even on their very fastest flagship new card it still tanks your framerate to an unacceptable level (in my opinion) for no tangible benefit whatsoever.
The only issue I foresee is upcoming games that “need” RTX, i.e. the current incarnation of the id Tech engine for some batshit reason, but the only things that run that so far are Doom: The Dark Ages and that Indiana Jones game, which I likewise have no interest in. So fuck it.
(And I similarly do not care about DLSS or FSR or motion interpolation or any other kinds of fake frames, which are another absolute dumb-shit dead end.)
DLSS and FSR are actually really good 🤷♂️ free frames for very little quality degradation; seriously, it can make a massive difference in playability.
The frames they generate are not “free,” nor are they necessarily accurate. If your GPU can’t fill your screen at native resolution at a rate matching or exceeding the refresh rate of your display, you either need to turn down your graphics settings or invest in a beefier GPU, not make up fake image data to go in between (introducing input lag) or around (by deliberately rendering at suboptimal resolution and attempting to AI upscale the result). And attempting to exceed your display’s refresh rate by making up additional fake frames is literally pointless, just setting electricity on fire for no benefit.
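A back-of-the-envelope illustration of the input-lag point, using a simplified model where interpolation holds back each real frame by roughly one render interval (actual DLSS/FSR frame-pacing differs in the details):

```python
# Simplified latency model for frame interpolation: to show a frame that
# sits between rendered frames N and N+1, the GPU has to wait for N+1,
# so real frames reach the screen roughly one render interval late.
# Illustrative only; not measurements of any specific implementation.

def added_latency_ms(rendered_fps: float) -> float:
    """Approximate extra display latency from interpolating between frames."""
    return 1000.0 / rendered_fps

for fps in (30, 60, 120):
    print(f"{fps:>3} rendered fps -> ~{added_latency_ms(fps):.1f} ms added latency")
# At 60 rendered fps the interpolated stream shows ~120 fps of motion,
# but your inputs still land on real frames that arrive ~16.7 ms later
# than they would without interpolation.
```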
But then, it will probably also shock and horrify people to learn that I always run with full screen antialiasing turned off. My display is 3840x2160. Trust me, jaggies are of no concern (quick math below).
I have no interest in either of these stupid technologies.
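Quick math on the jaggies point, assuming a typical 27 or 32 inch desktop panel (my assumed sizes, not stated above):

```python
# Rough check on how small the pixels actually are on a 4K desktop panel.
# The 27" and 32" diagonals are assumed example sizes.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for diag in (27.0, 32.0):
    print(f'{diag}" 3840x2160: ~{ppi(3840, 2160, diag):.0f} PPI')
# ~163 PPI at 27" and ~138 PPI at 32" -- pixels small enough at normal
# desktop viewing distance that aliasing is far less visible than at 1080p.
```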
Well, DLSS and FSR are the AI upscaling, whereas frame generation is a different thing. That one I haven’t had great luck with, but DLSS has absolutely been very helpful to me in getting better performance for minimal quality loss.
I’m mainly doing 60 Hz at 1440p and can run things fine for the most part, but graphically intensive games can hitch a little sometimes without it. In Skyrim or Fallout 4, for example, it means the difference between playability with a very large modlist and not.
I don’t really need a lecture on how performance is impacted by hardware and such; I’m quite aware and just have a different opinion and experience with these technologies.
What I will say is that I think developers are relying on it too heavily and not properly optimizing their games, but that’s not really a new phenomenon, just the latest shortcut lol
For the benefit of anyone else reading this, nVidia’s DLSS3 and DLSS4 absolutely do incorporate motion interpolation (i.e. fake frames) via various methods. Fake frame generation can be disabled, at least for now, but that’s really not the point. What’s more to the point is that the only headline capability added to this with the 50 series nVidia cards, for instance, is an even greater depth of fake frame generation. nVidia clearly thinks that their future is in fake frames.
DLSS Super Resolution is the image upscaling scheme, and is now a component of DLSS3/4, but claiming that the current incarnation of DLSS is not intended to generate frames out of whole cloth is inaccurate. nVidia labeling both of these things “DLSS” probably did not do any favors to anyone’s ability to keep track of this ongoing clusterfuck (rough feature map below). If you have a 30 series card or below you are limited to upscaling, but upscaling is not the main thing I’m griping about.
(This is now the case with both AMD’s FSR 3.1 and 2.0 as well, which explicitly mention “temporal upscaling,” i.e. once again fake frames, in their blurbs.)
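A rough feature map of the naming, from memory of NVIDIA’s and AMD’s current branding; treat it as a sketch to keep the terms straight, not an authoritative compatibility matrix:

```python
# Rough map of which marketing names cover upscaling vs. frame generation,
# from memory of current NVIDIA/AMD branding -- a sketch, not a verified
# compatibility matrix.

features = {
    "DLSS Super Resolution":    {"kind": "upscaling",           "hardware": "RTX 20 series and up"},
    "DLSS Frame Generation":    {"kind": "frame interpolation", "hardware": "RTX 40 series and up"},
    "DLSS Multi Frame Gen":     {"kind": "frame interpolation", "hardware": "RTX 50 series"},
    "FSR 2.x":                  {"kind": "temporal upscaling",  "hardware": "vendor-agnostic"},
    "FSR 3.x Frame Generation": {"kind": "frame interpolation", "hardware": "vendor-agnostic"},
}

for name, info in features.items():
    print(f"{name:<26} {info['kind']:<22} {info['hardware']}")
```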
If upscaling in whatever form looks better to you, mind you, I’m not trashing your opinion. To some degree, options exist for a reason. Some motherfuckers play their emulators with SuperEagle scaling enabled, or whatever. I dunno, it takes all kinds. But silicon space that your card’s maker dedicated to AI and upscaling fuckery is also silicon that could have just been allocated to bigger or more rendering pipelines, and that’s exactly what they didn’t do.
But towards your last point, absolutely yes. This is also how raytracing and RTX are being pitched now that the cat is out of the bag that RTX performance is generally trash and it also achieves very little in terms of adding usable gameplay-conveying visual information. “Oh, but now instead of calculating light maps in advance, developers can just have it performed in not-quite-real-time on the GPU [wasting a shitload of silicon and electricity calculating this over and over again when it could have been done just once at the studio]! It’s so much easier!!!”
This is deeply stupid. Miss me with all that shit.
It seems we’ve reached the plateau, finally, where the hardware vendors (or at least nVidia) can’t or won’t bring any new meaningful performance enhancements to the table for whatever reason, so in order to keep the perpetual upgrade treadmill going they’re resorting to bolting gimcrack crap to the hardware to help them cheat instead. Maybe some day actual per-pixel real time raytracing will be viable and for certain applications that could indeed be rad, but trying to force it halfassed now is pretty counterproductive. Ditto with frame generation. I’m sure it has applications in noninteractive media or video rendering, but trying to shoehorn it into gaming doesn’t make any sense.
You still need to keep an eye on your GPU’s power cable.
Great, now I have to stick a webcam in my computer case.
Talk about addressing the symptom and not the problem… 😬
A cooling fan for a power connector!