• dual_sport_dork 🐧🗡️@lemmy.world · 1 day ago

    Yeah, I saw the Gamers Nexus benchmarks on it and for the money I’m really not impressed. With what these cost, a couple of percentage points don’t excite me. I already have a plenty fast graphics card.

    The only thing I’m “missing out” on is nVidia’s attempted near-monopoly on raytracing, which is not a technology in which I’m the slightest bit interested because, gee-whiz factor aside, even on their very fastest flagship new card it still tanks your framerate to an unacceptable level (in my opinion) for no tangible benefit whatsoever.

    The only issue I foresee is upcoming games that “need” RTX, i.e. the current incarnation of the id Tech engine for some batshit reason, but so far the only things that run on it are Doom: The Dark Ages and that Indiana Jones game, neither of which I have any interest in. So fuck it.

    (And I similarly do not care about DLSS or FSR or motion interpolation or any other kinds of fake frames, which are another absolute dumb-shit dead end.)

    • Jakeroxs@sh.itjust.works · 1 day ago

      DLSS and FSR are actually really good 🤷‍♂️ basically free frames for very little quality degradation; they can seriously make a massive difference in playability.

      • dual_sport_dork 🐧🗡️@lemmy.world · 23 hours ago

        The frames they generate are not “free,” nor are they necessarily accurate. If your GPU can’t fill your screen at native resolution at a rate matching or exceeding your display’s refresh rate, you either need to turn down your graphics settings or invest in a beefier GPU, not make up fake image data to go in between (introducing input lag) or around it (by deliberately rendering at suboptimal resolution and attempting to AI-upscale the result). And attempting to exceed your display’s refresh rate by making up additional fake frames is literally pointless; that’s just setting electricity on fire for no benefit.
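        To put rough numbers on the input-lag point, here’s a back-of-the-envelope sketch. The hold-back-one-frame assumption is my simplification, not any vendor’s spec; real pipelines vary:

        ```python
        # Back-of-the-envelope: why interpolated frames add input lag.
        # Simplifying assumption (mine, not a vendor spec): the interpolator
        # must hold back the newest real frame so it can blend between the
        # last two real frames before displaying anything.

        def added_latency_ms(real_fps: float) -> float:
            """Minimum extra latency from holding back one real frame."""
            return 1000.0 / real_fps  # one full real-frame interval

        for fps in (30, 60, 120):
            print(f"{fps:>3} real FPS -> at least {added_latency_ms(fps):.1f} ms added lag")

        # Output:
        #  30 real FPS -> at least 33.3 ms added lag
        #  60 real FPS -> at least 16.7 ms added lag
        # 120 real FPS -> at least 8.3 ms added lag
        ```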

        But then, it will probably shock and horrify people to learn that I also always run with full-screen antialiasing turned off. My display is 3840x2160. Trust me, jaggies are of no concern.
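        For what it’s worth, the pixel-density arithmetic behind that claim looks like this (the 27-inch panel size is an assumption for illustration; I only gave the resolution):

        ```python
        # Rough pixel-density math for why jaggies stop mattering at 4K.
        # The 27-inch diagonal is an assumed panel size for illustration.
        import math

        width_px, height_px = 3840, 2160
        diagonal_in = 27.0  # assumption

        ppi = math.hypot(width_px, height_px) / diagonal_in
        pixel_pitch_mm = 25.4 / ppi  # 25.4 mm per inch

        print(f"~{ppi:.0f} PPI, ~{pixel_pitch_mm:.3f} mm per pixel")
        # ~163 PPI, ~0.156 mm per pixel: individual stair-steps are close
        # to the limit of what you can resolve at desktop viewing distance.
        ```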

        I have no interest in either of these stupid technologies.

        • Jakeroxs@sh.itjust.works · 23 hours ago

          Well, DLSS and FSR are the AI upscaling, whereas frame generation is a different thing. I haven’t had great luck with that one, but DLSS has absolutely been very helpful to me in getting better performance for minimal quality loss.

          I’m mainly doing 60 Hz at 1440p and can run things fine for the most part, but graphically intensive games can hitch a little sometimes without it. In Skyrim or Fallout 4, for example, it means the difference between a very large modlist being playable or not.
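          To put numbers on the upscaling half at my resolution, here’s a quick sketch. The per-axis scale factors are the commonly cited DLSS preset values; treat them as approximate, not official:

          ```python
          # What DLSS-style upscaling actually renders at 2560x1440 output.
          # Per-axis scale factors are the commonly cited preset values;
          # treat them as approximate rather than official numbers.

          OUTPUT_W, OUTPUT_H = 2560, 1440

          PRESETS = {
              "Quality": 0.667,
              "Balanced": 0.58,
              "Performance": 0.50,
              "Ultra Performance": 0.333,
          }

          for name, scale in PRESETS.items():
              w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
              shaded = (w * h) / (OUTPUT_W * OUTPUT_H)
              print(f"{name:>17}: renders {w}x{h} (~{shaded:.0%} of native pixels)")

          # Output:
          #           Quality: renders 1708x960 (~44% of native pixels)
          #          Balanced: renders 1485x835 (~34% of native pixels)
          #       Performance: renders 1280x720 (~25% of native pixels)
          # Ultra Performance: renders 852x480 (~11% of native pixels)
          ```

          The upscaler then reconstructs the missing pixels, which is where the performance headroom comes from.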

          I don’t really need a lecture on how performance is impacted by hardware and such; I’m quite aware, and just have a different opinion of and experience with these technologies.

          What I will say is that I think developers are relying on it too heavily and not properly optimizing their games, but that’s not really a new phenomenon, just the latest shortcut lol

          • dual_sport_dork 🐧🗡️@lemmy.world · 21 hours ago

            For the benefit of anyone else reading this, nVidia’s DLSS 3 and DLSS 4 absolutely do incorporate motion interpolation (i.e. fake frames) via various methods. Fake frame generation can be disabled, at least for now, but that’s really not the point. What’s more to the point is that the only headline capability the 50-series nVidia cards add, for instance, is an even greater depth of fake frame generation. nVidia clearly thinks that their future is in fake frames.
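            To illustrate what “greater depth” means in practice, here’s a sketch of the arithmetic only, not nVidia’s actual pipeline:

            ```python
            # Why multi-frame generation inflates the FPS counter without
            # making the game more responsive. Arithmetic sketch only,
            # not nVidia's actual pipeline.

            def displayed_fps(rendered_fps: float, generated_per_real: int) -> float:
                """Frames shown per second: real frames plus generated ones."""
                return rendered_fps * (1 + generated_per_real)

            rendered = 40  # real frames the GPU actually renders per second

            for gen in (0, 1, 3):  # 0 = off, 1 = DLSS3-style 2x, 3 = DLSS4-style 4x
                print(f"{gen} fake frames per real one: "
                      f"{displayed_fps(rendered, gen):.0f} FPS shown, "
                      f"input still sampled at {rendered} Hz")

            # 0 fake frames per real one: 40 FPS shown, input still sampled at 40 Hz
            # 1 fake frames per real one: 80 FPS shown, input still sampled at 40 Hz
            # 3 fake frames per real one: 160 FPS shown, input still sampled at 40 Hz
            ```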

            DLSS Super Resolution is the image-upscaling scheme, and is now a component of DLSS 3/4, but claiming that the current incarnation of DLSS is not intended to generate frames out of whole cloth is inaccurate. nVidia labeling both of these things “DLSS” probably did no one’s ability to keep track of this ongoing clusterfuck any favors. If you have a 30-series card or below you are limited to upscaling, but upscaling is not the main thing I’m griping about.

            (This is also now the case with both AMD’s FSR 3.1 and 2.0, whose blurbs explicitly mention “temporal upscaling,” i.e. once again fake frames.)

            If upscaling in whatever form looks better to you, mind you, I’m not trashing your opinion. To some degree, options exist for a reason. Some motherfuckers play their emulators with SuperEagle scaling enabled, or whatever. I dunno, it takes all kinds. But silicon space that your card’s maker dedicated to AI and upscaling fuckery is also silicon that could have just been allocated to bigger or more rendering pipelines instead, and that’s exactly what they didn’t do.

            But towards your last point, absolutely yes. This is also how raytracing and RTX are being pitched now that the cat is out of the bag that RTX performance is generally trash and that it achieves very little in terms of adding usable, gameplay-conveying visual information. “Oh, but now instead of calculating light maps in advance, developers can just have it performed in not-quite-real-time on the GPU [wasting a shitload of silicon and electricity calculating this over and over again when it could have been done just once at the studio]! It’s so much easier!!!”
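            To put that waste argument in rough numbers: every cost below is invented for illustration; the point is only that a one-time cost beats a recurring per-frame one:

            ```python
            # Baked lightmaps vs. real-time raytracing as an energy argument.
            # Every number here is hypothetical; the shape of the comparison
            # is the point: a one-time bake vs. a per-frame recurring cost.

            bake_once_gpu_hours = 10.0        # hypothetical one-time bake at the studio
            rt_gpu_seconds_per_frame = 0.004  # hypothetical per-frame RT overhead
            fps = 60
            hours_played = 100                # one player, one playthrough

            # GPU-seconds of RT work per wall-clock second of play:
            rt_load = rt_gpu_seconds_per_frame * fps  # 0.24
            # Multiplying by hours played yields GPU-hours directly, since
            # the seconds-per-hour factors cancel.
            per_player_gpu_hours = rt_load * hours_played

            print(f"bake once:  {bake_once_gpu_hours:.0f} GPU-hours, total, ever")
            print(f"trace live: {per_player_gpu_hours:.0f} GPU-hours *per player*")
            # bake once:  10 GPU-hours, total, ever
            # trace live: 24 GPU-hours *per player*
            ```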

            This is deeply stupid. Miss me with all that shit.

            It seems we’ve finally reached the plateau where the hardware vendors (or at least nVidia) can’t or won’t bring any meaningful new performance enhancements to the table, so in order to keep the perpetual upgrade treadmill going they’re resorting to bolting gimcrack crap onto the hardware to help them cheat instead. Maybe some day actual per-pixel real-time raytracing will be viable, and for certain applications that could indeed be rad, but trying to force it half-assed now is pretty counterproductive. Ditto for frame generation: I’m sure it has applications in noninteractive media or video rendering, but trying to shoehorn it into gaming doesn’t make any sense.