RAMs and SSDs died for this

  • Conselheiro@lemmygrad.ml · 17 up · 3 days ago

    Apparently it takes a whole separate high-end GPU just to run this filter. The lipstick on the female character is the icing on the cake; it’s like the AI version of those “Aloy with Makeup” Horizon mods.

  • ghost_of_faso3@lemmygrad.ml · 6 up / 4 down · 3 days ago

    I think it kinda looks cool actually, the Hogwarts one was the most impressive to me. Will say though, 0 chance I’ll ever be able to run this without investing like 10k+ into top-line NVIDIA bullshit.

  • CriticalResist8@lemmygrad.ml · 7 up / 7 down · 3 days ago

    I expect it’ll look different by the time it rolls around in 6 months. I don’t get why the internet is so up in arms about this except that it’s just more AI kneejerk reaction. Like yeah, it’s not “the best”, but it’s also a cheap way to get more mileage out of your hardware - DLSS and AMD FSR are already in video games and they do get you more FPS, though you need an RTX card for DLSS and it looks terrible in some games.
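
    To give a rough sense of why upscalers get you more FPS, here’s a back-of-the-envelope sketch - nothing specific to Nvidia’s implementation, and the internal resolution is an assumption based on the commonly cited DLSS “Quality” scale of roughly 2/3 per axis:

    ```python
    # Shading cost scales roughly with the number of pixels actually rendered
    # before upscaling, so rendering internally at a lower resolution and then
    # upscaling to the output resolution cuts most of the per-frame pixel work.
    native = 3840 * 2160      # 4K output resolution
    internal = 2560 * 1440    # assumed internal render resolution (~2/3 per axis)

    print(f"internal render covers {internal / native:.0%} of the native pixel work")
    # -> internal render covers 44% of the native pixel work
    ```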

    The examples they showed might not be the best, but I think during gameplay you won’t notice it, and I expect devs will be able to tweak it to their game so that it retains the art direction they want to convey. I have to assume the examples were filtered by Nvidia themselves and they did whatever they felt was best. I could definitely see losing some atmosphere in the Resident Evil clip they showed, but it’s a very short clip too.

    • loathsome dongeater@lemmygrad.ml (OPM) · 2 up · edited · 1 day ago

      I am extremely skeptical about the purported flexibility implied in the pinned YouTube comment, seeing how there is none in the showcase. Even if this aesthetic is assumed to be desirable, all things considered photorealism is a statistically niche art style that is extremely expensive to implement. Most games just don’t go for it. Here are the top 20 PC games from last year according to Metacritic. There are at most five games here that would benefit from a filter like this, and I am being extremely generous:

      #1: Hades II
      #2: Blue Prince
      #3: Clair Obscur: Expedition 33*
      #4: Hollow Knight: Silksong
      #5: Split Fiction
      #6: Final Fantasy VII Rebirth
      #7: The Last of Us Part II Remastered*
      #8: Trails in the Sky 1st Chapter
      #9: despelote
      #10: The Talos Principle: Reawakened*
      #11: The Seance of Blake Manor
      #12: Kingdom Come: Deliverance II*
      #13: Monster Hunter Wilds*
      #14: Dragon Quest I & II HD-2D Remake
      #15: FINAL FANTASY TACTICS - The Ivalice Chronicles
      #16: Monster Train 2
      #17: Pipistrello and the Cursed Yoyo
      #18: Lumines Arise
      #19: Dispatch
      #20: Bionic Bay
      

      Blasting up resolution and frame rates with AI, on the other hand, is desirable across the board. This feature is not comparable to them.

      • CriticalResist8@lemmygrad.ml · 2 up / 1 down · edited · 24 hours ago

        Okay, I ended up doing a bunch of research and writing and rewriting this comment a few times lol but I think we got to the bottom of it.

        In the final analysis, I think what dlss 5 shows is that nvidia is betting on moving away from ‘traditional’ GPUs to tensor-architecture processing units, made especially for running AI models.

        So instead of rendering the ray tracing, subsurface scattering, hair physics etc. directly on the GPU all at once, they would have an AI model running on a TPU (Tensor Processing Unit) render those effects onto the frame/geometry. This would give some breathing room back in computing power, if it pans out.
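
        To make the split concrete, here’s a minimal sketch of the kind of pipeline being described, in PyTorch-style Python. Everything here is an assumption for illustration - the NeuralEnhancer model, the buffer names, and the two-device split are made up, not Nvidia’s actual architecture:

        ```python
        import torch

        class NeuralEnhancer(torch.nn.Module):
            """Stand-in image-to-image model that redraws expensive effects."""
            def __init__(self):
                super().__init__()
                # 7 input channels: RGB colour + 1-channel depth + 3-channel normals
                self.net = torch.nn.Sequential(
                    torch.nn.Conv2d(7, 32, 3, padding=1),
                    torch.nn.ReLU(),
                    torch.nn.Conv2d(32, 3, 3, padding=1),
                )

            def forward(self, colour, depth, normals):
                x = torch.cat([colour, depth, normals], dim=1)
                return self.net(x)  # enhanced RGB frame

        # One device rasterises the cheap base frame, the other runs the AI model.
        raster_device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
        ai_device = torch.device("cuda:1" if torch.cuda.device_count() > 1 else raster_device)

        model = NeuralEnhancer().to(ai_device).eval()

        def present_frame(colour, depth, normals):
            """Cheap G-buffers in, neurally 'finished' frame out."""
            with torch.no_grad():
                enhanced = model(colour.to(ai_device),
                                 depth.to(ai_device),
                                 normals.to(ai_device))
            return enhanced.to(raster_device)
        ```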

        This would cement their status as a monopoly or near-monopoly in TPUs, but it would also bypass the bottleneck of current tech that isn’t scalable indefinitely. The 5090 is already pushing manufacturing capabilities. The new dlss does help performance, especially on TPUs if they go that way, but even on the GPU when compared to the same ‘native’ options.

        This could work, but it’s very early work. How this will pan out in practice is still anyone’s guess; it’s too early to be sure that we’ll just have to settle for TPUs and ‘slop’.

        Keep in mind nvidia is already the leader in GPUs, and games are made to their specifications for their hardware (for the most part). They’re the ones who released PhysX, and then shelved it. They’re the ones who make ray tracing and HDR that people can’t run, and that’s tech we already have. So I don’t necessarily see the move to TPUs, if it even ends up happening, as wholly different from what we’ve been living with for 20+ years.

        In my opinion this showcase was more of a developers’ demo, though it seems I was right that nvidia engineers used the model on the games without going through the devs - the artists who worked on some of those games were surprised that their game was in the video. The engineers used an aggressive setting; capable devs could instead use it sparingly or the way they want for their art direction, and others won’t care and will just press the ‘enable everything’ button.

        However, the fact remains that most people won’t be able to run it. They have announced that dlss 5 will initially run on a single 5090, which is just out of most people’s price range, and even if it were affordable, not everyone could use it. So they’re sowing the seeds now, knowing that devs will end up using the SDK and thus get ‘locked’ into using nvidia - like they’ve been doing for the past 20 years, of course.

        If this pans out for nvidia, hardware requirements will go down, and along with that the model will get expanded to allow for more use cases, like LoRAs or fine-tuning on the devs’ part to get it to look the way they want. In general with AI it’s the same situation as Photoshop in its time - people don’t know how to approach it at first and think it’s taking away their intent, then they get comfortable with it and find ways they can still show intent even with different tools. Companies started making digital drawing surfaces etc. It’ll be similar here, but in the final analysis what we see is capitalism doing its monopoly thing. I know it’s a duh moment lol, but it’s interesting seeing it play out perfectly from just a showcase video.
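
        For what a LoRA actually is, here’s a minimal, generic sketch - nothing Nvidia has announced, just the standard low-rank-adapter idea applied to a single linear layer of a frozen base model for illustration:

        ```python
        import torch

        class LoRALinear(torch.nn.Module):
            """Small low-rank update trained on top of a frozen base layer."""
            def __init__(self, base: torch.nn.Linear, rank: int = 8):
                super().__init__()
                self.base = base
                for p in self.base.parameters():
                    p.requires_grad = False          # base weights stay frozen
                self.down = torch.nn.Linear(base.in_features, rank, bias=False)
                self.up = torch.nn.Linear(rank, base.out_features, bias=False)
                torch.nn.init.zeros_(self.up.weight)  # adapter starts as a no-op

            def forward(self, x):
                # Base output plus the cheap, trainable low-rank correction.
                return self.base(x) + self.up(self.down(x))
        ```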

        What will likely end up happening in the short term is studios will use ‘captured in-game (*with dlss 5)’ disclaimers in their trailers, and they will include it in the game, but just like motion blur it’s something people won’t turn on - not that they could for at least a good few years lol. From my research I found that graphics are a big selling point, even when people won’t be able to run the game at max settings. Of course we knew graphics sell, but it’s interesting that it doesn’t seem to matter whether the customer will be able to run the graphics - they just like that it looks good, even if they know they won’t get these graphics out of it.

        tl;dr: new paradigm shift that shakes up the market :lenin-pointing:. But yeah, the big takeaway is a complete shift from GPUs to TPUs, with all that entails.

    • PostingInternational@lemmygrad.ml · 12 up · 3 days ago

      Nothing “cheap” about this mileage.

      They were using two(!) RTX 5090s, literally having one of them just for AI rendering.

      Furthermore, these AI (let’s call them) “improvements” come at the cost of destroying the consumer PC market with skyrocketing prices. That’s the very market that is supposed to be using this.

      • CriticalResist8@lemmygrad.ml · 2 up / 6 down · 2 days ago

        I have to assume this was for the showcase video; DLSS offers options and devs can tailor it to their game - if they use it correctly, hopefully. It’s way too easy to just say “Unreal Engine will take care of it, just ship” nowadays, which is a problem in software in general but not entirely nvidia’s.

        It’s not a great showcase video, but having had DLSS on some games, it does make a pretty good difference. I tried it with a game I own just now and I go from 40 FPS on basic antialiasing to 50 with DLSS to 90 with frame generation enabled. Quality is just as good as without DLSS, just some flickering in this game on the snow, but most of all it doesn’t make distant objects blurry like ‘traditional’ antialiasing does.
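
        Just to put those numbers in frame-time terms (simple arithmetic on the figures above; the caveat about generated frames is the usual one for frame generation in general, not something measured in this game):

        ```python
        # Frame time in milliseconds is 1000 / FPS.
        for label, fps in [("basic AA", 40), ("DLSS", 50), ("DLSS + frame gen", 90)]:
            print(f"{label:>18}: {fps} FPS -> {1000 / fps:.1f} ms per frame")
        # basic AA: 25.0 ms, DLSS: 20.0 ms, frame gen: ~11.1 ms per frame.
        # Frame generation inserts AI-generated frames between rendered ones, so
        # smoothness goes up but the underlying simulation/input rate stays
        # closer to the DLSS-only figure.
        ```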

        A lot of it has to do with how it’s implemented, and in the comments nvidia said themselves that devs will be able to tailor it and choose how the filter applies and where. I believe you that they used two 5090s for this showcase, but I have to assume it’s because they tried to get the most out of it and didn’t even seem to get the game devs involved in deciding on the filter. We also won’t necessarily have to turn it on and use it.

        It’s not a great showcase video lol, I agree, it just raises a lot more questions. And it raises bigger questions too, like why we ‘need’ games to look as photorealistic as possible when they are not movies nor real life, and elements in the scene communicate things to the player, because it’s a game. But stuff like frame generation does have a positive impact and it works great, so I really can’t find any problem with having this available in games now.

        To the point of GPUs being out of price (and out of stock), at the core it has to do with Project Stargate and the half trillion dollars US tech companies are receiving from it. OpenAI bought up 40% of the world’s supply of wafers, which are a precursor component in memory (GPUs, RAM, SSDs etc. for consumers), not because they needed them but because they didn’t want the competition to have them. They don’t even do anything with the wafers themselves; what they need is the working memory.

        nvidia participates in this scheme of course (they’ve announced a shift away from consumer GPUs), so it’s fair to ask who dlss 5 even is for when nobody will be able to use it for years to come. But my broader point about the internet response to this video is that it’s not the mystical “AI” pulling the strings from behind the scenes. I have no doubt we both agree that components are about more than just video games and it’s important to have consumer components, but the gamer reaction is predictably “how will this affect my treats” without ever asking how much their hobby costs in terms of electricity and computing power. Suddenly, when the AI buzzword is there, they are very concerned about the environment (and about graphics looking bad, as if they’ll be forced to turn dlss 5 on for their game, as if the market hasn’t been chasing the dragon of hyperrealistic graphics for decades, and as if consumers at large don’t base their purchase decisions on graphics).