Windows 11 often requires new hardware. But for a while, that hardware will be either extremely pricey or ship with very little RAM.

I don't believe a single competent person still works at Micro$oft, but maybe this could lead them to make a less shitty OS?

And garbage software like Adobe Creative Cloud too?

They obviously don't care about users, but the pain could become too big to ignore.

  • mycodesucks@lemmy.world · 4 points · 1 hour ago

    It’s a really nice idea, but bad developers are already so deep in the sunk cost fallacy that they’ll likely just double down.

    Nobody reassesses their dogma just because the justification for it is no longer valid. That’s not how people work.

  • kboos1@lemmy.world · 11 points · 4 hours ago

    The “shortage” is temporary and artificial, so that’s a hard no. The RAM shortage presents no incentive to make apps more efficient, because the hardware and software already in people’s homes won’t be affected, and neither will the people currently using that software. The very small percentage of people who will be affected by a temporary shortage wouldn’t justify changing software that is currently in development.

    There’s no incentive for software companies to make their code more efficient until people stop using their software, so stop using it and it will get better. As an example, Adobe Reader is crap, just straight-up garbage, but people still use it, so the app stopped getting improvements many years ago. Then Adobe moved to a subscription-based system and a cloud service for selling your data, but guess what: it’s still the same app it was 10 years ago, just more expensive.

    • SkyNTP@lemmy.ml · +2/-1 · 1 hour ago

      What crystal ball told you this was temporary? Every day for the past few years, the consumer market has moved further and further toward serving only the wealthy. The people in power don’t care about selling RAM or other scraps to peasants.

      • 4am@lemmy.zip · 1 point · 54 minutes ago

        Downvoted by libs with their collective heads in the sand.

        It might not wind up working, but Altman, Nadella, et al. are trying to push all consumers to forever rent compute from them.

        They do not want you to be able to run your own DeepSeek at home. They do not want you to control the hub of your smart home. They want to know what’s in the spreadsheet you saved, what’s in the business plan you typed up, and what the password is to any E2EE service you have an account with.

        They want to forecast you like the weather.

  • CMDR_Horn@lemmy.world · 112 points · 7 hours ago

    Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.

    • Riskable@programming.dev · +2/-2 · 3 hours ago

      Big AI is a bubble, but AI in general is not.

      If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.

      I suspect that as more software gets AI-assisted development, we’ll actually see less efficient software at first, but eventually more efficient software as adoption of AI coding assistance matures (and probably becomes more formalized/automated).

      I say this because of experience: If you ask an LLM to write something for you it often does a terrible job with efficiency. However, if you ask it to analyze an existing code base to make it more efficient, it often does a great job. The dichotomy is due to the nature of AI prompting: It works best if you only give it one thing to do at a time.

      In theory, if AI code assist becomes more mature and formalized, the “optimize this” step will likely be built-in, rather than something the developer has to ask for after the fact.

  • mushroommunk@lemmy.today · 71 points · edited · 7 hours ago

    It’s not just garbage software. So many programs are just Electron apps, which is about the most inefficient way of making them. If we could start actually making programs again, instead of shipping a webpage and a browser bundled together, you’d see resource usage plummet.

    In the gaming space, even before the RAM shortage, I’d seen more developers doing optimization work again thanks to the prevalence of the Steam Deck and such, so the precedent is there, and I’m hopeful other developers start considering lower-end hardware.

    • Suburbanl3g3nd@lemmings.world · +10/-2 · 5 hours ago

      Probably a super unpopular take, but by sheer volume of consoles sold, the Switch and Switch 2 have done more for game optimization than the Steam Deck ever could. I agree the Steam Deck pushed things further, but the catalyst is the Switch/Switch 2.

      • CountVon@sh.itjust.works · +2/-1 · 1 hour ago

        So the developers of PC games like Clair Obscur: Expedition 33, which doesn’t have a Switch version of any kind, spent time, effort, and money to optimize specifically for the Steam Deck… because of the Switch’s market share? C’mon now bud, that’s a straight-up ridiculous take.

      • XeroxCool@lemmy.world · 5 points · 5 hours ago

        I take it the Switch/S2 has many non-Nintendo games shared with other consoles? Hard to search through 4,000 titles on Wikipedia to find them at random, but I did see they had one Assassin’s Creed (Odyssey) at the game’s launch. I never really had Nintendo systems and just associate them with exclusive Nintendo games.

        I’m choosing to believe the Steam Machine will do more of the same for PC games. Maybe it won’t force optimization at launch, but I hope it maintains itself as a benchmark for builds and provides demand for optimization to a certain spec.

        • FoxyFerengi@startrek.website · 2 points · 3 hours ago

          I only own one Nintendo game on my Switch. I’m not going to sit here and pretend most of my games run great on it, though. Slay the Spire and Stardew run well, but I’ve had quite a few crashes with Civilization and some hangs with Hades and Hollow Knight, too.

        • mushroommunk@lemmy.today · 2 points · 3 hours ago

          I try to follow the gaming space, and I didn’t really see anyone talk about optimization until the Steam Deck grew. I do wish more companies were open about their development process so we actually had some data. The Switch/Switch 2 very well could have pushed it, but I think with those consoles people just accept that they might not get all the full modern AAA games; they’re getting Pokémon and Mario and such. Whereas with the Steam Deck, they want everything in their Steam library. I dunno.

          I have no real data, just what I’ve seen people discussing.

    • Brkdncr@lemmy.world · +7/-3 · 5 hours ago

      Web apps are a godsend and probably the most important innovation to help move people off of Windows.

      I would prefer improvements to web apps and Electron/WebView2 if I had to pick.

      • bufalo1973@piefed.social · 8 points · 5 hours ago

        If those web apps all shared the same Electron backend, then they could be “a godsend.” But each of those web apps ships its own Electron backend.

        • Brkdncr@lemmy.world · 3 points · 5 hours ago

          The beauty of it is that Electron/WebView2 will probably get improved, and you don’t need to fix the apps.

          • bufalo1973@piefed.social · 2 points · 3 hours ago

            I don’t disagree with that. But the problem is having one Electron backend per web app rather than one backend shared by all web apps.

    • PeriodicallyPedantic@lemmy.ca · 1 point · 5 hours ago

      Idk, I don’t think the issue is Electron apps using 100 MB instead of 10 MB. The kind of apps you write as HTML/JS are almost always inherently low-demand, so even 10x-ing their resource usage doesn’t really cause a problem, since you’re not typically doing other things at the same time.

      The issue is the kind of apps that require huge system resources inherently (like graphically intensive games or research tools), or services that run in the background (because you’ll have a lot of them running at the same time).

      • mushroommunk@lemmy.today · 2 points · 3 hours ago

        You’re off by a large margin. I’ll use two well-documented examples.

        Native WhatsApp used about 300 MB with large chats. CPU usage stayed relatively low and constant. It wasn’t great, but that’s a separate issue. The new WebView2 version hits over a gigabyte and spikes the CPU more than some of my games.

        Discord starts at 1 GB of memory usage and exceeds 4 GB during normal use. That’s straight from the developers. It’s so bad they have started rolling out an experimental update that makes the app restart itself when it hits 4 GB.

        These are just two Electron apps meant mostly for chatting, and together they can eat up to 5 GB. Electron and WebView2 both spin up full Node.js runtimes and multiple JavaScript heaps, plus whatever GPU threads they run, and they’re exceedingly bad at releasing resources. That’s exactly why they are the problem. Yes, the actual JavaScript bundles Discord and WhatsApp ship are probably relatively small, but you get full Chromium browsers, with all of their memory-usage issues, stacked on top.

        • PeriodicallyPedantic@lemmy.ca · 1 point · 6 minutes ago

          Right. But those are only problems because they use resources in the background. When the foreground app uses a lot of resources, it’s not a problem, because you only have one foreground app at a time (I know, not really, but kinda). Most apps don’t need to run in the background.

  • ChillPC@programming.dev · +26/-1 · 7 hours ago

    You fool, humans are flexible enough to get used to slow experiences. Even if the average user needs Discord, Slack, 100 Chrome tabs, Word, and every other Electron app open simultaneously, they will just get on with their work. They may not be happy about it, but they’ll continue without changing their habits.

    But to be honest, I goddamn hope you are right!

    • atro_city@fedia.io · 5 points · 6 hours ago

      Why do you believe so? Do you believe software developers earn too much to care about RAM prices and will continue to write software that requires more RAM than the rest of the world can afford?

      • drcobaltjedi@programming.dev · 4 points · 4 hours ago

        As a software dev: there’s a lot of stuff that’s just bloat now. Electron apps are really easy for web devs to make pretty and portable, but each one is literally an instance of a Chrome browser. There are still a lot of devs who care (to some degree) about performance and are willing to trim fat or take small shortcuts where viable.

        However, there’s also the issue of management. I was once tasked with a problem at work involving the traveling salesman problem. I managed to make a very quick solution that worked fairly well, but it always left one point for last that probably should have been visited around third. Anyway, it was quick and mostly accurate, but my boss told me to “fix it,” and despite my explanation that he was asking me to solve an unsolved math problem, he persisted. I’m now ashamed of how slow that operation is, since instead of just finding the nearest point it now has to look ahead a few steps to see which path is shorter.
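
        For anyone curious, here’s roughly what that change looks like; a minimal sketch under assumed details (2D points, Euclidean distance, made-up names), not the actual code from work:

        ```rust
        // Greedy nearest-neighbor tour with an optional k-step lookahead.
        #[derive(Clone, Copy)]
        struct Point {
            x: f64,
            y: f64,
        }

        fn dist(a: Point, b: Point) -> f64 {
            ((a.x - b.x).powi(2) + (a.y - b.y).powi(2)).sqrt()
        }

        // Cheapest total length reachable from `from` in `depth` more steps,
        // over the points still in `remaining`. Exponential in `depth`, which
        // is exactly why the "fixed" version got slow.
        fn lookahead_cost(from: Point, remaining: &[Point], depth: usize) -> f64 {
            if depth == 0 || remaining.is_empty() {
                return 0.0;
            }
            let mut best = f64::INFINITY;
            for i in 0..remaining.len() {
                let mut rest = remaining.to_vec();
                let p = rest.swap_remove(i);
                let cost = dist(from, p) + lookahead_cost(p, &rest, depth - 1);
                best = best.min(cost);
            }
            best
        }

        // depth = 0 is the original fast version: just hop to the nearest
        // point, which is what kept saving one point for last.
        fn greedy_tour(start: Point, mut remaining: Vec<Point>, depth: usize) -> Vec<Point> {
            let mut tour = vec![start];
            let mut current = start;
            while !remaining.is_empty() {
                let mut best = (0, f64::INFINITY);
                for i in 0..remaining.len() {
                    let mut rest = remaining.clone();
                    let p = rest.swap_remove(i);
                    let cost = dist(current, p) + lookahead_cost(p, &rest, depth);
                    if cost < best.1 {
                        best = (i, cost);
                    }
                }
                current = remaining.swap_remove(best.0);
                tour.push(current);
            }
            tour
        }

        fn main() {
            let points = vec![
                Point { x: 1.0, y: 0.0 },
                Point { x: 2.0, y: 0.5 },
                Point { x: 5.0, y: 0.0 },
                Point { x: 0.0, y: 2.0 },
            ];
            let tour = greedy_tour(Point { x: 0.0, y: 0.0 }, points, 2);
            for p in &tour {
                println!("({:.1}, {:.1})", p.x, p.y);
            }
        }
        ```

        Each extra step of lookahead multiplies the per-hop work by the number of remaining points, so even a couple of steps makes the “fixed” version dramatically slower than plain nearest-neighbor.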

      • CarbonatedPastaSauce@lemmy.world · 11 points · 6 hours ago

        Because that kind of shift in mindset (going backwards, basically) will require far more pressure than a 1-2 year RAM shortage.

        Enterprise developers are basically unaffected by this. And anyone writing software for mom & pop was already targeting 8 GB, because that’s what Office Depot is selling them.

        This mostly hurts the enthusiast parts of tech. Most people won’t notice, because they don’t know the difference between 8, 16, or over 9000 GB of RAM. I’ve had this discussion with “users” so many times when they ask for PC recommendations, and they just don’t really get it, or care.

      • bufalo1973@piefed.social · 1 point · 5 hours ago

        I remember a PC programmer in the ’80s who wrote his programs in GWBASIC. When I asked why he used that instead of a better language that could produce faster and smaller programs, his answer was: “If this doesn’t run fast enough on the client’s PC, then the client will buy a better PC.” That’s the mindset: “it’s not my problem once I sell it.”

  • vala@lemmy.dbzer0.com · 2 points · 3 hours ago

    Sometimes I also think there’s no one competent left at Microsoft, but they still have their flight sim team, so I guess that’s something.

    • treadful@lemmy.zip · 1 point · 3 hours ago

      Isn’t Microsoft just the publisher? Also, there are so many problems with MSFS.

      • badgermurphy@lemmy.world · 8 points · 6 hours ago

        For the most part, the answer seems to be yes. Some products also shipped with missing or reduced feature sets for a time.

      • magic_lobster_party@fedia.io · 6 points · 6 hours ago

        Dealing with memory usage will likely require significant rewrites and architectural changes. It will take years.

        The “memory optimizations” we’ll see will be the removal of features while charging the same price. Software shrinkflation. It will require the same amount of memory, though.

  • Tehhund@lemmy.world · 3 points · 4 hours ago

    Misaligned incentives. The people making bloated software are not the people buying the RAM. In theory, the people buying the RAM are the same people buying the software, so they might pressure the people making the software to make it more efficient, but that’s a very loose feedback loop and I wouldn’t hold my breath.

  • bklyn@piefed.social · +13/-2 · edited · 6 hours ago

    I opened Photoshop and left it open with no document, just the main window. It started at 11 GB of RAM and went up to 28 GB without me doing anything.

    If there were anything as good as Photoshop, I’d have switched years ago. But I’ve tried the alternatives, and there’s just nothing like it. Same for InDesign. Affinity Photo is really, really close, but it’s just not the same.

    I’ve been using Photoshop for over 30 years. Even when the time comes, making the switch will be very difficult.

    Edit: I just tried opening PS again and letting it sit. It’s hovering around 3-3.5 GB of RAM usage. I think that last attempt was a fluke.

    • wltr@discuss.tchncs.de · 5 points · 6 hours ago

      Is there something specific you do? I hated GIMP for an eternity, but the PhotoGimp plugin paired with the newest GIMP (v3) isn’t that bad. I do enjoy the experience, mostly. Perhaps I’ve just gotten used to it, but it’s quite usable for me.

      • bklyn@piefed.social · 4 points · 6 hours ago

        I do a few different types of graphic design and photo editing for different clients, but it’s more about the workflows I’ve worked out in PS. GIMP does things very differently, and I think that’s why a lot of PS users hate it. I sure do. It took me years to master PS; I don’t want to have to go through all of that again, and I certainly don’t have the time.

        I will check out the plugin you mentioned. I’ve heard of it, but didn’t think much of it before.

        • wltr@discuss.tchncs.de · 5 points · 6 hours ago

          I think it’s much the same for me; I absolutely hated GIMP for many years. The plugin doesn’t change much, so don’t expect miracles, but it helps me tolerate the pain of an absolutely awful UX. You still need to relearn, though. Also, check out Krita: it’s good, and similar to Photoshop in many regards. All I do is edit some images for web development, so it’s not much, and that’s how I can tolerate GIMP.

          • bklyn@piefed.social · 3 points · 6 hours ago

            I actually have Krita. I use it for creating digital art with my tablet. I don’t really use it for editing/compositing like PS.

    • JohnnyEnzyme@piefed.social · 4 points · 6 hours ago

      Just curious-- what would you say are the main ways in which modern GIMP doesn’t live up to PS?

      @pantherina@feddit.org,

      Windows 11 often requires new hardware.

      This was true in my case, but it was also true that I’d been using a 10-year-old machine, which is pretty ridiculous. Win10 was creaking along, and Firefox wasn’t helping. So, ahead of the deadline, I got myself a ~US$350 mini-computer with a modern AMD processor and 16 GB of RAM. It’s been flying.

      So it was a comparatively tiny investment to stay with a modern machine, and also helpful in maintaining a chain of redundancy (i.e., if this one has a problem for whatever reason, I have a temporary backup machine). So in a way, the Win11 jump actually helped me out a lot.

      Checking just now, the computer has gone up US$50 since then.

      • Miles O'Brien@startrek.website · 3 points · 4 hours ago

        I started out using PS, but when they moved to a subscription model, I switched to GIMP.

        I can’t stand trying to use Photoshop anymore, and while I would love the user experience to be improved and the interface to be a little more intuitive, I’ve never been unable to do something I needed to do in GIMP.

        Maybe my needs are simpler than a lot of people’s here. I’m definitely not a photographer, so if you’re using it often enough, I suppose Photoshop could be the better fit.

        • JohnnyEnzyme@piefed.social · 1 point · 4 hours ago

          Similar situation here, Chief, although I got out of PS well before the absurd subscription hurdle.

          It was definitely a powerhouse, but GIMP has been sufficient for me, too. I only use it sporadically these days, but GPT5.2 has been useful in helping me when there are 3-4 ways to do something and I simply want to know which is easiest and most efficient. Of course, it doesn’t talk in Majel Barrett’s voice yet…

          • Miles O'Brien@startrek.website · 1 point · 4 hours ago

            Of course, it doesn’t talk in Majel Barrett’s voice yet…

            That can be arranged. I’ve heard they recorded her saying a bunch of frequently used words/phrases, as well as phonetic sounds, so that her voice could be used in the future for the computer. I’m not sure how true that is, but I am a little afraid to look it up.

            I’ve always hoped to hear her in the new Trek shows.

            • JohnnyEnzyme@piefed.social · 1 point · 4 hours ago

              That reminds me… about 10-15 years ago, someone scripted and CG-animated all-new TOS episodes using audio samples of the original show and uploaded them to YT. Me, I found the scripts surprisingly strong and the audio surprisingly effective and smooth. Unfortunately, the animation was by far the weakest link, but that was a long time ago in terms of advances. I’d love to see a modern effort.

              Somewhat similarly, I love how these turned out:

              TNG in TAS style:
              https://www.youtube.com/watch?v=Jyz2pVqrEkI

              VOY in TAS style:
              https://www.youtube.com/watch?v=luEDui2zAUw

        • JohnnyEnzyme@piefed.social · 2 points · 5 hours ago

          Well yes, Windows is a pretty disastrous OS in the first place.

          FWIW, I’m hoping to get Linux installed at some point.

      • morto@piefed.social · 2 points · 5 hours ago

        but it was also true that I’d been using a 10-year-old machine, which is pretty ridiculous

        Why do you think it’s ridiculous to use a 10yo machine?

      • bklyn@piefed.social · 4 points · edited · 6 hours ago

        Interesting you should mention your old machine. I recently upgraded my 2016 MacBook Pro to an M4 Pro MBP. Photoshop ran fine on the old one. Great, actually. The only difference I noticed is that PS launches and opens files faster, buuuut… that’s about it. PS already ran fine on my 9-year-old MBP; the new machine didn’t improve much, other than the RAM, which lets me have more large docs open at once.

        My last machine only had 16 GB of RAM, whereas this new one has 48 GB (the max for the MBP). Still, the performance is pretty close, although my old machine would probably struggle if I had a bunch of large PSDs open.

        As an investment, I didn’t really have a choice: my old MBP died (SSD fried). I love this new machine, though. It’s very fast.

  • pr06lefs@lemmy.ml · 7 points · 6 hours ago

    Where I’m eyeing resource usage is in the cloud right now. I run a few discourse instances which seem really inefficient to me - 1.5G ram for just a discussion board. I have to dedicate a server for each one, whereas my rust web servers can have more like 30meg usage. Probably doing a lot less stuff, but still.