• Michal@programming.dev
    link
    fedilink
    arrow-up
    3
    ·
    1 hour ago

    PCs aren’t faster; they have more cores, so they can do more at a time, but it takes effort to optimize for parallel work. Also, the form factor keeps getting smaller, more people use laptops now, and you can’t cheat thermal efficiency.

  • mlg@lemmy.world
    link
    fedilink
    English
    arrow-up
    62
    ·
    13 hours ago

    The modern web is an insult to the idea of efficiency at practically every level.

    You cannot convince me that isolation and sandboxing require a fat 4 GB slice of RAM for a measly 4 tabs.

  • Flames5123@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    10
    arrow-down
    1
    ·
    14 hours ago

    For my home PC, sure. Running some Windows apps on my Linux machine in Wine is a little weird and sluggish. Discord is very oddly sluggish for known reasons. Proton is fine tho.

    But for my work? Nah. My M3 MacBook Pro is a beast compared to even the last Intel MacBook. Battery is way better, unless you’re like me and constantly running a front-end UI for a single local service. But without that, it can last hours. My old one could only last 2 meetings before it started dying.

    • prime_number_314159@lemmy.world
      link
      fedilink
      arrow-up
      9
      ·
      4 hours ago

      Apple put inadequate coolers in the later Intel MacBooks to make Apple Silicon feel faster by contrast. When I wake mine, loading the clock takes 1.5 seconds, and it flips back and forth between recognizing and not recognizing key presses in the password field for 12 seconds. Meanwhile, the ThinkPad T400 (running Arch, btw) that I had back in 2010 could boot in 8.5 seconds, without a blinking cursor that ignored key presses.

      Apple has done pretty well, but they aren’t immune from the performance massacre happening across the industry.

      The battery life is really good, though. I get 10-14 hours without trying to save battery life, which is easily enough to not worry about whether I have a way to charge for a day.

  • kamen@lemmy.world
    link
    fedilink
    English
    arrow-up
    16
    arrow-down
    1
    ·
    16 hours ago

    I paid for the whole amount of RAM, I’m gonna use the whole amount of RAM.

    /s

    Joke aside, the computer I used a little more than a decade ago used to take 1 minute just to display a single raw photo. I’m a liiiittle better off now.

    • brucethemoose@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      1 hour ago

      used to take 1 minute just to display a single raw photo

      See, that’s a great example!

      RAW processing (at least in that context) hasn’t really changed in 10 years. It’s probably the same code doing all the heavy lifting.

      But most software doesn’t have that benefit.

      • kamen@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        3 hours ago

        It was a Socket 754 Sempron, at a time when people were already running Core 2 Duos and Quads.

        • sip@programming.dev
          link
          fedilink
          arrow-up
          1
          ·
          edit-2
          5 hours ago

          sorry for making you feel old…er.

          i7 4th gen / Haswell was 13 years ago. I still use it.

          that Sempron is probably more than 17 years old.

          I had an Athlon XP 2000+, single core, OC’d to 2666 MHz with proper thermals

          • kamen@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            4 hours ago

            I got that PC in high school and had to run it a bit afterwards because I didn’t have the money for a new one. When eventually I got around to replacing it, I got an X99/Haswell-E system and it was a night and day difference.

  • AeonFelis@lemmy.world
    link
    fedilink
    arrow-up
    35
    arrow-down
    1
    ·
    19 hours ago

    Thought leaders spent the last couple of decades propagandizing that features-per-week is the only metric to optimize, and that any bit of efficiency or quality in your software is a clear indicator of a lost opportunity to sacrifice it on the altar of code churning.

    The result is not “amazing”. I’d be more amazed had it turned out differently.

    • SanicHegehog@lemmy.world
      link
      fedilink
      arrow-up
      22
      ·
      15 hours ago

      Fucking “features”. Can’t software just be finished? I bought App. App does exactly what I need it to do. Leave. It. Alone.

      • Yaky@slrpnk.net
        link
        fedilink
        arrow-up
        3
        ·
        3 hours ago

        No, never! Tech corps (both devs and app stores) have brainwashed people into thinking “no updates = bad”.

        Recently, I have seen people complain about a lack of updates for: the OS for a handheld emulation device (not the emulator, the OS, which does not have any glaring issues), and a Gemini protocol browser (the Gemini protocol is simple and has not changed since 2019 or so).

        Maybe these people don’t use the calculator app, because arithmetic has not been updated in a few thousand years.

        • vala@lemmy.dbzer0.com
          link
          fedilink
          arrow-up
          1
          ·
          1 hour ago

          A big part of this issue is mobile OS APIs. You can’t just finish an Android app and be done; it gets bit rot so fast. You get maybe 1-2 years with no updates before “this app was built for an older version of Android”, then “this app is not compatible with your device”.

    • ChickenLadyLovesLife@lemmy.world
      link
      fedilink
      English
      arrow-up
      19
      arrow-down
      1
      ·
      19 hours ago

      It’s kind of funny how eagerly we programmers criticize “premature optimization”, when often optimization is not premature at all but truly necessary. A related problem is that programmers often have top-of-the-line gear, so code that works acceptably well on their equipment is hideously slow when running on normal people’s machines. When I was managing my team, I would encourage people to develop on out-of-date devices (or at least test their code out on them once in a while).

      • AnyOldName3@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        3 hours ago

        Premature optimisation often makes things slower rather than faster. E.g. if something’s written to have the theoretical optimal Big O complexity class, that might only break even around a million elements, and be significantly slower for a hundred elements where everything fits in L1 and the simplest implemention possible is fine. If you don’t know the kind of situations the implementation will be used in yet, you can’t know whether the optimisation is really an optimisation. If it’s only used a few times on a few elements, then it doesn’t matter either way, but if it’s used loads but only ever on a small dataset, it can make things much worse.

        Also, it’s common that the things that end up being slow in software are things the developer didn’t expect to be slow (otherwise they’d have been careful to avoid them). Premature optimisation will only ever affect the things a developer expects to be slow.
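
        The break-even point is easy to check for yourself. A minimal Python sketch (toy sizes I made up; the point is measuring in your actual situation rather than trusting the complexity class alone):

```python
import timeit

def contains_linear(items, target):
    # Plain O(n) scan: no hashing, minimal machinery.
    for x in items:
        if x == target:
            return True
    return False

small = list(range(8))
small_set = set(small)

# Both strategies agree on the answer.
assert contains_linear(small, 5) == (5 in small_set)
assert contains_linear(small, 99) == (99 in small_set)

# Which one is faster depends on n and on constant factors,
# so time both at the sizes you actually expect to see.
t_scan = timeit.timeit(lambda: contains_linear(small, 7), number=50_000)
t_hash = timeit.timeit(lambda: 7 in small_set, number=50_000)
print(f"n=8: scan {t_scan:.4f}s vs set {t_hash:.4f}s")
```

        (In CPython the set tends to win even at tiny n, because the pure-Python loop pays interpreter overhead per element; in C or Rust the scan often wins at this size. Either way, you only find out by measuring.)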

      • G_M0N3Y_2503@lemmy.zip
        link
        fedilink
        arrow-up
        4
        arrow-down
        3
        ·
        18 hours ago

        Optimisation often has a cost, whether it’s code complexity, maintenance, or even just salary. So it has to be worth it, and there are many areas where it isn’t, unfortunately.

          • G_M0N3Y_2503@lemmy.zip
            link
            fedilink
            arrow-up
            1
            arrow-down
            1
            ·
            6 hours ago

            How is that mindset lazy? Unhappy customers also have a cost! At my last job the customer just always bought hardware specifically for the software as a matter of process, partly because the price of the hardware compared to the price of the software was negligible. You literally couldn’t make a customer care.

            • prime_number_314159@lemmy.world
              link
              fedilink
              arrow-up
              5
              ·
              4 hours ago

              In industrial software, I’m sure performance is a pretty stark line between “good enough” and “costing us money”.

              The pattern I’ve seen in customer facing software is a software backend will depend on some external service (e.g. postgres), then blame any slowness (and even stability issues…) on that other service. Each time I’ve been able to dig into a case like this, the developer has been lazy, not understanding how the external service works, or how to use it efficiently. For example, a coworker told me our postgres system was overloaded, because his select queries were taking too long, and he had already created indexes. When I examined his query, it wasn’t able to use any of the indexes he created, and it was querying without appropriate statistics, so it always did a full table scan. All but 2 of the indexes he made were unused, so I deleted those, then added a suitable extended statistics object, and an index his query could use. That made the query run thousands of times faster, sped up writes, and saved disk space.

              Most of the optimization I see is in algorithms, and most of the slowness I see is fundamentally misunderstanding what a program does and/or how a computer works.

              Slowness makes customers unhappy too, but with no solid line between “I have what I want” and “this product is inadequate”.
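
              The anecdote above is Postgres-specific, but the failure mode (“I created indexes” vs. “the planner can actually use them”) reproduces anywhere. A sketch using SQLite from Python’s standard library instead, with a made-up `orders` table, just to show a query plan flipping from a full scan to an index search, and how wrapping the column in a function defeats the plain index:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [(f"c{i % 100}", float(i)) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); join the details.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer = 'c7'"
before = plan(query)  # full table scan: "SCAN orders"
conn.execute("CREATE INDEX idx_customer ON orders (customer)")
after = plan(query)   # now: "SEARCH orders USING INDEX idx_customer ..."

# Wrapping the indexed column in a function makes the plain index unusable,
# the kind of mismatch described in the story above.
wrapped = plan("SELECT * FROM orders WHERE lower(customer) = 'c7'")

print(before)
print(after)
print(wrapped)
```

              Postgres has richer tooling (EXPLAIN ANALYZE, extended statistics, expression indexes), but the habit is the same: read the plan before blaming the database.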

            • Passerby6497@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              ·
              4 hours ago

              How is that mindset lazy?

              Are you really asking how it’s lazy to pass unoptimized code to a customer and make their hardware do all the work for you because optimization was too costly?? Like I get that you are in an Enterprise space, but this mentality is very prevalent and is why computers from today don’t feel that much faster software wise than they did 10 years ago. The faster hardware gets, the lazier devs can be because why optimize when they’ve got all those cycles and RAM available?

              And this isn’t a dig at you; that’s software development in general, and I don’t see it getting any better.

              • racemaniac@lemmy.dbzer0.com
                link
                fedilink
                arrow-up
                1
                ·
                4 hours ago

                It’s not just software development, it’s everywhere. Devices are cheap, people are expensive. So it’s not lazy, he’s being asked to put his expensive time into efforts the customer actually wants to pay for. If having him optimize the code further costs way more than buying a better computer, it doesn’t make sense economically for him to waste his time on that.

                Is that yet another example of how the economy has strange incentives? For sure, but that doesn’t make him lazy.

                • Passerby6497@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  2
                  ·
                  edit-2
                  4 hours ago

                  I never called them lazy, I stated that the mentality is lazy, which it is. Whether or not that laziness is profit driven, it still comes down to not wanting to put forth the effort to make a product that runs better.

                  Systemic laziness as profit generation is still laziness. We’re just excusing it with cost and shit, and if everyone is lazy, then no one is.

                  If cost is a justification for this kind of laziness, it also justifies slop code development. After all, it’s cheaper that way, right?

  • ssfckdt@lemmy.blahaj.zone
    link
    fedilink
    arrow-up
    15
    ·
    17 hours ago

    The program expands so as to fill the resources available for its execution

    – C.N. Parkinson (if he were alive today)

  • merc@sh.itjust.works
    link
    fedilink
    arrow-up
    20
    ·
    19 hours ago

    You do really feel this when you’re using old hardware.

    I have an iPad that’s maybe a decade old at this point. I’m using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don’t know if it’s the browser or the pages or both, but most web sites are unbearably slow, and some simply don’t work, javascript hangs and some elements simply never load. The device is too old to get OS updates, which means I can’t update some of the apps. But, that’s a good thing because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.

    • ssfckdt@lemmy.blahaj.zone
      link
      fedilink
      arrow-up
      20
      ·
      17 hours ago

      It’s the pages. It’s all the JavaScript, and especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering, and JS isn’t exactly a computationally modest language.

      Of the 200 kB loaded on a typical Wikipedia page, about 85 kB is JS and CSS.

      Another 45 kB goes to a single SVG, which in complex cases is a computationally nontrivial image format.

      • 87Six@lemmy.zip
        link
        fedilink
        arrow-up
        6
        ·
        16 hours ago

        I don’t agree. It’s both. I’ve opened basic no-JS sites on old tablets to test them out, and even those pages BARELY load

          • Passerby6497@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            ·
            12 hours ago

            Probably just the browser itself, considering how bloated they’re getting. It’s not super surprising: if a browser runs about as fast (on a good day) as it did 5-10 years ago on a new phone, it’s gonna run like dogshit on a phone from that era.

    • NecroParagon@midwest.social
      link
      fedilink
      arrow-up
      4
      ·
      18 hours ago

      I can’t update YouTube on my iPad 2, which I got running again for the first time in years. It said it had been ~70,000 hours since the last full charge. I wanted to use it to watch videos when I’m going to bed, but I can’t actually log in to YouTube because the app is so old, and I seemingly can’t update it.

      I was using the web browser and yeah I don’t remember it being so damn slow. It’s crazy how that is.

      • Yaky@slrpnk.net
        link
        fedilink
        arrow-up
        1
        ·
        3 hours ago

        Is your iPad on iOS 9.3.5? It is infamously slow.

        It is possible to downgrade it to 8.4.1 (faster, partially more broken) or even 6.1.3 (fast and old school, many apps don’t work, but there are apps in Cydia to fix stuff).

        The biggest issue I encountered is sites requiring TLSv1.3 for HTTPS, which the old browsers simply do not support.

      • merc@sh.itjust.works
        link
        fedilink
        arrow-up
        1
        ·
        16 hours ago

        I have an old YouTube app on my iPad, and it still works fine. One of the more responsive apps on the device. I get nagged nearly every time I use it to update to the newest YouTube release, but that’s impossible. I’d first have to upgrade my OS, and Apple no longer releases new OSes for this generation of iPads. So, I’m stuck with an old YouTube, which mostly works fine, and an occasional nag message.

        I’m sure within a year or two mine will be like yours and YouTube will simply no longer work. But, for now it’s in a relatively good spot where I can use a version of YouTube designed for this particular hardware that doesn’t feel sluggish.

  • bampop@lemmy.world
    link
    fedilink
    arrow-up
    105
    arrow-down
    1
    ·
    1 day ago

    My PC is 15 times faster than the one I had 10 years ago. It’s the same old PC but I got rid of Windows.

  • GenderNeutralBro@lemmy.sdf.org
    link
    fedilink
    English
    arrow-up
    85
    arrow-down
    2
    ·
    edit-2
    1 day ago

    Everything bad people said about web apps 20+ years ago has proved true.

    It’s like, great, now we have consistent cross-platform software. But it’s all bloated, slow, and only “consistent” with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.

    It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.

    But at least we’re not stuck with Windows-only admin consoles anymore, so that’s nice.

    All the advances in hardware performance have been used to make it faster (more to the point, “cheaper”) to develop software, not faster to run it.

  • DontRedditMyLemmy@lemmy.world
    link
    fedilink
    arrow-up
    44
    arrow-down
    1
    ·
    1 day ago

    I hate that our expectations have been lowered.

    2016: “oh, that app crashed?? Pick a different one!”

    2026: “oh, that app crashed again? They all crash, just start it again and cross your toes.”

    • wabasso@lemmy.ca
      link
      fedilink
      English
      arrow-up
      4
      ·
      20 hours ago

      I’m starting to develop a conspiracy theory that MS is trying to make the desktop experience so terrible that everyone switches to mobile devices, such that they can be more easily spied on.

        • Yaky@slrpnk.net
          link
          fedilink
          arrow-up
          3
          ·
          3 hours ago

          Windows Phone was around in the mid-2010s, at least 7 years after the iPhone’s release. But it was not hyped enough: companies did not care to develop apps for it, and customers didn’t want a smartphone without X Y Z apps (the same argument I see now about mobile Linux or even custom ROMs). The phones had a nice and fast UI though, and some had very good cameras.

          • ChickenLadyLovesLife@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            3 hours ago

            Windows Phone was great. I’d done Windows Mobile since 2005 and it was nice to be able to continue developing with C#/.NET and Visual Studio (back when it was still good) in a more modern OS. One thing that really spoiled me permanently was being able to compile, build and deploy the app I was working on to my test device effectively instantaneously – like, by the time I’d moved my hand over to the device, the app was already up and running. Then I switched to iOS where the same process could take minutes, also Blackberry where it might take half an hour or never happen at all.

            Funny thing: RIM was going around circa 2010/2011 offering companies cash bounties of $10K to $20K to develop apps for Blackberry, since they were dying a rapid death but were still flush with cash. Nobody that I know of took them up on the offers. I tried to get my company to make a Windows Phone version of our software but I was laughed at (and deservedly so).

  • brotato@slrpnk.net
    link
    fedilink
    arrow-up
    123
    arrow-down
    3
    ·
    1 day ago

    The tech debt problem will keep getting worse as product teams keep promising more in less time. Keep making developers move faster. I’m sure nothing bad will come of it.

    Capitalism truly ruins everything good and pure. I used to love writing clean code and now it’s just “prompt this AI to spit out sloppy code that mostly works so you can focus on what really matters… meetings!”

  • Valmond@lemmy.dbzer0.com
    link
    fedilink
    arrow-up
    18
    ·
    21 hours ago

    Had to install an old (2019, mind you) Visual Studio on Windows…

    First, it’s like 30 GB. What the hell?? It’s an advanced text editor with a compiler and some …

    It crashed a little less than I remember 🥴😁

      • shynoise@lemmy.world
        link
        fedilink
        arrow-up
        8
        arrow-down
        1
        ·
        edit-2
        20 hours ago

        OP was clearly using a rhetorical reduction to make a point that VS is bloated.

      • Valmond@lemmy.dbzer0.com
        link
        fedilink
        arrow-up
        3
        ·
        20 hours ago

        Visual Code is another project; Visual Studio is indeed an IDE, but it integrates it all. VS Code is also an integrated development environment. I don’t really know what more to say.

        • The Stoned Hacker@lemmy.world
          link
          fedilink
          arrow-up
          7
          ·
          18 hours ago

          VS Code is considered a highly extensible text editor that can be used as an IDE, especially for web-based tools, but it isn’t an IDE. It’s more comparable to Neovim or Emacs than to IntelliJ in terms of the role it’s supposed to fill. Technically. VS Code definitely is used more as an IDE by most people, and those people are weak imo. I’m not one to shill for companies (I promise this isn’t astroturf), but if you need to write code, JetBrains probably has the best IDE for that language. Not always true, but more often than not it is imo.

          • Valmond@lemmy.dbzer0.com
            link
            fedilink
            arrow-up
            2
            ·
            8 hours ago

            Ooh, a flame war 🔥🔥🔥 ! It has been so long since I was involved in one, thank you 🙋🏻‍♀️! 😊

            Who uses Visual Code for anything other than writing and launching code? I only use it for C#/Godot on Linux, but it has all the bells and whistles to make it an IDE IMO (BTW anyone who doesn’t code in C/C++ is weak ofc ☺️! 🔥).

            Let me just add that JetBrains (at least PyCharm) has started its enshittification death cycle, and I’m looking for a lightweight Python IDE that doesn’t hallucinate (but lets you use venv and debug); if you have any ideas, I’m listening!

            Cheers

            • The Stoned Hacker@lemmy.world
              link
              fedilink
              arrow-up
              2
              ·
              7 hours ago

              I wanna clarify that when I say VS Code, I’m talking about Visual Studio Code. I was only commenting on the difference between Visual Studio and Visual Studio Code because you said you downloaded Visual Studio and were confused why a text editor was 30 GB; it’s possible you downloaded the IDE rather than the text editor. I apologize if you thought I was talking about Visual Code; I wasn’t.

              And I agree that JetBrains has started to enshittify, but I also think their enshittification has been pretty slow, because they sell professional tools that still have to perform the basic functionality of an IDE. And for the most part I’ve been able to disable all AI features save the ones I’m required to use at work (yay AI usage metrics ;-;)

  • kunaltyagi@programming.dev
    link
    fedilink
    arrow-up
    70
    arrow-down
    1
    ·
    1 day ago

    The same? Try worse. Most devices have seen input latency go up, and most applications have higher latency after input as well.

    Switching from an old system with old UI to a new system sometimes feels like molasses.

    • Korhaka@sopuli.xyz
      link
      fedilink
      English
      arrow-up
      20
      ·
      1 day ago

      I work in support for a SaaS product, and every single click on the platform takes a noticeable amount of time. I don’t understand why anyone is paying any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it’s far more responsive.

    • Buddahriffic@lemmy.world
      link
      fedilink
      arrow-up
      13
      ·
      1 day ago

      Except for KDE. At least compared to cinnamon, I find KDE much more responsive.

      AI-generated code will make things worse. It’s good at providing solutions that generally give the correct output, but the code it generates tends to be shit by final-product standards.

      Though perhaps performance will improve, since at least the AI isn’t limited to only knowing JavaScript.

      • boonhet@sopuli.xyz
        link
        fedilink
        arrow-up
        4
        ·
        1 day ago

        I still have no idea what it is, but over time my computer, which runs KDE, gets super slow and I HAVE to restart. Even if I close all applications it’s still slow.

        It’s one reason I’ve been considering upgrading from 6 cores and 32 GB to 16 and 64.

        • rumba@lemmy.zip
          link
          fedilink
          English
          arrow-up
          8
          ·
          23 hours ago

          An upgrade isn’t likely to help. If KDE is struggling on 6 cores / 32 GB, something else is going on, and 16 / 64 will only make it last twice as long before choking.

          Wait till it’s slow.

          Check your RAM/CPU in top and the disk in iotop; hammering the disk/CPU (or a bad disk/SSD) can make KDE feel slow.

          plasmashell --replace # this just dumps plasmashell’s widgets/panels

          See if you got a lot of RAM/CPU back or it’s running well; if so, it might be a bad widget or panel

          if it’s still slow,

          kwin_x11 --replace

          or

          kwin_wayland --replace &

          This dumps everything and refreshes the graphics driver/compositor/window manager

          If that makes it better, you’re likely looking at a graphics driver issue

          I’ve seen some stuff where going to sleep and coming out degrades perf

          • Passerby6497@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            12 hours ago

            I’ve seen some stuff where going to sleep and coming out degrades perf

            I’ll have to try some of these suggestions myself, as I’ve been dealing with my UI locking up if the monitors turn off and I wake it up too soon. Sometimes I still have ssh access to it, so thanks for the shell commands!

            • rumba@lemmy.zip
              link
              fedilink
              English
              arrow-up
              1
              ·
              10 minutes ago

              I was doing horrible things the other day and ended up with my KDE login page not working when I came out of sleep.

              CTRL+ALT+F2 > text login > loginctl unlock-sessions

          • boonhet@sopuli.xyz
            link
            fedilink
            arrow-up
            2
            ·
            23 hours ago

            Hmm, I haven’t noticed high CPU usage, but usually it only leaves me around 500 MB of actually free RAM; basically the entire rest is either in use or cache (often about 15 gigs of cache). Turning on the 64-gig swapfile usually still leaves me with close to no free RAM.

            I’ll see if it’s slow already when I get home, I restarted yesterday. Then I’ll try the tricks you suggested. For all I know maybe it’s not even KDE itself.

            Root and home are on separate NVMe drives and there’s a SATA SSD for misc non-system stuff.

            GPU is nvidia 3060ti with latest proprietary drivers.

            The PC does not sleep at all.

            To be fair I also want to upgrade to speed up Rust compilation when working on side projects and because I often have to store 40-50 gigs in tmpfs and would prefer it to be entirely in RAM so it’s faster to both write and read.

            • rumba@lemmy.zip
              link
              fedilink
              English
              arrow-up
              5
              ·
              23 hours ago

              Don’t let me stop you from upgrading, that’s got loads of upsides. Just suspecting you still have something else to fix before you’ll really get to use it :)

              It CAN be OK to have very low free RAM if it’s used up by buffers/cache (freeable). If buff/cache gets below about 3 GB on most systems, you’ll start to struggle.

              If you have 16GB, it’s running low, and you can’t account for it in top, you have something leaking somewhere.
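
              The free-vs-freeable distinction above is exactly what the kernel’s MemAvailable field in /proc/meminfo encodes. A small Python sketch parsing a /proc/meminfo-style sample (numbers invented for illustration) shows why MemFree alone looks scarier than it is:

```python
# "Free" RAM understates what's usable, because buffers/cache can be
# dropped on demand. MemAvailable is the kernel's estimate of what is
# really available to new allocations.
def parse_meminfo(text):
    fields = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        fields[key.strip()] = int(rest.split()[0])  # values are in kB
    return fields

sample = """MemTotal:       16384000 kB
MemFree:          512000 kB
MemAvailable:   12288000 kB
Buffers:          800000 kB
Cached:         11000000 kB"""

m = parse_meminfo(sample)
# Only ~0.5 GB "free", but ~12 GB actually available once cache is dropped.
print(m["MemFree"] // 1024, "MB free;", m["MemAvailable"] // 1024, "MB available")
```

              On a real box you’d feed it open("/proc/meminfo").read(); a leak shows up as MemAvailable shrinking even after closing apps, which matches the symptom described above.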

              • boonhet@sopuli.xyz
                link
                fedilink
                arrow-up
                4
                ·
                20 hours ago

                Lol I sorted top by memory usage and realized I’m using 12 gigs on an LLM I was playing around with to get local code completion in my JetBrains IDE. It didn’t work all that well anyway and I forgot to disable it.

                I did have similar issues before this too, but I imagine blowing 12 gigs on an LLM must’ve exacerbated things. I’m wondering how long I can go now before I’m starting to run out of memory again. Though I was still sitting at 7 gigs buffer/cache and it hadn’t slowed down yet.

                • rumba@lemmy.zip
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  20 hours ago

                  12/16, That’ll do it. Hopefully that’s all, good luck out there and happy KDE’ing

        • dr_robotBones@reddthat.com
          link
          fedilink
          arrow-up
          1
          ·
          21 hours ago

          Have you gone through settings and disabled unnecessary effects, indexing and such? With default settings it can get quite slow but with some small changes it becomes very snappy.

          • AdrianTheFrog@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            14 hours ago

            I have a 2 core, 2 thread, 4gb RAM 3855u Chromebook that I installed Plasma on, and it’s usually pretty responsive.

          • boonhet@sopuli.xyz
            link
            fedilink
            arrow-up
            1
            ·
            20 hours ago

            I have not, but also it’s not slow immediately, it takes time under use to get slow. Fresh boot is quite fast. And then once it’s slow, even if I close my IDE, browsers and everything, it remains slow, even if CPU usage is really low and there’s theoretically plenty of memory that could be freed easily.

        • arendjr@programming.dev
          link
          fedilink
          arrow-up
          1
          ·
          22 hours ago

          Have you tried disabling the file indexing service? I think it’s called Baloo?

          Usually it doesn’t have too much overhead, but in combination with certain workflows it could be a bottleneck.

  • Bwaz@lemmy.world
    link
    fedilink
    English
    arrow-up
    3
    ·
    15 hours ago

    That’s not “programmer_humor”, that’s an absolute fact. I had to go on an ancient laptop here for an old lost file recently, and it was WAY faster than a new, ultra-speedy, decked-out recent-build Win12 machine. WTAF??

  • GreenShimada@lemmy.world
    link
    fedilink
    arrow-up
    262
    arrow-down
    2
    ·
    1 day ago

    For anyone unsure: Jevons’ Paradox is that when there’s more of a resource to consume, humans will consume more of the resource rather than bank the gains by using it better.

    Case in point: AI models could be written to be more efficient in token use (see DeepSeek), but instead AI companies just buy up all the GPUs and shove more compute in.

    As for the expansive bloat, the same goes for phones. Our phones are orders of magnitude better than they were 10 years ago, and now they’re loaded with bloat because the manufacturer thinks “Well, there’s more compute and memory. Let’s shove more bloat in there!”

    • GamingChairModel@lemmy.world
      link
      fedilink
      arrow-up
      28
      ·
      1 day ago

      Jevons’ Paradox is that when there’s more of a resource to consume, humans will consume more resource rather than make the gains to use the resource better.

      More specifically, it’s when an improvement in efficiency causes the underlying resource to be used more, because the efficiency reduces cost, and using the resource then becomes even more economically attractive.

      So when factories got more efficient at using coal in the 19th century, England saw a huge increase in coal demand, despite using less coal for any given task.
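
      The coal story can be put into toy numbers. A sketch with a constant-elasticity demand curve (all parameters invented for illustration, not an economic model of anything real): when demand is elastic enough, doubling efficiency increases total resource use; when demand is inelastic, efficiency saves resources as you’d expect:

```python
def resource_use(efficiency, base_demand=100.0, elasticity=1.5):
    # Cost per task falls as efficiency rises.
    cost_per_task = 1.0 / efficiency
    # Constant-elasticity demand: cheaper tasks -> more tasks demanded.
    tasks = base_demand * cost_per_task ** (-elasticity)
    # Total resource burned = tasks x resource-per-task.
    return tasks * cost_per_task

# Elastic demand (elasticity > 1): doubling efficiency raises total
# consumption from 100 to ~141 -- Jevons' paradox in miniature.
print(resource_use(1.0), resource_use(2.0))

# Inelastic demand (elasticity < 1): doubling efficiency cuts use to ~71.
print(resource_use(1.0, elasticity=0.5), resource_use(2.0, elasticity=0.5))
```

      The crossover is at elasticity = 1: above it, efficiency gains are more than eaten by induced demand.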

      • Quetzalcutlass@lemmy.world
        link
        fedilink
        English
        arrow-up
        12
        ·
        21 hours ago

        Also Eli Whitney inventing the cotton gin to make extracting cotton less of a tedious and backbreaking process, which led to a massive expansion of slave plantations in the American South due to the increased output and profitability of the crop.

      • shrugs@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        14 hours ago

        This happens not only with efficiency gains. There is also risk compensation, which feels kinda the same: cars that are safer encourage more reckless driving, which in turn makes accidents more frequent and eats into the safety gains.

    • VibeSurgeon@piefed.social
      link
      fedilink
      English
      arrow-up
      80
      arrow-down
      1
      ·
      1 day ago

      Case in point: AI models could be written to be more efficient in token use

      They are being written to be more efficient in inference, but the gains are being offset by trying to wring more capabilities out of the models by ballooning token use.

      Which is indeed a form of Jevon’s paradox

      • errer@lemmy.world
        link
        fedilink
        English
        arrow-up
        32
        ·
        1 day ago

        Costs have been dropping by a factor of 3 per year, but token use increased 40x over the same period. So while the efficiency is contributing a bit to the use, the use is exploding even faster.

    • frunch@lemmy.world
      link
      fedilink
      arrow-up
      22
      ·
      1 day ago

      I always felt American car companies were a really good example of that back in the 60s-70s when enormously long vehicles with giant engines were the order of the day. Why not bigger? Why not stronger? It also acted as a symbol of American strength, which was being measured by raw power just like today lol.

      This also reminds me of the way video game programmers in the late 70s/early 80s had such tight limitations to work within that you had to get creative if you wanted to make something stand out. Some very interesting stories from that era.

      I also love to think about the tricks the programmer of Prince of Persia had employed to get the “shadow prince” to work…

      https://www.youtube.com/watch?v=sw0VfmXKq54