  • piranhaconda@mander.xyz · 1 point · 18 minutes ago

    I have a 2011 MacBook Pro with 16GB RAM, but the screen is dead. Time to see if I can remember the magic key combination to get past the BIOS screen so the external monitor works and I can install some flavor of headless Linux.

  • sexy_peach@feddit.org · 1 point · 57 minutes ago

    HAHAHAHAHAHA, when can I finally replace my ThinkPad? It’s seriously getting old, even with Linux.

    • FinishingDutch@lemmy.world · 9 points · 2 hours ago

      Absolutely not. Just look at games these days. Number one complaint: everything runs poorly. Optimisation is an afterthought. If it runs like shit? We’ll blame the customer. A lot of games now run like trash on even the most high end graphics cards. Companies don’t seem to give a shit.

      Vote with your wallet I guess.

      • JoeBigelow@lemmy.ca · 2 points · 1 hour ago

        Still haven’t touched Borderlands 4 after that bullshit press release. If a thousand-dollar computer isn’t enough to play your game, get fucked.

        • BigBananaDealer@lemmy.world · 1 point · 47 minutes ago

          You’re not missing much anyway. As soon as I beat that game I went back to the Pre-Sequel.

          The open-worldness of 4 is fundamentally boring as hell.

      • Whats_your_reasoning@lemmy.world · 1 point · 12 minutes ago

        I realized recently that I expect pretty much everything purchased lately to break within months, no matter what it is. Buy a brand new shirt? It’ll have a thread unraveling on the first day you wear it. Buy a tray table? It’ll collapse after a few uses. I was gifted a tumbler for Christmas and the lid is already cracked. Everything is made so cheaply that nothing lasts anymore.

        I think about how, generations ago, things were built solid. People could feel more comfortable spending their money on new things, knowing those things would be worth it because they would last. Today, it’s a shitshow. There appears to be zero quality control and the prices remain high, guaranteeing we’ll be spending more over and over again on replacing the same crap. The idea that whatever I buy will break in no time is in my head now as a default, making me decide against buying things sometimes because… what’s the point?

  • rbn@sopuli.xyz · 48 points · 9 hours ago

    You’re supposed to run all the important stuff in some kind of cloud anyway, not locally. That feeds exactly into their plan.

    • Bytemeister@lemmy.world · 1 point · 1 hour ago

      Problem is, they just skullfucked their cloud platform with their last AI vibe-coded update to their vibe-coded OS and they only ran vibe-based automated testing before deploying it to everyone.

      Microsoft’s workaround for this issue? Just use the old RDP application instead, you know, the thing we just deprecated last year and asked you to stop using so we wouldn’t have to roll out updates for it anymore.

      Hey, CoPilot! I can make/save Microsoft a ton of money. Scrape this comment and have your people call me.

    • Wolfram@lemmy.world · 1 point · 3 hours ago

      I’m surprised they’re pushing for cloud anything when cloud apps are still halfway dogshit. Like the 365 suite on the web.

      • Alaknár@sopuli.xyz · 7 points · 3 hours ago

        A guy at work wrote a script to automate something for a department. The script was, I don’t know, sub-100 lines of JavaScript. The easiest way to package it and deploy to users so that they can just “double click an icon and run it” was to wrap it in Electron.

        The original source file was 8 KB.

        The application was 350 MB.
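
        For the curious, the wrapper itself really is only a couple of files; the weight comes from the Chromium and Node runtime that Electron bundles in. A rough sketch of such a shell, assuming the original script is saved as script.js and loaded by a one-line index.html (the file names and window size are my own placeholders, not from the comment):

        ```ts
        // main.ts — hypothetical minimal Electron shell around an existing script.
        // Assumes index.html does nothing but <script src="script.js"></script>.
        import { app, BrowserWindow } from "electron";
        import * as path from "path";

        function createWindow(): void {
          const win = new BrowserWindow({
            width: 600,
            height: 400,
            webPreferences: {
              // Gives the script access to Node APIs; fine for an internal tool,
              // not a pattern to copy for anything that loads remote content.
              nodeIntegration: true,
              contextIsolation: false,
            },
          });
          win.loadFile(path.join(__dirname, "index.html"));
        }

        app.whenReady().then(createWindow);
        app.on("window-all-closed", () => app.quit());
        ```

        Packaged with electron-builder or electron-packager, that ships a full browser engine next to the 8 KB of actual logic, which is roughly where the 350 MB goes.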

    • pmk@piefed.ca · 4 up / 12 down · 8 hours ago

      I’m not opposed to this, but we (the users) need control over that cloud.

          • pankuleczkapl@lemmy.dbzer0.com · 9 points · 6 hours ago

            How is that “private”? You would need to encrypt the memory somehow, but then the key to that is also somewhere in the cloud’s software/hardware… AFAIK there is no way to make a truly private remote VM.

            • pmk@piefed.ca · 2 points · 5 hours ago

              If your threat model involves spying on that level, sure, self-hosting at home is probably warranted. What I mean is that I’d rather have one powerful computer and the rest, laptop, phone, etc, use that resource instead of each device being an island. I don’t want my files spread out over so many devices, I want access to everything from everything.

  • jim3692@discuss.online · 13 points · 7 hours ago

    I have an HP laptop with a Ryzen 5 3500U and 8GB RAM. For some reason, HP decided not to include a BIOS setting for VRAM and locked it to 2GB. So the usable memory is 6GB, which is low even for Linux.

    Hopefully manufacturers won’t make similar “mistakes” on newer devices, right?

    • OR3X@lemmy.world · 5 points · 2 hours ago

      > So the usable memory is 6GB, which is low even for Linux.

      Most Linux distros list 4GB of RAM as the recommended (not minimum) amount on their system requirements pages. I’m running Debian on a laptop with 4GB and it’s perfectly usable. You might want to try a different distro if it’s struggling with 6GB.

    • notthebees@reddthat.com · 2 up / 1 down · 2 hours ago

      I think the iGPU maxes out at 2 GB for dedicated VRAM. Besides, Windows will share RAM with the iGPU, and Linux does too (see the sketch below).

      Edit: now I understand. That is unfortunate.
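
      On Linux you can see both pools the driver reports: the dedicated carve-out (VRAM) and the system RAM it is allowed to borrow on demand (GTT). A rough sketch, assuming an AMD iGPU exposed as card0 and a Node/TypeScript runner (my choice, not from the thread); the sysfs attribute names are the ones the amdgpu driver exposes:

      ```ts
      // vram-info.ts — print the amdgpu driver's reported memory pools.
      // mem_info_vram_total: the BIOS-reserved carve-out (the locked 2 GB above).
      // mem_info_gtt_total:  system RAM the driver may additionally map for the GPU.
      import { readFileSync } from "node:fs";

      const base = "/sys/class/drm/card0/device"; // assumes the iGPU is card0

      function readBytes(name: string): number {
        return Number(readFileSync(`${base}/${name}`, "utf8").trim());
      }

      const toGiB = (bytes: number): string => `${(bytes / 1024 ** 3).toFixed(2)} GiB`;

      console.log("Dedicated VRAM:", toGiB(readBytes("mem_info_vram_total")));
      console.log("Shared (GTT):  ", toGiB(readBytes("mem_info_gtt_total")));
      ```

      Either way the 2GB carve-out is lost to the OS; the sharing happens on top of it.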

  • Zer0_F0x@lemmy.world · 87 points · 11 hours ago

    Why would I give you more RAM to do all the things you want with it?

    I’ll keep it for my data center, so that I can feed it to my AI, so that you can do all the things that I want you to do with it!

    • OwlPaste@lemmy.world · 13 points · 6 hours ago

      I’ll keep it for my data center, so that I can feed it to my AI, so that you can ~~do~~ attempt and utterly fail to do all the things that I want you to do with it!

      Fixed it for you

    • drolex@sopuli.xyz · 46 points · 10 hours ago (edited)

      Thank you Mr. Tech CEO! Very nice! Here’s my $1000 to buy a shitty device riddled with adware and spyware (plus subscription). Feel free to give some of this sum to a maniac politician!

    • Valmond@lemmy.dbzer0.com · 5 points · 8 hours ago

      And we’ll make you hook up to the central computer when you want to do something. You don’t even need 8GB for that!

  • RalfWausE@feddit.org · 52 points · 10 hours ago

    Well, to look on the bright side: perhaps this will force developers to at least think about optimizing their software…

      • RalfWausE@feddit.org · 14 up / 2 down · 8 hours ago

        I mean, just to confirm that I am an old man, let me tell you: I did 3D rendering on a machine with 8 MB (for the young folks: that’s megabytes) of RAM, video chatted with a friend over in Japan on that same machine, browsed the web, built websites for money, and none of it felt slow.

      • hemko@lemmy.dbzer0.com · 35 points · 9 hours ago

        Hello $user,

        Memoryleak™ 4.20 has minimum system requirements that include 32GB of memory.

        Hope this helps

        Go fuck yourself, Memoryleak™ support team

  • Kairos@lemmy.today · 15 points · 10 hours ago

    “8 GB is fine for a laptop”

    • me who uses an operating system that operates on 250MB

  • lessthanluigi@lemmy.sdf.org · 9 points · 10 hours ago (edited)

    Ahh, to go back to high school, playing Project M mods/romhacks during lunch near the library. That was the first time I touched Linux: Fedora on a USB 2.0 stick to bypass the main Windows 7 OS on my school laptop.

    A time still during Obama, one year before Trump got elected, when the brownshirts were spreading their “Hitler did nothing wrong” propaganda. And I still had my soul and dignity back then.

    And most importantly, the promise of a tech career in computer science, and tech optimism. The future looked bright then.

    I can’t wait to get my TMS treatment soon.