  • NannerBanner@literature.cafe · 38 points · 2 days ago

    Lol, I feel bad for anyone new to the PC building community. At this point, at least those of us with 10+ year-old computers can play most of the indie games coming out. I AM still surprised by how intensive some games can be when they look like Minecraft downgrades.

    • lightnsfw@reddthat.com · 15 points · 2 days ago

      Kind of funny story: I launched Stardew Valley yesterday and my displays absolutely shit themselves, even though my graphics card is pretty new. Turned out that Nvidia’s stupid app had changed the display settings to something weird. I had to manually flip it back to borderless and that fixed it, but at first I was like “how, out of everything I’ve played, is this the one having problems?”

      • 87Six@lemmy.zip · 3 points · 1 day ago

        Thx, I’ll send this to my bro who still thinks AMD is the one that makes shit drivers

      • AmbientChaos@sh.itjust.works · 3 points · 2 days ago

        My wife started Stardew Valley the other day and we also had display issues trying to output 4K. We still had to max out the zoom, and even then the dialog boxes get cut off until you zoom out. Unlucky.

        • NannerBanner@literature.cafe · 2 points · 2 days ago

          Probably not the issue, but if you’re outputting 4K, is it to a TV? I had an issue with a big screen having its own weird zoom settings (the old widescreen/cinema/whatever).

          • AmbientChaos@sh.itjust.works · 1 point · 23 hours ago

            We’re outputting to a projector, but it’s the UI itself that’s getting cut off, unfortunately. I’m sure there’s a fix out there somewhere for a game this popular, I just need to do some digging 😁

  • trashcroissant@lemmy.blahaj.zone · 84 points · 2 days ago

    Jesus, I had to do a double take because I thought the stick person had somehow trapped a little human inside a pod for their entertainment and I was so confused.

  • bigchungus@piefed.blahaj.zone · 33 points · 2 days ago

    You see, the big mistake in 2029 was the person installing Windows. Now they can see the horrible data center right outside of their house. As they say, out of sight, out of mind.

  • real_squids@sopuli.xyz · 35 points · 2 days ago

    You can pry my “double” slot GPU out of my cold dead hands (good luck trying to run away with it, it’s heavy as fuck and needs a supporting post)

    • autriyo@feddit.org · 17 points · 2 days ago

      Tbh, the whole card format feels very legacy, even for my Vega 56 “dual slot” card, and that thing “only” consumes ~230W.

      If ppl back then could’ve foreseen what obscenely power-hungry parts would be shoehorned into the expansion card format, they probably would’ve chosen a different approach for GPUs specifically.

      • kibiz0r@midwest.social · 11 points · 2 days ago

        iGPUs should have been a better option, but they were hamstrung by PCI conventions and graphics APIs favoring discrete VRAM.

        (Just look at how x86 SoC consoles run circles around similar-spec PCs.)

        I’m hoping that ARM is a chance to reset.

        • autriyo@feddit.org · 3 points · 2 days ago

          I do like the modularity of discrete GPUs though.

          But a cooling setup similar to CPUs would’ve been better for airflow.

          • kibiz0r@midwest.social · 5 points · 2 days ago

            Modularity is nice — both for personal preference reasons and incentivizing-market-competition reasons — but it does come at a cost.

            The thing is: even in our modular world right now, you don’t really have many choices. Two CPU companies, three GPU companies (two of them being the same as the CPU companies)…

            We could someday have a world where PC hardware is technically less modular than it is today but consumers have more choices in the marketplace than they do today.

          • real_squids@sopuli.xyz · 1 point · 2 days ago

            Kinda hard to do when so many GPU vendors slap their memory and power circuitry all over the place. Even if the die is in the same spot, cooler manufacturers would need to test-fit a bajillion models, and on top of that they’d need insane R&D budgets to keep up with new releases, some of which come out years after the original GPU.

            • autriyo@feddit.org · 2 points · 2 days ago

              My concerns were more about the airflow path and less about actually interchangeable coolers, although those are a thing.

              Like, it just feels wrong to blast the air into a solid PCB… That’s kind of solved with flow-through designs, but a tower-style cooler would probably be less noisy.

      • real_squids@sopuli.xyz · 2 points · edited · 2 days ago

        I was joking about it being 2.5 slots; tbf, most modern cards should be triple-slot. Mine is 300W and it’s pretty chunky to stay below 60°C. The best option for big cards is a horizontal mobo, imo.

        edit: unpopular opinion, but I’d rather have a chunky card that stays cool as fuck than a slim one. That’s why I picked up the Nitro when I had a 6650 XT.

  • rose56@lemmy.zip · 10 points · 2 days ago

    I guess we won’t do anything about that, even though we have the power in our hands.

  • wraekscadu@vargar.org · 6 up / 2 down · 2 days ago

    Supply will catch up with demand. High PC component prices are a temporary thing.

    • Skullgrid@lemmy.world · 20 points · 2 days ago

      “Supply will catch up with demand. High PC component prices are a temporary thing.”

      we’ve been saying that about housing since 2008.

    • Wintry@lemmy.blahaj.zone · 5 points · 2 days ago

      Maria dear, I think this may be your one character flaw. AI is a huge force for evil. It’s a tool, and sure, tools can be useful, but the way it’s being wielded is to the detriment of humanity. It’s not a cute novelty anymore. It’s not being used appropriately at any scale.

      I know you are fond of some of the silly and fun use cases, and it probably still has limited ethical uses. But as it stands, the power of compute is being used for propaganda, repression of people, and genocide; to dumb down and pacify the population. And with the way hardware is going, the corporate fascists may kill personal computing.

      It’s not worth discussing the good it might be able to do when it’s actively harmful to humanity and only helpful to the controlling and ruling class. It’s really sweet and endearing that you are so optimistic, but sadly it’s an evil world, and the point of this technology is to serve the interests of bad people. Any good it can do will be for their profit and control, not for the lives, well-being, and freedom of all of us.

      • maria [she/her]@lemmy.blahaj.zone · 1 point · edited · 54 minutes ago

        i thought about how to reply to this a bunch… because like - ur totally right-- but also i felt somewhat unnecessarily attacked >~<

        the controversial post i would have made would have had the following content:

        content of post

        i would have drawn an angry man hitting a computer, with the text of my comment:

        somethingsomething

        AI BAD!

        in a roundabout kind of way ~
        with bottom text

        (now go and like my post)

        the goal would be to make it look as if it were generated using openai’s image gen 1 model, so i would have

        • looked at some generations from that model to match that cartoon style
        • made sure to make some of the text slightly mangled
        • overlaid a tea-color tint and a dense noise filter, which that model is known for (see the lil sketch after this list)
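
        (a lil pillow sketch of that last step, purely illustrative - the file name, tint color and strength numbers are placeholders i made up:)

        ```python
        # rough sketch: warm "tea" tint + dense noise over a finished drawing
        import numpy as np
        from PIL import Image

        # "drawing.png" is a placeholder path for the hand-made image
        img = np.asarray(Image.open("drawing.png").convert("RGB"), dtype=np.float32)

        tea = np.array([235, 205, 160], dtype=np.float32)  # warm sepia-ish tone (guessed)
        img = img * 0.85 + tea * 0.15                      # blend the tint over the whole image

        noise = np.random.default_rng().normal(0.0, 12.0, img.shape)  # dense gaussian noise
        img = np.clip(img + noise, 0, 255).astype(np.uint8)

        Image.fromarray(img).save("totally-ai.png")
        ```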

        similar to how another post tried it, this one would essentially be bait for getting flagged as ai… dunno why, but i thought that was funny. it also would have made fun of people actually posting their low-quality slop made with the laziest prompting.

        it also would have played on the “hate-post” thing, which tends to amass looooots of likes and stuff ~
        (see all the “the bubble is bursting” and “ai makes u dumb” videos)


        sketch of post

        a sketch i made just for this comment <3


        You are absolutely right ✅: the stuff isn’t a fun novelty anymore, and it’s primarily being used by bad people for bad things, now apparently even in war.

        so like… it’s not like i’m pro-ai (see this post). i’m very much aware of the evil and the bad happening, more so than most anti-ai peeps, i believe.

        i’ve been around the bubble because i find the topic intellectually stimulating. the idea of curating some data to let a prediction mechanism grab onto those patterns and predict the following sequences just seems like an interesting engineering process, especially when it comes to behavioural analysis.
        but over time i’ve been disappointed, as the majority of that bubble is made up of disgusting people, business people and spammers, with only a few people interested in doing fun research.

        over the last year or so, i’ve been moving away from the ai bubble: removing discords, bookmarks, yt channels and even local language models from my devices (besides my phone’s keyboard auto-correct… i won’t let u peeps take that away from me!)

        i did post plenty of stuff on lemmy concerning the topic, but like - all of em were either jokes, technical stuff or ideas (correct me if i’m wrong, i may have forgotten something)

        i do have a character flaw, but not the one you portrayed:

        the character flaw i believe i have

        when i see a topic being dominated by one opinion, i usually want to know where the other side is coming from. why do people vote for trump? why would anyone be pro-ai? who in their right mind would not want to at least try pineapple on pizza? then once i understand the other side (which is usually more nuanced than the “right side” likes to imagine), it makes a lot more sense to me.

        this is where imma leave it i think… might still make that post tho, for the funzies-

        (yes, the comment was generated entirely by a human brain connected to a human body)

        EDIT: for anyone who’s not aware: i don’t dabble with image, video and audio generation stuff… not my thing.