• Narrrz@kbin.social · 40 points · 1 year ago

    the irony of the first panel being a guy at his home computer, the second, at a work computer

  • TrustingZebra@lemmy.one · 31 points · 1 year ago

    Working from home has killed PC gaming for me. I have no desire to sit at my desk after a full work day.

    • Waker@lemmy.ml · 13 points · 1 year ago (edited)

      Same. I can’t even be in the same room anymore. Somehow made gaming there feel like a chore as well…

      I'm so bored of games that I'll go work on some other stuff (home servers) and just take care of day-to-day life instead. Maybe I'm just growing up, but I never thought I'd be bored of games… Ever…

      I still prefer working remotely though!

      • Obi@sopuli.xyz · 3 points · 1 year ago

        I solve this by putting an Xbox downstairs and only playing games where short sessions are possible (Rocket League, racing, indie games). I'd love for them to port WoW so I could grind/level up on the Xbox and go to the PC for raids, though realistically, even like that, I probably still wouldn't have enough time anymore.

        • Waker@lemmy.ml · 1 point · 1 year ago

          Yeah, I've noticed I do feel more at ease in my living room now. But the only console I own is a Nintendo Switch; games that aren't multiplayer bore me faster, and the Switch's multiplayer games don't really speak to me too much…

          Well, you could in theory use Xpadder or something like that to play with a controller on PC. It's not ideal by any means though…

          With that plus Steam streaming to the TV app (just add the game as a non-Steam game), you could play WoW. Not anything competitive though, lol. Just questing, simple stuff, or easier dungeon content.

    • Fushuan [he/him]@lemm.ee · 3 points · 1 year ago

      Working from home has empowered PC gaming for me. I just play a little when I'm stuck, and closing the work laptop's lid is enough to disassociate from the working environment. I even have a KVM and reuse the same screens, same keyboard, same mouse.

    • kameecoding@lemmy.world · 1 point · 1 year ago

      Yep, if I game it's in the living room on the PS5.

      But mostly I'd rather just relax or exercise. Nowadays gaming feels like escapism I don't even enjoy; I just do it to procrastinate.

  • steve228uk@lemmy.world · 16 points · 1 year ago

    I'm literally at the same computer as in the second picture when I'm working on my own stuff after hours 😂

  • Dylan@lemdro.id · 15 points · 1 year ago

    Can’t wait to leave after 10 hours of bad screen to go home to 10 hours of good screen.

    • MystikIncarnate@lemmy.ca · 6 points · 1 year ago

      Honestly, it's the only real downside of WFH. Your computer happy place becomes a den of work and pain for the hours between 9 and 5.

      Everything else about working from home is amazing though.

  • The irony is that nowadays the monitors would be swapped. The “good PC” would have a CRT (because most CRTs nowadays are probably in enthusiast rigs), while the “bad PC” would have the common 1080p Dell IPS display.

    On a semi-related note, why are Dell's IPS panel monitors so ridiculously common? VA and TN panels are a lot cheaper, so you'd think companies wanting the most bang for their buck would use those instead. Is it that IPS panels have a decent horizontal viewing angle, so Mr. Micromanager can look over your shoulder and see what you're doing more easily?

    • funkajunk@lemm.ee · 4 points · 1 year ago (edited)

      Where are you getting this information about CRTs from? I know they get used for old-school emulation, but I'm pretty sure for modern systems a high refresh rate and FreeSync/G-Sync is where it's at.

      • Mossy Feathers (They/Them)@pawb.social · 4 points · 1 year ago (edited)

        People who are into older games tend to have a CRT plus a retro rig or a digital-to-analog converter. A lot of older PC games legitimately look nicer on CRTs.

        Additionally, CRTs can have ludicrously high refresh rates and resolutions; don't let the 4:3 aspect ratio fool you. High-end CRTs (specifically computer monitors, not TVs) tended to max out at 1600x1200 (vs 1920x1080), giving them a slightly larger vertical resolution at the cost of a lower horizontal resolution, with some going as high as 2048x1536, comparable to 1440p. (Yes, 1440p: CRT computer monitors were mostly progressive scan, not interlaced like TVs.) The refresh rates on later CRTs tended to start at 75 Hz (vs 60 Hz on LCDs) and could max out at 200 Hz on high-end monitors. You'd sacrifice resolution to do so, though I think you could mitigate some of that by using a BNC cable if your monitor supported it (though I doubt most rigs could run anything close to 200 fps without decreasing resolution).

        Finally, CRTs tend to have extremely low response times, very good color depth, and true blacks.
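For the curious, here's a back-of-the-envelope estimate of why those high CRT modes were so demanding on the video card's RAMDAC: the pixel clock is roughly total pixels (active plus blanking) times refresh rate. The blanking fractions below are rough assumptions; real GTF/CVT timings differ somewhat.

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=0.30, v_blank=0.05):
    """Rough CRT pixel-clock estimate in MHz.

    Assumes ~30% horizontal and ~5% vertical blanking overhead,
    typical ballpark figures for CRT timings (not exact GTF/CVT values).
    """
    h_total = h_active * (1 + h_blank)
    v_total = v_active * (1 + v_blank)
    return h_total * v_total * refresh_hz / 1e6

print(round(pixel_clock_mhz(2048, 1536, 85)))  # ~365 MHz
print(round(pixel_clock_mhz(1600, 1200, 75)))  # ~197 MHz
```

So 2048x1536 at 85 Hz needs a pixel clock in the mid-300 MHz range, which is why only late, high-end video cards and monitors could drive it.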

        That said, CRTs are heavy, fragile, and nowadays expensive. Before the pandemic you could get a high-end Sony Trinitron 20" PVM (professional video monitor) for like $300-$400, with shipping costing more than the monitor; nowadays you're easily talking $1,000 or more. Most LCD panels can beat CRTs in resolution and refresh rate nowadays (though even high-end LCD panels tend to struggle to beat CRT response time), and OLEDs outclass CRTs in almost every way.

        Edit: oh, another weakness of CRTs is that they can burn-in. That’s where the term originated. If you left an image on the screen too long, it’d burn into the display, causing it to persist even after the monitor was turned off and unplugged. Since no one’s making CRTs anymore, that means there’s a smaller and smaller pool of CRTs in good condition, which means they’ll get more expensive until someone decides it’s worth the money to start making the tubes again.

        Edit 2: that's also why screensavers were a thing! Screensavers were there to stop you from accidentally burning in your monitor. I wonder why they haven't made a comeback with OLEDs.

    • moosetwin@lemmy.dbzer0.com · 3 points · 1 year ago

      I bet it's that Dell already has a business relationship with a ton of companies, and the inertia is keeping them common.

    • w2tpmf@lemmy.world · 3 points · 1 year ago

      Dell produces monitors in much larger numbers, so most distributors like CDW will have them in every warehouse in the country. This makes it much easier to standardize equipment across a large organization, since you can order the exact same SKU for several years in a row.