Web dev: What browser is visiting the page?

User agent string:

[Image: a screenshot of a browser. The URL bar reads firefox://settings, a button on the URL bar is labelled Netscape, a popup from the button reads “You’re viewing a secure Opera page”, and the web page title reads “Chrome settings”.]

  • apprehentice@lemmy.enchanted.social · 13 days ago

    Functionally useless. With the web standardized, we shouldn’t need user agents anyway. It would be more beneficial to ask “do you support X, Y, and Z?”
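
    Something like the following is usually enough; the specific capabilities checked here are just illustrative, not a recommended set:

    ```ts
    // Ask the browser what it supports instead of guessing from the UA string.
    const support = {
      clipboard: "clipboard" in navigator,
      webgl2: typeof WebGL2RenderingContext !== "undefined",
      webp: document.createElement("canvas")
        .toDataURL("image/webp")
        .startsWith("data:image/webp"),
    };

    if (!support.clipboard) {
      // Fall back to a manual copy button, or hide the feature entirely.
    }
    ```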

    • elrik@lemmy.world · 13 days ago

      User agents are useful for checking if the request was made by a (legitimate self-identifying) bot, such as Googlebot.

      They can also be useful in specific scenarios where you control the client and want to easily identify your own client’s traffic in request logs.

      Or maybe you offer downloads on your site and want to reorder the list to highlight the binary most likely to match the platform reported in the user agent (sketched below).

      There are plenty of reasonable uses for the user agent that have nothing to do with feature detection.
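
      For the download case, a rough sketch (the substrings are real user agent fragments, but the function and how you’d wire it into a request handler are made up for illustration):

      ```ts
      // Pick which download to highlight first, based on the User-Agent header.
      // Purely cosmetic: a spoofed or missing header just means the visitor
      // sees a less convenient default, nothing security-relevant.
      type Platform = "windows" | "macos" | "linux" | "unknown";

      function guessPlatform(userAgent: string): Platform {
        const ua = userAgent.toLowerCase();
        if (ua.includes("windows")) return "windows";
        if (ua.includes("mac os x") || ua.includes("macintosh")) return "macos";
        if (ua.includes("android")) return "unknown"; // Android UAs also contain "linux"
        if (ua.includes("linux")) return "linux";
        return "unknown";
      }
      ```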

        • elrik@lemmy.world · 13 days ago

          That’s correct: it’s just plain text and can easily be spoofed. You should never perform an auth check of any kind based on the user agent.

          In the above examples it wouldn’t really matter if someone spoofed the header, since there’s generally no benefit to a malicious actor in doing so.

          Where some sites get into trouble, though, is when they have an implicit auth check based on the user agent. An example would be a paywalled recipe site: they want the recipes indexed by Google, so if I spoof my user agent to be Googlebot, I get to view the recipe content they want indexed and bypass the paywall.

          A more reasonable use for checking user agent strings for bots might be regional redirects. If a new user comes to my site, maybe I want to redirect them to a localized version at a different URL based on their country. However, I probably don’t want to do that if the agent is a bot, since the bot might be indexing a given URL from anywhere. If someone spoofs their user agent and isn’t redirected, no big deal.
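
          A minimal sketch of that redirect logic (the bot markers are real user agent substrings; the country lookup and localized URL are placeholders):

          ```ts
          // Skip the localized redirect for self-identified crawlers so they
          // index the canonical URL; other new visitors get their regional site.
          const BOT_MARKERS = ["googlebot", "bingbot", "duckduckbot", "yandexbot"];

          function isLikelyBot(userAgent: string): boolean {
            const ua = userAgent.toLowerCase();
            return BOT_MARKERS.some((m) => ua.includes(m));
          }

          function localizedUrl(userAgent: string, country: string): string | null {
            if (isLikelyBot(userAgent)) return null; // let bots index this URL
            if (country === "DE") return "https://de.example.com/"; // hypothetical
            return null; // already on the right site, or unknown country
          }
          ```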

        • dan@upvote.au · 13 days ago

          Most developers just write their own feature checks (a lot of detections are just a single line of code) or use a library that polyfills the feature if it’s missing (see the sketch at the end of this comment).

          The person you’re replying to is right, though: Modernizr popularized this approach. It predates npm, and npm still isn’t its main distribution method, so npm download numbers aren’t a meaningful measure of its usage.
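
          For reference, the detect-then-polyfill pattern looks roughly like this (structuredClone is a real global; the fallback is a deliberately naive stand-in, not a real polyfill):

          ```ts
          // Only install the fallback when the feature is actually missing.
          if (typeof globalThis.structuredClone !== "function") {
            // Naive stand-in; a real polyfill also handles Dates, Maps, cycles, etc.
            (globalThis as any).structuredClone = (value: unknown) =>
              JSON.parse(JSON.stringify(value));
          }
          ```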

    • Quacksalber@sh.itjust.works · 13 days ago

      YouTube currently does not work in Firefox (and hasn’t for weeks now) unless you use a Firefox user agent. Google doing sketchy things again.