• MangoPenguin@lemmy.blahaj.zone

    True, although once per hour would still be a lot of data.

    For example, running a single fast.com test uses about 1.5 GB of data, which works out to roughly 1 TB per month (1.5 GB × 24 × 30 ≈ 1,080 GB) if run hourly.

    • Norah - She/They@lemmy.blahaj.zone

      Once every 6 hrs would only be about 180 GB a month. A script that runs every six hours, but increases the frequency if the measured speed drops below a certain threshold, could work well (rough sketch below). I guess it all depends on how accurate you need the data to be.
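
      Not from the thread, just a minimal sketch of that idea in Python. It assumes the `speedtest-cli` tool is installed and that its `--simple` output includes a `Download: X Mbit/s` line; the interval and threshold values are placeholders to tune.

      ```python
      import subprocess
      import time

      NORMAL_INTERVAL = 6 * 60 * 60   # seconds between tests when speed looks fine
      FAST_INTERVAL = 60 * 60         # tighter interval once a slowdown is seen
      THRESHOLD_MBPS = 100.0          # below this, start testing more often (placeholder)

      def measure_download_mbps():
          """Run one speed test and return the download speed in Mbps."""
          out = subprocess.run(
              ["speedtest-cli", "--simple"],
              capture_output=True, text=True, check=True,
          ).stdout
          # Assumed output contains a line like "Download: 94.37 Mbit/s"
          for line in out.splitlines():
              if line.startswith("Download:"):
                  return float(line.split()[1])
          raise RuntimeError("could not parse speed test output")

      while True:
          speed = measure_download_mbps()
          interval = FAST_INTERVAL if speed < THRESHOLD_MBPS else NORMAL_INTERVAL
          print(f"{time.ctime()}: {speed:.1f} Mbps, next test in {interval // 60} min")
          time.sleep(interval)
      ```

      At the slow 6-hour rate that's still only ~4 tests a day; even if the threshold trips and it falls back to hourly testing for a while, the monthly total stays well under the ~1 TB figure above.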