A reported Free Download Manager supply chain attack redirected Linux users to a malicious Debian package repository that installed information-stealing malware.

The malware used in this campaign establishes a reverse shell to a C2 server and installs a Bash stealer that collects user data and account credentials.

Kaspersky discovered the potential supply chain compromise while investigating suspicious domains, finding that the campaign had been underway for over three years.

  • TrustingZebra@lemmy.one · 1 year ago

    It’s still my favorite download manager on Windows. It often downloads files significantly faster than the download manager built into browsers. Luckily I never installed it on Linux, since I have a habit of only installing from package managers.

    Do you know of a good download manager for Linux?

    • FredericChopin_@feddit.uk · 1 year ago

      How much faster are we talking?

      I’ve honestly never looked at my downloads and thought, huh, you should be quicker. Well, maybe in the 90’s.

      • TrustingZebra@lemmy.one · 1 year ago

        FDM does some clever things to boost download speeds. It splits up a download into different chunks, and somehow downloads them concurrently. It makes a big difference for large files (for example, Linux ISOs).

        • somedaysoon@lemmy.world · 1 year ago

          It only makes a difference if the server is capping the speed per connection. If it’s not, it won’t make a difference.

          • TrustingZebra@lemmy.one · 1 year ago

            I guess many servers are capping speeds then. Makes sense, since I almost never see downloads actually take advantage of my Gigabit internet speeds.

            • somedaysoon@lemmy.world · 1 year ago

              It’s interesting to me that people still download things in that fashion. What are you downloading?

              I occasionally download something from a web server, but not enough to care about using a download manager that might make it marginally faster. Most larger files I’m downloading are either TV shows and movies from torrents and Usenet, or games on Steam, all of which will easily saturate a 1Gbps connection.

        • FredericChopin_@feddit.uk · 1 year ago

          I’m curious as to how it would achieve that.

          It can’t split a file before it has the file. And all downloads are split up. They’re called packets.

          Not saying it doesn’t do it, just wondering how.

          • everett@lemmy.ml · 1 year ago

            It could make multiple requests to the server, asking each request to resume starting at a certain byte.

              • drspod@lemmy.ml (OP) · 1 year ago

                The key thing to know is that a client can do an HTTP HEAD request to get just the Content-Length of the file, and then perform GET requests with the Range request header to fetch a specific chunk of a file.

                This mechanism, known as byte serving, was introduced in HTTP/1.1.
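
                A minimal Python sketch of that flow, assuming the server honors Range requests and replies with 206 Partial Content; the URL, chunk count, and output filename are placeholders, and it uses the third-party requests library:

                    # Byte-serving sketch: HEAD for the size, ranged GETs for the chunks.
                    import concurrent.futures
                    import requests

                    URL = "https://example.com/large.iso"  # hypothetical download URL
                    CHUNKS = 8

                    def fetch_range(start, end):
                        # Ask the server for bytes [start, end] only.
                        resp = requests.get(URL, headers={"Range": f"bytes={start}-{end}"}, timeout=60)
                        resp.raise_for_status()  # expect 206 Partial Content
                        return start, resp.content

                    # HEAD request: read Content-Length without downloading the body.
                    size = int(requests.head(URL, timeout=60).headers["Content-Length"])
                    step = size // CHUNKS
                    ranges = [(i * step, size - 1 if i == CHUNKS - 1 else (i + 1) * step - 1)
                              for i in range(CHUNKS)]

                    # Fetch all chunks concurrently, then reassemble them in order.
                    with concurrent.futures.ThreadPoolExecutor(max_workers=CHUNKS) as pool:
                        parts = dict(pool.map(lambda r: fetch_range(*r), ranges))

                    with open("large.iso", "wb") as f:
                        for start, _ in ranges:
                            f.write(parts[start])

                If the server caps speed per connection (as noted earlier in the thread), the concurrent ranged requests are what give the speed-up; download managers add retries and resume on top of this.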

      • arglebargle@lemm.ee · 1 year ago

        Just grabbed a gig file - it would take about 8 minutes with a standard download in Firefox. Use a manager or axel and it will be 30 seconds. Then again, speed isn’t everything; it’s also nice to be able to have auto retry and completion.

    • Xirup@lemmy.dbzer0.com · 1 year ago

      JDownloader, XDM, FileCentipede (this one is the closest to IDM, although it uses closed-source libraries), kGet, etc.

    • arglebargle@lemm.ee · 1 year ago

      axel. Use axel -n8 to make 8 connections/segments, which it will assemble when it is done.

    • TheAnonymouseJoker@lemmy.ml · 1 year ago

      Xtreme Download Manager. Very similar to Internet Download Manager on Windows.

      Also, use either of those two. FDM is very meh.