• schizo@forum.uncomfortable.business · 17 days ago

    First: I’m not in any way intending to cast any negative light on the horrible shit the people suing went through.

    But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.

    If you really were serious about suing to force change, you’ve literally got:

    1. X, who has reinstated the accounts of people posting CSAM
    2. Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely
    3. Instagram/Facebook, which have much the same problem as X with slow or limited action on reported content

    Apple, at least, will take immediate action if you report a user to them. So, uh, maybe they should reconsider their target if their intent really is to remove content, and spend some time on all the other giant corpos that are either actively doing the wrong thing, doing nothing, or sitting there going ‘well, akshully’ at reports.

    • Chozo@fedia.io · 17 days ago

      Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely

      I used to share an office with YouTube’s content review team at a previous job and have chatted with a bunch of them, so I can give a little insight on this side. For what it’s worth, YT does take action on CSAM and other abusive materials. The problem is that it’s just a numbers game. Those types of reports are human-reviewed. And for obvious reasons, it’s not exactly easy to keep a department like that staffed (turns out you really can’t pay people enough to watch child abuse for 8 hours a day), so the content quickly outnumbers the reviewers. Different types of infractions will have different priority levels, and there’s pretty much always a consistent backlog of content to review.
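
      (Purely to illustrate what “different priority levels plus a standing backlog” means in practice, here’s a minimal sketch of a prioritized review queue. It is not based on YouTube’s actual tooling; the report categories and priority values are made up.)

      ```python
      import heapq
      import itertools
      from dataclasses import dataclass, field

      # Hypothetical severity ordering -- lower number = reviewed sooner.
      PRIORITY = {"csam": 0, "violent_content": 1, "harassment": 2, "spam": 3}

      @dataclass(order=True)
      class Report:
          priority: int
          seq: int                          # tie-breaker so equal priorities stay FIFO
          video_id: str = field(compare=False)
          reason: str = field(compare=False)

      class ReviewQueue:
          """Toy model of a human-review backlog: reports are pulled in
          priority order, and anything reviewers can't get to stays queued."""

          def __init__(self):
              self._heap = []
              self._counter = itertools.count()

          def submit(self, video_id: str, reason: str) -> None:
              report = Report(PRIORITY[reason], next(self._counter), video_id, reason)
              heapq.heappush(self._heap, report)

          def review_next(self) -> Report | None:
              return heapq.heappop(self._heap) if self._heap else None

          def backlog(self) -> int:
              return len(self._heap)

      queue = ReviewQueue()
      queue.submit("vid123", "spam")
      queue.submit("vid456", "csam")
      queue.submit("vid789", "harassment")

      # The most severe report jumps the queue even though it arrived second;
      # everything else sits in the backlog until a reviewer is free.
      print(queue.review_next().reason)   # -> csam
      print(queue.backlog())              # -> 2
      ```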

      While this article talks about Facebook, specifically, it’s very similar to what I saw with YouTube’s team, as well: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

  • Max-P@lemmy.max-p.me · 17 days ago

    They’d get sued whether they do it or not, really. If they don’t, they get sued by those who want privacy-invasive scanning. If they do, they’re gonna get sued when they inevitably land someone in hot water because they took pictures of their naked child for the doctor.

    Protecting children is important, but it can’t come at the cost of violating everyone’s privacy and making everyone guilty until proven innocent.

    Meanwhile, children just keep getting shot at school and nobody wants to do anything about it, but oh no, we can’t do anything about that because muh gun rights.

  • conciselyverbose@sh.itjust.works · 17 days ago

    I thought the way they intended to handle it was pretty reasonable, but the idea that there is an actual obligation to scan content is disgusting.

  • paraphrand@lemmy.world · 17 days ago

    “People like to joke about how we don’t listen to users/feedback. About how we just assert our vision and do things how we wish. Like our mouse. It drives people absolutely bonkers! But this time we listened to the pushback. And now they sue us?”