Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
First: I’m not in any way trying to minimize the horrible shit the people suing went through.
But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.
If you were really serious about suing to force change, you’ve literally got:
X, who has reinstated the accounts of people posting CSAM
Google/YouTube, who routinely take zero action on people posting both horrible videos AND horrible comments on said videos
Instagram/Facebook, which have much the same problem as X with slow or limited action on reported content
Apple, at least, will take immediate action if you report a user to them. So, uh, maybe the plaintiffs should reconsider their target if their intent really is to get content removed, and spend some time on all the other giant corpos that are either actively doing the wrong thing, doing nothing, or sitting there going ‘well, akshully’ at reports.
Google/YouTube, who routinely take zero action on people posting both horrible videos AND horrible comments on said videos
I used to share an office with YouTube’s content review team at a previous job and have chatted with a bunch of them, so I can give a little insight on this side. For what it’s worth, YT does take action on CSAM and other abusive material. The problem is that it’s just a numbers game. Those types of reports are human-reviewed, and for obvious reasons it’s not exactly easy to keep a department like that staffed (turns out you really can’t pay people enough to watch child abuse for 8 hours a day), so the content quickly outnumbers the reviewers. Different types of infractions get different priority levels, and there’s pretty much always a backlog of content to review.
While this article talks about Facebook, specifically, it’s very similar to what I saw with YouTube’s team, as well: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
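To make that “numbers game” a bit more concrete, here’s a minimal, purely illustrative sketch of a prioritized review queue. It is not YouTube’s actual tooling; the categories, priority values, and per-shift capacity are made-up assumptions. The point is just that when reports arrive faster than reviewers can clear them, the highest-severity items get pulled first and everything else piles up.

```python
# Illustrative sketch only -- not YouTube's actual system. It just shows the
# "numbers game": reports arrive faster than reviewers can clear them, and a
# priority queue decides which ones get looked at first.
import heapq
import itertools

# Lower number = reviewed sooner; categories and values are made up.
PRIORITY = {"csam": 0, "violence": 1, "harassment": 2, "spam": 3}

_counter = itertools.count()  # tie-breaker so equal priorities stay FIFO

def enqueue(queue, report_id, category):
    """Add a report to the review backlog, ordered by severity."""
    heapq.heappush(queue, (PRIORITY[category], next(_counter), report_id))

def review_shift(queue, capacity):
    """Simulate one shift: reviewers can only clear `capacity` reports."""
    handled = [heapq.heappop(queue)[2] for _ in range(min(capacity, len(queue)))]
    return handled, len(queue)  # what got reviewed, and what's still waiting

if __name__ == "__main__":
    backlog = []
    # Eight reports come in, but the team can only review three this shift.
    for rid, cat in [(1, "spam"), (2, "csam"), (3, "harassment"), (4, "spam"),
                     (5, "violence"), (6, "csam"), (7, "spam"), (8, "harassment")]:
        enqueue(backlog, rid, cat)
    done, remaining = review_shift(backlog, capacity=3)
    print("reviewed:", done)               # -> [2, 6, 5]: CSAM first, then violence
    print("still in backlog:", remaining)  # -> 5 reports carry over
```

Running it, the two CSAM reports and the violence report get reviewed first, and the remaining five carry over to the next shift’s backlog, which is roughly the steady state the team lived in.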