Instagram, Child Endangerment, and the Problems with Section 230 - Metaphors Are Lies



The Wall Street Journal has a disturbing report on how Instagram’s algorithm serves adults who follow children pornographic material, pictures of teenagers and children, and images and videos that overtly sexualize children. I will let you read the entire thing, but this happens because the way Section 230 has been allowed to be interpreted incentivizes this behavior.

Section 230 is a portion of a law designed to do two things: prevent internet companies from being held liable for things said in their comments, and allow them to moderate their sites without fear. However, it has been interpreted as also protecting companies’ decisions to favor one kind of content over another via their algorithms. In other words, a limited set of protections has been expanded in such a way as to ensure that companies cannot be held liable for their actions, as long as those actions involve the internet and an algorithm.

This, obviously, leads directly to the Instagram algorithm showing adults sexualized pictures of children. Instagram makes money when people are on the platform (the investigation showed that ads appeared next to these pictures), so its algorithms are designed and tuned to keep people engaged. That results in what the Journal found: Instagram showing damaging content and incentivizing both its consumption and creation. (Go read the article. What these people were shown was often much more than just the accounts of teenagers, for example, and very often disturbing.) Since Instagram’s algorithms have traditionally been treated as internet speech, the company is fully protected from the consequences of its actions, and thus fully incentivized to continue those actions for as long as they make money.

Algorithms aren’t speech, though. They are products, like toasters and lawnmowers. If you made a lawnmower that constantly chopped off people’s hands when started, no sane person would argue that you weren’t liable for the lost hands. But algorithms are assumed to be magically appearing things that no human being has created or controlled. It is nonsense. The internet is not special and deserves no special treatment. By pretending it does, we do real harm to the most vulnerable people in society, all so a handful of atrociously rich people can get a little richer.

We can do so much better. Treat algorithms like the products they are, with all the associated liabilities and responsibilities, and those incentives disappear. Stop pretending that a product is speech and start living up to the obligation to protect people from the harms of unscrupulous businesses.

