A space for sharing and discussing news related to global current events, technology, and society.
69464 Members
We'll be adding more communities soon!
© 2020 Relevant Protocols Inc.
What should be clear by now is the extent to which these massive corporations are making up the rules of online speech as they go along. In the absence of any independent standards or accountability, public opinion has become an essential part of the process by which their moderation policies evolve.
The underlying problem of our platforms is not that they’re too conservative or too liberal, too dogmatic or too malleable. It’s that giant, for-profit tech companies, as currently constructed, are simply not suited to the task of deciding unilaterally what speech is acceptable and what isn’t.
Even when they actually do have policies that they’re trying to apply consistently, the lack of a transparent process undermines the public’s confidence in their decisions. From outside, it all looks like a black box. In a good legal system, decisions may be controversial, but at least the rationale is clearly laid out, and there’s a body of case law to serve as context. But when Facebook decides to de-amplify a doctored video of House Speaker Nancy Pelosi, or YouTube opts to demonetize Steven Crowder’s channel, there’s no way to check whether those decisions are consistent with the way they’ve interpreted their rules in the past, and there’s no clear, codified way to appeal those decisions.