What a lot of people mean by "Harmful content" isn't opinions and discourse on whether something is good or bad. Harmful content is disinformation about what something is.
It's a concept that can be abused for certain, and I suppose that makes it dangerous.
Unfortunately, there are also dangers in not intervening.
Every space in which valuable discourse takes place has to take some form of order, including restraint, seriously; otherwise the bad drives out the good. And if you look at the places where we take discourse most seriously -- institutions of learning, courtrooms, legislative bodies -- more order and restraint seem to become the rule, precisely because valued outcomes rely on how robust the discourse is. And yet a lot of thought is also put into how to allow as much input as possible, even outright adversarial input.
It's possible that at a certain scale, social media systems have to move from a laissez-faire approach to some similar balance. It would probably be best if each took responsibility for deciding what that is. Twitter's approach doesn't have to be the same as FB's, but everybody has some responsibility to try to make discourse as good as they can (at least, if the reason they value freedom of speech is the value of discourse, rather than personal indulgence).
> What ever FB censures, half the country will be mad at it. FB can't win here.
People across the political spectrum are already mad at FB, and some have decided on a posture that lets them push the drama that FB is biased against them no matter what FB's actual policy is. FB has nothing to lose by making its best faith effort. And I'd like to think that no matter what someone's general political sensibilities are they can find a way to advocate for them under conditions where order and some degree of accountability for truthfulness is required of them.
> ... and some have decided on a posture that lets them push the drama that FB is biased against them no matter what FB's actual policy is.
"The only way to find the limits of what's possible is to push past them to the impossible." I forget who said it, but some people will push what they can get away with as far as possible, and then try to go farther. No matter where FB sets the line, they'll try to go past it, because they care about winning, not about civil discourse or reasonable standards or fairness or anything like that. (Rules are for the other side.) The fact that they can then play the martyr and claim FB is oppressing them is a side benefit.
> Harmful content is disinformation about what something is.
This is the crux of the issue. Legacy institutions are used to determining what is true and therefore what is inside the bounds of allowable opinion.
This was their domain for decades, but it can now be seriously challenged by relatively few people, due to the internet's amplification of how human thought is shared.
Their degradation was the only likely outcome of networking so many individuals together.
'Disinformation', even if you are inclined to trust legacy institutions, should be viewed with a healthy degree of skepticism, even if we grant that legacy institutions always had the closest approximation of reality over the past several decades (and this is generous).
Everyone should now be able to recognize these players for what they are: self-interested actors bending public perception of reality toward conclusions convenient for their own benefit. Given this, it is not surprising that people are more motivated than ever to uncover past warpings of narratives. They are searching for their historical moment.
Now, we can grant that it may be best to have these legacy institutions looking out for our interests rather than some unknowable future state of governance. But their past conduct reads like a textbook littered with dark chapters, now viewed with the understanding that these are the rosier sections. Not a good look. As we move forward in time, the failings of existing governments will only become more apparent, not less, and it will continue to make less and less sense to believe in them.
The alternative would be either to restrict information flow back to some pre-digital state, or to perpetuate totalitarian regimes that protect their own existence by manipulating the masses into passivity.
And here is the second crux: it is only the legacy institutions of the 20th century who are perpetuating the conflict. Were the doubters given license to control their own territory, we have every indication that they would leave the past well enough alone and forge ahead. But the 20th century just cannot let go of its dreams of total domination of thought and reality.
> And I'd like to think that no matter what someone's general political sensibilities are they can find a way to advocate for them under conditions where order and some degree of accountability for truthfulness is required of them.
The problem is that the sides don't even agree on the same facts anymore. For example, some think there was election fraud, and some don't. Some believe there is actual evidence of election fraud, and some don't. And neither side is without some evidence. For example, there have been election audits and those seem to show there was no fraud, so that's evidence. But at the same time, there are other auditors that show very convincing evidence that there was at least some additional voter fraud beyond what was detected in the audits.
If I recall correctly, there were some science experiments in the past that sort of "wiggled around" the truth for a period of decades as they got closer to it. With current events, it seems a lot more like that than like just looking out the window and saying whether it's raining.
For example, look at how certain views were banned on YouTube... until the CDC/FDA/etc. itself reversed its position and now those are the mainstream or at least acceptable views.
Isn't it usually the case that a difference in opinion ultimately is the result of a difference of belief about facts? And if that's the case, if a biased moderator of any platform can decide what the facts are, that is effectively the same as deciding what opinions are valid. And although there are some clear cases of ridiculous beliefs that we can think about (flat earth, etc.), very quickly we get into murky territory.
I mean, just imagine if Facebook had been established in the Deep South and suppressed all non-Christian opinions as counter-factual. "The evidence is clear," they would say, "here in the Holy Bible."
> The problem is that the sides don't even agree on the same facts anymore.
That's certainly going to be the case at times, but that's also no reason to give up. Where substantial outcomes rely on facts, good institutions build ways of addressing contention about the facts themselves into the order they impose on discourse. It can't guarantee a correct outcome -- as you say, sometimes you have to wiggle around the truth for a while first -- but it's better than the nihilism of failing to engage the problem altogether.
> some think there was election fraud, and some don't.
The institutions where questions of election fraud were mediated were accessible to both sides in equal measure -- arguably biased toward the side that lost, given how the privilege of selecting judicial appointments has shaken out over the last two decades and who held the resources/power available to federal and various state executive authorities last fall. And they seem to have determined that evidence of systemic outcome-changing fraud was thin indeed.
I suppose it's possible to imagine a different outcome from a similarly robust process over time, but if there are "other auditors that show very convincing evidence" of outcome-changing fraud, it would be interesting to hear what that evidence specifically is, and why it didn't make its way into the venues that actually mattered at a moment when the outgoing administration had considerable advantages.
> But at the same time, there are other auditors that show very convincing evidence that there was at least some additional voter fraud beyond what was detected in the audits.
What evidence, exactly? IIRC, there were some professional-looking mathy analyses that claimed to find it, but they all had glaring methodological errors.
I think there are cases where "the sides don't even agree on the same facts anymore" and both have some claim to truth, but it's not on election fraud or the election results.