Alzrius
The EN World kitten
> No, you're trying to ignore that deplatforming works, and the information goes directly against the idea that they will "always find another vector". You talk about the ethics of deplatforming, but there's plenty of ethical conundrums about platforming bad actors. I prefer to keep my platform relatively safe and welcoming, which would mean more moderation so that the most vulnerable feel welcome. You may differ and that's your choice. But the ethical questions cut both ways, and I feel pretty comfortable about where I stand on it.

On the contrary, I'm not ignoring that deplatforming gets results in terms of silencing people and controlling the flow of information; that's how it "works." There are many unethical ways to prevent people from speaking and stop people from finding out things that you don't want them to know about, at least for a time. But we're talking about virtue, insofar as what the right thing to do is, and how to achieve a "better" (i.e. more just, more inclusive, more diverse, etc.) society, and by extension, world. In that regard, what you're proposing is counterproductive, as it asserts that you should try and stifle dissent to create a more open society.
While there may be certain ethical conundrums with allowing bad actors to say their piece, those conundrums themselves speak to the correctness of allowing people to talk, since that's how you seek out answers for them. Otherwise, you're just saying, "ignore the conundrums, and never listen to anyone who doesn't agree with the party line." That's not a stance that's historically been adopted by people I'd want to be associated with.
> What do you think we're talking about? Because I'm talking about bigotry and such. When you're talking about engaging Nazis, I don't see a reason to platform their ideas and views. It's counterproductive and giving them big outlets to espouse those views is generally way more harmful because you give them a bullhorn to do so. Again, imagine if every time we had to talk about racial justice we had to debate a Nazi on-air. That's not constructive, that's actively destructive to the discussion because we are ceding easily-drawn boundaries as to what is acceptable and what isn't.

Again, this treats odious ideas as though they were a contagion, rather than something which can be engaged with and demonstrated to be lacking in virtue. Bigotry (which, while often unambiguous, nevertheless has many gray areas where reasonable people can disagree about what falls under that label and what doesn't) is fairly easy to knock down when you confront it and show how it lacks virtue. The idea that it needs to be hidden away lest people find out about it serves only to empower it, creating enclaves which reinforce the people in them, and serve as alternative outlets for people who, in the course of seeking answers, look in alternative venues.
The idea that "if people hear their message, they'll be swayed" gives those ideas more power than they have, and certainly more than they deserve.
> I mean, you can. We don't need to have a debate why slurs are unacceptable every time someone decides to use them, just as we don't need to engage them in a good faith discussion to convince them they are wrong. In fact, we know this doesn't work, which is why we have something called the backfire effect.

I don't know about "need," but doing so is the surest route to showing them why they shouldn't do that, as opposed to attacking them and making them harden their position.
As for the backfire effect, if you follow the link that you posted, it says the following:
> However, subsequent research has since failed to replicate findings supporting the backfire effect.
Which goes to show that this idea really is a phantom menace, one even more awful than the movie of the same name.

> Again, this is not a "discussion" tactic. It's a "deprogramming" tactic. These are very different and require very different approaches. The latter is not a workable approach to content and platform moderation, and really needs to come from outside those discussions first.

No, I think you're making a distinction where there isn't one; the example I posted earlier was very much with regard to a person who, by their own admission, approached people in settings where they felt comfortable, got to know them, and debated beliefs without it being acrimonious. People can and do change their minds, if they don't feel like they're being attacked, talked down to, etc. If the goal is to wipe out odious beliefs, then that's the way to go, since it works better than anything.
> Also it doesn't increase the reach of odious people, as we have dozens of cases against this: removing people from platforms decreases their reach almost every time. It's only when you replatform them that they can regain their reach. Alex Jones and the other examples in the post I link go directly against what you say and you can't provide any evidence to counter.

You're talking about individuals, not beliefs. Taking away a single person's platform only serves to galvanize those who see someone being punished for having a dissenting opinion, effectively turning them into a martyr. That's the wrong way to go about it, since it only serves to reiterate a siege mentality. And believe it or not, there is evidence to the contrary even where Alex Jones is concerned:
As that article notes: "Initially, a round of media coverage touted flagging traffic to Jones' websites as evidence that 'deplatforming works.' However, revelations from Jones' defamation trials may point to the existence of a rarified class of extreme internet personalities who are better shielded from efforts to stem the reach of their content."
> I mean, that's not what "engagement" is. Again, deprogramming is very different than "engagement" and "discussion". Deplatforming works, and I've provided data that shows that it cuts down on toxicity. I can continue to show data in this regard, but at this point I think you're not really engaging with the premise.

You do realize that the subtitle to the link you posted says "The research suggests that banning certain users on Twitter and Reddit does help cut down hateful content – but it raises other concerns." That's after that same outlet previously pointed out that deplatforming wouldn't stop that same group: "Even if the social network's new policies work perfectly, Q followers can still camouflage their activity or move to other platforms."
> It does. It really, really does. Like right now we are in a discussion, and I'm engaging you. I'm not trying to deprogram you. That is what you think engagement is, but it is not: deprogramming something is not just talking on a subject, but a long process of pulling someone away from the edge. Your view of how people change their minds does not really reflect what we know about how people react to their beliefs being changed, and it even ignores the article you posted yourself, where the person did not debate them on topics immediately but found inroads to form a relationship to bring down their belief system. That is not something that you do discussing a topic on a messageboard; in fact, the lack of personal investment makes it almost completely alien in that regard.

Which is why this isn't deprogramming, and talking to people with beliefs that you disagree with isn't deprogramming, and what Daryl Davis did isn't deprogramming. You introduced that element into the conversation, and so pointing out how it's not relevant to what's happening here doesn't serve to advance any particular point in that regard. Talking to people is how you start to bring them back from the edge. That's what all of the articles I've posted have said, and what even some of the articles you've posted have said, such as when you posted a link to the backfire effect, only for it to say that the effect isn't supported by research.
Talking to people, without it being acrimonious or confrontational, is how you get them to change their minds about something. It's not the whole of it, and it isn't immediate, but it's the most surefire way to get them to stop subscribing to an odious belief. Deplatforming just reinforces it, which is what you don't want to do, no matter how good it might feel at the time.
> Sure, but Lincoln didn't win the war with just words, and similarly Lincoln did not compromise on the 13th Amendment. Hell, Lincoln suspended habeas corpus in some areas. The ideal and the reality are very different things.

Not in this case, they aren't. If someone is actually shooting at you, then yes, you shoot back. But we're talking about the merit of debating ideas, which are won with words.
> And we also don't need to brook bad ideas in our discussions. We don't need to debate Nazis, racists, xenophobes, climate denialists, etc, every time they show their faces. It would grind every discussion we have to a halt. Having a common, established ground for discourse allows us to have fruitful discussions instead of endlessly litigating meaningless tangents.

They're not "our" discussions. They belong to everyone, and others have just as much right to decide what they'll brook as we do. Talking down to someone else about how they're not welcome to share what they believe when everyone else is doesn't serve to change their minds, and runs the risk of making them look sympathetic, even if you then label them as a bigot or some other pejorative. Having an established ground for discourse only works if everyone agrees to the establishment, and if people don't they shouldn't be shown the door for questioning something; that way lies orthodoxy.